Troubleshooting guide#

This section provides help when you encounter issues while customizing the solution. It includes troubleshooting tips for common issues, as well as general instructions on debugging full-code customizations.

Troubleshooting common issues#

This section describes some common issues you may encounter when customizing the solution and how to resolve them.

General#

  • Issues with long file paths during solution unpacking / modifications / repacking:

    Windows has a maximum path length of 260 characters, unless long path names are enabled explicitly on the system.

    Therefore, if you are using Windows, you may encounter issues with long file paths when unpacking the solution .awa file, modifying and running the unpacked solution, or when repacking it. To avoid this, you should unpack the solution to a location with a short path, such as C:\temp\solution.

  • Error message “Please contact your IT admin to specify an optiSLang version in the .env file and try again” after clicking the Start Analysis button:

    ../_images/customization_troubleshoot_please_specify_osl_version.png

    The solution requires an environment variable named OSL_VERSION to be defined with the optiSLang version to use, in the format of XXY, where XX is the last two digits of the year and Y is the number of the software revision. For example, for optiSLang 25R2, the variable should be defined as follows:

    OSL_VERSION=252
    

    Usually, the variable is defined in the .env file in the solution folder and gets loaded automatically when you run the solution.

    If you are using Visual Studio Code and run the solution from the integrated PowerShell terminal, comments in the .env file are not handled correctly and some variables might not be loaded. To work around this, remove the comments from the .env file, move each comment onto its own line (so that a line break separates the affected variable from the comment), or define the environment variable manually in the terminal before running the solution. If you are using a different IDE, verify that the .env file is loaded correctly and that the variable is defined correctly.
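
    For illustration, here is a minimal .env sketch contrasting the two layouts (the comment text is only an example):

    # Problematic in the integrated PowerShell terminal: inline comment on the same line
    OSL_VERSION=252  # use optiSLang 25R2

    # Safe: a line break separates the variable from the comment
    OSL_VERSION=252
    # use optiSLang 25R2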

No-code customizations#

  • Custom image at the project title/problem-setup page or custom header logo is not displayed:

    • Ensure that the image file is placed in the correct location and that the filename (including the file extension) is correct. For details, see the section on adding custom images with no-code customization.

  • Custom text at the problem-setup page is not displayed, or does not display as expected:

    • Ensure that the text file is placed in the correct location and that the filename is correct. For details, see the section Add custom texts - No-code customization.

    • If the custom text is displayed, but the content is not shown as expected, check the markdown syntax and file content of the custom text file (for example, with a markdown previewer).

Low-code customizations#

  • Custom image at arbitrary locations is not displayed, or is displayed at the wrong location:

    • Ensure that the image file is placed in the correct folder src\ansys\solutions\<solution-name>\ui\assets\images\.

    • Ensure that you added the correct image filename with the correct file extension in the Image enumeration in the src\ansys\solutions\<solution-name>\ui\utils\images.py file. For details on how to do this, see the section Add custom images - Low-code customization.

    • Ensure that you have added the Image.CUSTOM_IMAGE.get_div() command in the correct location in the layout definition, where CUSTOM_IMAGE is the name of the image in the Image enumeration. For orientation, a usage sketch follows at the end of this section.

  • Custom image shown with the wrong size:

    • Check that you provided the correct formatting options in the get_div() command.

  • Custom text at arbitrary locations is not shown at all, or is shown at the wrong location:

    • Ensure that the custom text markdown file is placed in the correct folder src\ansys\solutions\<solution-name>\ui\assets\texts\.

    • Ensure that you added the get_custom_text() command in the correct location in the layout definition.

    • Ensure that you provided the correct filename (with correct file extension) in the get_custom_text() command.

  • Custom text is shown, but not as expected:

    • Check the markdown syntax and the content of the custom text markdown file (for example, with a markdown previewer).
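
As an orientation for the checks above, the following Python sketch shows where the low-code pieces typically appear in a layout definition. It is only an illustration: the Image stand-in, its get_div() body, the get_custom_text() stub, and the filenames my_chart.png and my_notes.md are simplified placeholders, not the solution's actual implementations. In your customization you only add the enumeration entry in ui\utils\images.py and the two calls in the layout; take the exact import paths and signatures from your unpacked solution.

    from enum import Enum

    from dash import html

    # Simplified stand-ins for the solution's real helpers, shown here only so the
    # sketch is complete and runnable. The real definitions live in your unpacked
    # solution (ui\utils\images.py and the UI utilities).
    class Image(Enum):
        CUSTOM_IMAGE = "my_chart.png"  # hypothetical filename placed in ui\assets\images\

        def get_div(self):
            # Stand-in only; the real get_div() accepts formatting options that
            # control the displayed size.
            return html.Img(src=f"assets/images/{self.value}")

    def get_custom_text(filename):
        # Stand-in only; the real helper renders the markdown file from ui\assets\texts\.
        return html.Div(f"(contents of {filename})")

    # Layout definition: the position of the calls determines where the image and
    # the custom text appear in the UI.
    layout = html.Div(
        [
            Image.CUSTOM_IMAGE.get_div(),
            get_custom_text("my_notes.md"),  # hypothetical markdown file in ui\assets\texts\
        ]
    )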

Full-code customizations#

  • No data is shown in the app:

    • Be aware that before clicking the Start Analysis button, optiSLang is not started and no monitoring data is available.

    • This also applies to the navigation tree, which initially displays only static data and, in the default solution, is updated with the project tree of actors after optiSLang is started.

    • Even after optiSLang is started, the monitoring data and files might not be available immediately, since they must be retrieved from the optiSLang server. It is therefore important to cover the no-data case properly in the code (see the sketch at the end of this section).

  • The data shown in the app is not updating as expected:

    • Ensure that the optiSLang project server is running and healthy (Commands buttons are enabled).

    • Check that you are using the correct entry of the WebsocketStreamListeners enumeration (in src\ansys\solutions\<solution-name>\ui\utils\websocket.py) as Trigger or Input to the callback that you are using to update the data in the UI.

    • If you are certain that the callback is being called correctly and the monitoring data that is retrieved from the backend is being updated as expected, but the UI is not updating, this could be caused by a specific behavior of Dash, where callbacks can “overtake” each other:

      • Dash discards the response of a callback if the same callback has been triggered (and finished) again before the response of the first callback has been received.

      • This can happen, for example, if the callback is triggered multiple times in a short time span, and the response takes longer than the time between the triggers, or if the callback execution time varies significantly depending on the inputs.

      • This is especially relevant when, depending on the inputs, the callback might return no_update or raise PreventUpdate exceptions and thus finish faster than when it returns a value. In this case, when a no_update response is sent before the previous update response is received, the update response gets discarded and the UI does not change. Therefore, this scenario must be avoided.
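
For illustration, the following self-contained Dash sketch shows one possible pattern that addresses both points. The component IDs, the dcc.Interval polling trigger, and the fetch_monitoring_data() helper are hypothetical stand-ins, not part of the default solution (which triggers such callbacks via the WebsocketStreamListeners entries). The key idea is that the callback always returns an explicit value, including a placeholder for the no-data case, so the UI never stays silently empty and any response that Dash keeps after an overtake still carries valid content:

    from dash import Dash, Input, Output, dcc, html

    def fetch_monitoring_data():
        """Hypothetical stand-in for retrieving monitoring data from the backend."""
        return None  # no data yet, for example before optiSLang has been started

    app = Dash(__name__)
    app.layout = html.Div(
        [
            # Stand-in trigger; the default solution triggers such callbacks via the
            # WebsocketStreamListeners entries instead of a polling interval.
            dcc.Interval(id="poll", interval=2000),
            html.Div(id="monitoring-output"),
        ]
    )

    @app.callback(Output("monitoring-output", "children"), Input("poll", "n_intervals"))
    def update_monitoring(_n_intervals):
        data = fetch_monitoring_data()
        if not data:
            # Cover the no-data case explicitly instead of raising PreventUpdate or
            # returning no_update: every invocation then returns a valid value, so a
            # fast "nothing to do" response cannot cause a slower, real update
            # response to be discarded.
            return "No monitoring data available yet."
        return str(data)

    if __name__ == "__main__":
        app.run(debug=True)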

Debug your customized solution#

This section provides general instructions on how to debug your customized solution in case the troubleshooting advice given in the previous section is not sufficient. It is assumed that you are familiar with Python and the debugging tools available in your IDE. Furthermore, you should set up the development environment and run the solution as described in the section Test the changes.

Available log files#

In the terminal where you run the solution, you find the file paths to the solution logs:

../_images/customization_troubleshooting_logs.png

If you are using Visual Studio Code and run the solution from the integrated PowerShell terminal, you can open the log files by clicking their paths while holding the Ctrl key. The most important log files are described in the following:

  • UI log:

    • Message GLOW UI logging to ui_log_file_path, where ui_log_file_path is the path to the UI log file

    • Logs errors, warnings, and info messages from the UI side

    • Dash callback errors are logged here

  • Backend/API log:

    • Message GLOW API logging to api_log_file_path, where api_log_file_path is the path to the API log file

    • Logs errors, warnings, and info messages from the backend, except from long-running methods (see below for specific logs)

    • Normally not needed unless you modify the backend

  • Specific log for startup of optiSLang:

    • Message GLOW METHOD RUNNER logging to base_log_file_path/long_running_methods/log_start_osl_project_XXXXXXXX.log, where base_log_file_path is the path to the folder containing all log files and XXXXXXXX is a unique identifier for the log file

    • Logs errors, warnings, and info messages that occur when starting optiSLang

    • Can be useful to check if you have trouble starting optiSLang

  • Specific logs for backend data and result files monitoring:

    • Message GLOW METHOD RUNNER logging to base_log_file_path/long_running_methods/log_update_data_XXXXXXXX.log and others, where base_log_file_path is the path to the folder containing all log files and XXXXXXXX is a unique identifier for the log file

    • Logs errors, warnings, and info messages that occur when monitoring data and result files

    • Normally not needed unless you modify the backend

To enable additional debug output in the log files, you can add the variable GLOW_LOGGING_LEVEL=DEBUG in the .env file before running the solution.

Debugging options#

  • Print statements:

    • You can use print statements to log information to the terminal where you run the solution. This is useful for simple debugging, but print statements should be removed from production code.

  • Run the solution UI in a browser:

    • When you run the solution on Linux, the run_solution command always requires you to open the UI in a browser. This is useful for debugging purposes, as you can use the browser developer tools to inspect the UI layout, Dash callback calls, and websocket streams.

    • To open the UI in a browser on Windows, you can proceed as follows:

      • Use the run_solution command as usual.

      • Open the UI log and search for the line Running GLOW UI server on http://host:port, where host is the IP address of the machine and port is the port number.

      • In the UI log, search for the line "GET /projects/project_id HTTP/1.1" 200. The project_id is the ID of the project that is being opened in the UI.

        ../_images/customization_troubleshooting_ui_log_content.png
      • Open a web browser and enter the URL http://host:port/projects/project_id. This opens the UI in the browser, where you can use the developer tools (usually by pressing the F12 key) to inspect the UI layout, Dash callback calls, and websocket streams.

      • Note that you need to keep the normal solution window open while debugging, since both the UI and the backend are shut down when you close the solution window.

      ../_images/customization_troubleshoot_run_in_browser.png
  • Enable debug mode:

    • General debug mode:

      • Set the variable GLOW_DEBUG=True in the .env file before running the solution.

      • This allows you to attach a debugger to the backend code.

      • Furthermore, this runs Dash in debug mode, which enables hot reloading of the UI code (that is, changes to the UI code are applied automatically without restarting the UI server) and shows details of callback errors directly in the UI:

        ../_images/customization_troubleshoot_dash_debug_mode.png
    • Debug mode for the UI:

      • Set the variable GLOW_UI_PYTHON_DEBUGGING=1 in the .env file before running the solution.

      • This allows you to attach a debugger to the UI code (Dash callbacks).

    • For more details about the different debug modes and how to attach to the solution with a debugger, see the section Debugging in the SAF User Guide.
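
For reference, the debug-related environment variables mentioned in this section can be combined in the .env file before running the solution; enable only the ones you need:

    GLOW_LOGGING_LEVEL=DEBUG
    GLOW_DEBUG=True
    GLOW_UI_PYTHON_DEBUGGING=1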