Tag: WebUI

  • 24.03: Release Update

    By Lablup

    24.03, the first Backend.AI release of 2024, is now available. This update brings significant improvements to the UI and user experience, making Backend.AI even easier to install and operate. The main changes since 23.09 are as follows.

    Backend.AI Core & WebUI

    • We've made installing Backend.AI even easier with the addition of a TUI-based installer, which automates the download process so users can install Backend.AI and get started quickly.

    • New: Added trash functionality to vFolders. Files in unused vFolders are now moved to the trash instead of being deleted immediately, and are only removed permanently through a separate deletion step.
    • New: Added a status value that indicates the state of each vFolder.
    • New: Added Backend.AI Model Store, where you can now store, search, and easily utilize various machine learning and deep learning models.
    • Added indexing metadata to vFolders so that queries can use indexes instead of full directory scans.
    • Improved system resource utilization by introducing a limit policy for session creation based on the number of pending sessions and requested resource slots. This new resource policy option also filters the available resource presets and caps the custom resource sliders in the Session Launcher.
    • We've added dark themes to the WebUI, so users can now choose from a variety of options to suit their personal preferences.

    • Polished the WebUI by fixing misaligned line breaks, stray whitespace, and announcements that overflowed their containers, along with stability improvements such as session name validation.
    • The Session Launcher for Model Serving now also limits UI input so that users cannot request more than the allocated resources.
    • Added an allowAppDownloadPanel option to config.toml that hides the WebUI app download panel, giving administrators more control over the UI presented to users.

    Backend.AI is constantly evolving to provide a more powerful and user-friendly experience while supporting a variety of environments in the ever-changing AI ecosystem. We look forward to seeing what's next! Make your AI accessible with Backend.AI!

    29 March 2024

  • 2023 Winter Intern in Lablup

    By Byeongjo Kim

    Overview

    I applied to the Open Source Contribution Academy (hereinafter referred to as the Contribution Academy) hosted by OpenUp, and worked as a fall intern at Lablup Inc. (hereinafter referred to as Lablup) from November to December for 8 weeks. Afterwards, I extended it for an additional 8 weeks from January to February, working a total of 16 weeks.

    In this post, I'd like to share the experiences I had at Lablup, the first company I have worked at as a developer since being discharged from the military.

    Motivation for Applying

    Even before the Contribution Academy, I was interested in Lablup, and coincidentally, I had an opportunity to contribute through the Contribution Academy.

    During the Contribution Academy period, I worked on resolving issues and refactoring the WebUI of Backend.AI.

    While participating in the Contribution Academy, I felt a lot of affection, interest, and enjoyment towards Backend.AI, and I began to think that I wanted to continue contributing after the program ended.

    Lablup happened to provide an opportunity to work there in conjunction with the Contribution Academy, so I applied without hesitation.

    Onboarding

    For the first 3 weeks of the internship, I underwent an onboarding process.

    I went through implementing a RealTime Chat, setting up the Backend.AI environment, and then the Pebble Seminar in that order.

    RealTime Chat

    This was the first assignment, meant to help me become familiar with the core side of Backend.AI's code. I implemented a real-time chat app in Python, using the aiohttp, aioredis, and asyncio libraries.

    Since there was no requirement to persist the chat contents, I used Redis, an in-memory database.

    When a user enters a chat room, they subscribe to that room's channel; when a user sends a message, it is published to the subscribers, that is, the other users in the same room.
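
    The original implementation was written in Python with aiohttp and aioredis, so the snippet below is only an illustration of the same subscribe/publish flow, sketched in TypeScript on top of the node-redis client; the channel naming is hypothetical.

    ```typescript
    // Minimal sketch of the pub/sub pattern: each chat room maps to a Redis channel.
    // Entering a room subscribes to its channel; sending a message publishes to it,
    // so the other users in the same room receive it. Names are hypothetical.
    import { createClient } from 'redis';

    async function main() {
      const publisher = createClient();
      const subscriber = publisher.duplicate();
      await publisher.connect();
      await subscriber.connect();

      const room = 'chat:room-1';

      // Joining the room: receive every message published to the room's channel.
      await subscriber.subscribe(room, (message) => {
        console.log(`[${room}] ${message}`);
      });

      // Sending a message: publish it so all current subscribers see it.
      await publisher.publish(room, 'hello from another user');
    }

    main().catch(console.error);
    ```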

    RealTime Chat in action

    While I was able to handle Python at a basic level from preparing for coding tests, I had no experience using libraries like aiohttp, asyncio, and aioredis, so it took me some time to understand and grasp the concepts.

    However, this assignment helped me a lot in understanding the core side of Backend.AI's code, and it was good to be able to study new libraries.

    Setting up the Backend.AI environment

    Since I had already installed Backend.AI during the Contribution Academy, setting up the environment during the internship period wasn't too difficult.

    However, I was well aware that installing Backend.AI is not easy, as I had encountered many errors and failures while trying to install it during the Contribution Academy, and the other person doing the internship with me also had a lot of difficulties during the installation process.

    Since I had already experienced those failures and knew the solutions, I was able to help, and we were able to install it quickly and move on to other tasks.

    While setting up the environment, I also configured a virtual machine and VPN, and set up the environment on a virtual machine as well, so that I could work even if there were problems on my local machine. After setting up the configuration on the virtual machine, I mainly used the local for development during the subsequent work, and the virtual machine as a test server. The company's VM Farm, which allows for easy management and configuration of virtual machines, made it great for setting up development and testing environments.

    Pebble Seminar

    After completing the RealTime Chat and setting up the Backend.AI environment, I prepared a short seminar based on understanding the structure and code of Backend.AI. I was tasked with presenting on GraphQL and Relay, which are used in the Backend.AI WebUI.

    While I had experience with GraphQL, I felt that my knowledge was lacking for presenting in front of others, and Relay was a new library to me, so I was quite worried about preparing for the Pebble Seminar and read through many documents to prepare. First, I familiarized myself with the concepts by reading the official documentation for GraphQL and Relay, and then analyzed the Backend.AI code one by one to understand how they were applied and functioning in Backend.AI.

    Pebble Seminar preparation materials

    By analyzing the code while preparing for the Pebble Seminar, I naturally came to understand the code running in the WebUI, and this greatly helped me in finding and resolving issues during the subsequent work.

    Resolving Backend.AI issues and implementing features

    After completing the onboarding, I finally joined the frontend team and started resolving Backend.AI issues and implementing features. I had a coffee chat with the frontend lead to define the categories of work for this internship period:

    1. Creating a Table Column Setting component
    2. Researching E2E Testing
    3. Daily tasks

    During the 8-week internship period from November to December, I created a total of 19 Pull Requests, of which 18 were merged and 1 is still under review. Since I had experience finding issues and assigning them to myself during the Contribution Academy, I had less difficulty with that, and because I enjoyed resolving issues, I was able to resolve more of them than the others.

    Feature Addition PRs

    1. Implementing Table Columns Setting

    https://github.com/lablup/backend.ai-webui/pull/2071

    This was one of the issues I aimed to work on during the internship period. It was the only component that I conceived and implemented from scratch during the fall internship, rather than refactoring an existing component. Before implementing this feature, I thought it was a simple task that I could finish quickly, but things turned out differently from my expectations.

    First, I realized that I had been thinking too simplistically about creating new components. Although I had always considered and designed the props a component should receive before building it, this issue made me feel that when creating a new component, I should invest more time and effort in designing for extensibility. I also realized that I should pay more attention to how other sites design similar features and what options they provide.

    Table Columns Setting

    2. Adding service endpoint and owner columns to the Model Serving page table

    https://github.com/lablup/backend.ai-webui/pull/2047

    Previously, when creating a model service, users had to go to the detail page to check the endpoint, which is a frequently used feature. So there was a request to add the endpoint to the table column. Additionally, since the admin account can see services of users in the same group, there was a suggestion to have a column showing the service owner. Since the GraphQL fields for retrieving this data were already implemented, I added the fields to the query to fetch the endpoint and service owner data, and then added columns to the table to display the data. The owner column is only shown for admin accounts.
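
    As a rough illustration (the field and column names below are illustrative, not the actual backend.ai-webui GraphQL schema or table code), the owner column can simply be appended to the table definition only when the current account is an admin:

    ```typescript
    // Hypothetical sketch: column definitions for an antd-style table. The endpoint
    // field comes from the service query; the owner column is added only for admins.
    import type { ColumnsType } from 'antd/es/table';

    interface ServiceRow {
      name: string;
      endpoint: string;
      owner: string;
    }

    const buildColumns = (isAdmin: boolean): ColumnsType<ServiceRow> => [
      { title: 'Service Name', dataIndex: 'name' },
      { title: 'Service Endpoint', dataIndex: 'endpoint' },
      // Admins can see services of other users in the same group,
      // so only they get the owner column.
      ...(isAdmin ? [{ title: 'Owner', dataIndex: 'owner' }] : []),
    ];
    ```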

    Implementation view. Screen for admin account (left) and user account (right)

    3. Disabling log button for sessions in CANCELLED state

    https://github.com/lablup/backend.ai-webui/pull/2045

    The CANCELLED state means that the container has never been created or failed to be created. Previously, the log button was enabled even for sessions in the CANCELLED state, and if a user clicked the log button, the agent could not find the container information, resulting in a 500 error. In this PR, I made it so that the log button is disabled for sessions in the CANCELLED state, preventing users from clicking it.
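
    Conceptually, the fix boils down to a small guard like the hedged sketch below (component and prop names are illustrative, not the actual WebUI code):

    ```tsx
    // Hypothetical sketch: a CANCELLED session never had a container, so there are
    // no logs to fetch; disabling the button prevents the 500 error from the agent.
    import React from 'react';
    import { Button } from 'antd';

    interface LogButtonProps {
      status: string; // e.g. 'RUNNING', 'TERMINATED', 'CANCELLED'
      onShowLogs: () => void;
    }

    const LogButton: React.FC<LogButtonProps> = ({ status, onShowLogs }) => (
      <Button onClick={onShowLogs} disabled={status === 'CANCELLED'}>
        Logs
      </Button>
    );

    export default LogButton;
    ```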

    Session in TERMINATED state (session 1) and CANCELLED state (session 2)

    4. Testing and creating a custom hook for dark mode

    https://github.com/lablup/backend.ai-webui/pull/2120

    Before implementing dark mode, I found components with hardcoded colors and implemented a custom hook named useThemeMode for applying dark mode. When creating the custom hook, I tried to use the useLocalStorageState hook from ahooks, but contrary to my expectation that instances sharing the same key would be managed together, each instance operated independently. To make every state sharing the same key update automatically when the value changes, I added a custom hook named useLocalStorageGlobalState, and then used it to build the useThemeMode custom hook for setting the dark mode.
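
    The actual implementation in backend.ai-webui may be structured differently, but the idea can be sketched roughly as follows: keep a module-level set of subscribers per key so that every mounted hook instance sharing that key is updated together.

    ```typescript
    // Rough sketch (not the actual WebUI code): all hook instances that share a
    // localStorage key are notified whenever one of them writes a new value.
    import { useCallback, useEffect, useState } from 'react';

    const listeners = new Map<string, Set<(v: string) => void>>();

    export function useLocalStorageGlobalState(key: string, defaultValue: string) {
      const [value, setValue] = useState<string>(
        () => localStorage.getItem(key) ?? defaultValue,
      );

      // Register this instance so other instances can push updates to it.
      useEffect(() => {
        const set = listeners.get(key) ?? new Set();
        set.add(setValue);
        listeners.set(key, set);
        return () => {
          set.delete(setValue);
        };
      }, [key]);

      const update = useCallback(
        (next: string) => {
          localStorage.setItem(key, next);
          // Notify every mounted hook instance using the same key.
          listeners.get(key)?.forEach((fn) => fn(next));
        },
        [key],
      );

      return [value, update] as const;
    }

    // Dark-mode hook built on top of the shared-state hook.
    export function useThemeMode() {
      return useLocalStorageGlobalState('themeMode', 'light');
    }
    ```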

    Bug fix PR

    1. Allowing signup without invitation token

    https://github.com/lablup/backend.ai-webui/pull/2046

    In the config.toml, when the allowSignupWithoutConfirmation option is set to true, users can sign up without an invitation token. However, when a user clicked the sign up button, an error occurred because the token value was undefined. In this PR, I modified it so that if allowSignupWithoutConfirmation is true, the token variable is not used. Additionally, previously users could modify other input values while the core was processing the data after clicking the sign up button, and the previous data remained when the dialog was closed and reopened. In this PR, I made it so that other input values cannot be entered while data is being processed, and the previous input values are cleared when the dialog is closed.
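
    A hedged sketch of the idea (component, prop, and function names are illustrative, and the assumed signup call is a stand-in for the real API): skip the token when allowSignupWithoutConfirmation is on, disable the inputs while the request is in flight, and reset the form when the dialog closes.

    ```tsx
    // Hypothetical sketch, not the actual WebUI signup dialog.
    import React, { useState } from 'react';
    import { Button, Form, Input, Modal } from 'antd';

    // Stand-in for the real signup API call.
    declare function signup(payload: Record<string, unknown>): Promise<void>;

    interface SignupDialogProps {
      open: boolean;
      allowSignupWithoutConfirmation: boolean;
      inviteToken?: string;
      onClose: () => void;
    }

    const SignupDialog: React.FC<SignupDialogProps> = ({
      open,
      allowSignupWithoutConfirmation,
      inviteToken,
      onClose,
    }) => {
      const [form] = Form.useForm();
      const [submitting, setSubmitting] = useState(false);

      const handleSignup = async (values: { email: string; password: string }) => {
        setSubmitting(true);
        try {
          await signup({
            ...values,
            // Only pass the invitation token when it is actually required.
            token: allowSignupWithoutConfirmation ? undefined : inviteToken,
          });
          onClose();
        } finally {
          setSubmitting(false);
        }
      };

      return (
        // afterClose resets the fields so stale values do not reappear on reopen.
        <Modal open={open} onCancel={onClose} afterClose={() => form.resetFields()} footer={null}>
          {/* Disabling the form blocks edits while the core is processing the request. */}
          <Form form={form} onFinish={handleSignup} disabled={submitting} layout="vertical">
            <Form.Item name="email" label="E-mail" rules={[{ required: true }]}>
              <Input />
            </Form.Item>
            <Form.Item name="password" label="Password" rules={[{ required: true }]}>
              <Input.Password />
            </Form.Item>
            <Button type="primary" htmlType="submit" loading={submitting}>
              Sign Up
            </Button>
          </Form>
        </Modal>
      );
    };

    export default SignupDialog;
    ```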

    2. Displaying the correct screen for the selected sub-tab on the user management page

    https://github.com/lablup/backend.ai-webui/pull/2055

    On the user management page, there are sub-tabs for displaying active users and deactivated users. However, if a user navigated to another page and then returned to the user management page, even though the sub-tab was set to inactive, the screen displayed the list of active users, causing confusion. This PR resolved that issue by remembering the current sub-tab when navigating to another page, and displaying the appropriate screen for that sub-tab when returning to the user management page.
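
    The exact mechanism isn't described in the PR summary above, but one simple way to remember a sub-tab across navigation is to read and write it through persistent storage, as in this hypothetical sketch:

    ```typescript
    // Hypothetical sketch: persist the selected sub-tab so that returning to the
    // user management page restores the matching list. The real WebUI may keep
    // this state elsewhere (e.g. a global store or the URL).
    import { useState } from 'react';

    type UserSubTab = 'active' | 'inactive';

    export function useUserSubTab() {
      const [tab, setTab] = useState<UserSubTab>(
        () => (sessionStorage.getItem('userSubTab') as UserSubTab | null) ?? 'active',
      );

      const select = (next: UserSubTab) => {
        sessionStorage.setItem('userSubTab', next);
        setTab(next);
      };

      return [tab, select] as const;
    }
    ```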

    Before (left) and after (right) the fix

    Extending the internship

    As I resolved issues, the 8-week period flew by, and it was time to wrap up the fall internship.

    Working at Lablup, my first job after being discharged from the military, was an important period for me. During the internship I was able to see my strengths and weaknesses, what skills I still need to develop, and how other developers work. The 2-month period felt very short, and since I had enjoyed the work so much, I wanted to keep going. So I expressed my desire to extend the internship to the lead, and we agreed to extend it for another 8 weeks until February. During the fall internship I had thought a lot about my weaknesses but hadn't found my strengths, so I set these 3 personal goals for the extension:

    1. Find my strengths during this period
    2. Read the documentation whenever I have time
    3. Work even harder, leaving no regrets

    Resolving issues and implementing features during the extended period

    The work during the extended period did not differ much from before. Without the onboarding process and installation, I could focus more on resolving issues.

    Feature Addition PRs

    1. Refactoring ErrorLogList

    https://github.com/lablup/backend.ai-webui/pull/2131

    I refactored the ErrorLog List, which was previously implemented using Lit elements, to React. This feature was the most satisfying issue for me, as I personally use it frequently after the refactoring.

    Before (left) and after (right) refactoring

    During the refactoring, new Search and Error filter features were added.

    Added Search feature (left) and Filter feature (right)

    2. Modal drag functionality

    https://github.com/lablup/backend.ai-webui/pull/2179

    I used the react-draggable library to add the ability to drag modals. By adding a Draggable prop to a modal, the behavior can be applied to any modal that needs it.

    Draggable Modal

    By clicking the icon on the left side of the modal title and moving the mouse, the modal can be moved to the desired position on the screen.
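
    Assuming an Ant Design modal (the WebUI wraps this in its own modal component and exposes it as a prop), the underlying pattern with react-draggable looks roughly like the sketch below; the handle selector and title are illustrative:

    ```tsx
    // Rough sketch of a draggable modal: only the handle element (the title area)
    // starts a drag, so buttons and inputs inside the modal stay clickable.
    import React, { useRef } from 'react';
    import { Modal } from 'antd';
    import Draggable from 'react-draggable';

    interface DraggableModalProps {
      open: boolean;
      onClose: () => void;
      children?: React.ReactNode;
    }

    const DraggableModal: React.FC<DraggableModalProps> = ({ open, onClose, children }) => {
      const nodeRef = useRef<HTMLDivElement>(null);
      return (
        <Modal
          open={open}
          onOk={onClose}
          onCancel={onClose}
          title={
            <div className="drag-handle" style={{ cursor: 'move' }}>
              User Information
            </div>
          }
          // Wrap the rendered modal in Draggable so the whole dialog moves with the handle.
          modalRender={(modal) => (
            <Draggable nodeRef={nodeRef} handle=".drag-handle">
              <div ref={nodeRef}>{modal}</div>
            </Draggable>
          )}
        >
          {children}
        </Modal>
      );
    };

    export default DraggableModal;
    ```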

    Currently, it is applied to the modal for viewing user information and the modal for changing user settings on the user management page, where you can try it out.

    While it is not being used in many places yet, I think this PR will be useful as more components and features are added.

    Bug fix PR

    1. Modifying Vfolder invitation permissions

    https://github.com/lablup/backend.ai-webui/pull/2143

    There was an issue where user permissions for group vFolders were not being updated. When trying to modify the permissions, the items were not displayed or selectable properly in the select element. Previously the items were rendered with option tags; I changed them to mwc-list-item and adjusted the overflow option to resolve the issue.

    Before (left) and after (right) the PR

    2. ResourceGroupSelect extending outside the card

    https://github.com/lablup/backend.ai-webui/pull/2166

    There was an issue where the ResourceGroupSelect value would be displayed outside the card if it was too large.

    Symptoms of the issue

    To resolve this issue, I set the max-width CSS on the Select component so that it cannot exceed the width of the card.

    Additionally, in this PR I added a search feature to the Select component, for which I used the useControllableValue hook from ahooks. The useControllableValue hook lets a component's value either be controlled by its parent through props or be managed internally when no such props are passed. While it was a simple PR, it took longer than expected since it was my first time using useControllableValue. I was able to resolve the issue with the help of the lead and another intern.
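
    A hedged sketch of the pattern (prop and option shapes are illustrative, not the actual ResourceGroupSelect): useControllableValue falls back to internal state when the parent does not pass value/onChange, and the max-width cap keeps the selected label inside the card.

    ```tsx
    // Rough sketch: a searchable Select whose value is controlled by the parent
    // when `value`/`onChange` are provided, and managed internally otherwise,
    // via ahooks' useControllableValue.
    import React from 'react';
    import { Select } from 'antd';
    import { useControllableValue } from 'ahooks';

    interface ResourceGroupSelectProps {
      value?: string;
      onChange?: (value: string) => void;
      options: { label: string; value: string }[];
    }

    const ResourceGroupSelect: React.FC<ResourceGroupSelectProps> = (props) => {
      const [value, setValue] = useControllableValue<string>(props);
      return (
        <Select
          showSearch
          value={value}
          onChange={(v) => setValue(v)}
          options={props.options}
          optionFilterProp="label"
          // Cap the width so a long selected label cannot push outside the card.
          style={{ maxWidth: '100%' }}
        />
      );
    };

    export default ResourceGroupSelect;
    ```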

    3. Key pair list not showing when clicking the generate & manage key pair button on the summary page

    https://github.com/lablup/backend.ai-webui/pull/2194

    On the summary page, there are buttons for "Generate New Key Pair" and "Manage Key Pairs." However, when clicking these buttons, instead of showing the key pair list, it simply navigated to the user management page, displaying the user list.

    "Generate New Key Pair" and "Manage Key Pairs" buttons on the summary page

    When clicking the "Generate New Key Pair" button (left) and when clicking "Manage Key Pairs" (right)

    While this issue was not critical, I resolved it because I had experienced a lot of confusion when I first used Backend.AI and didn't fully understand the key pair feature.

    After resolving this issue, I could confirm that the key pair list was displayed on the screen as intended.

    After resolving the issue, when clicking the "Generate New Key Pair" button (left) and when clicking "Manage Key Pairs" (right)

    Completing the Internship

    Thanks to the Contribution Academy, which I joined on a friend's recommendation after being discharged from the military, I was able to contribute at Lablup for an extended period. Since I had no previous internship or project experience at other companies, this was a very important time for me as I started over after my discharge. It was great to discover my strengths and weaknesses, the skills I still lack, and the culture of an open-source company at Lablup. How many companies make you want to go to work every day with their horizontal structure, free atmosphere, pleasant working environment, and good equipment? Even though I worked at Lablup for only 4 months, I looked forward to going to work every day, and I felt that at Lablup I could keep doing interesting, meaningful work for a long time. Over those 4 months, I also developed a fondness for Backend.AI, the platform Lablup provides, and I plan to attend Lablup's annual conference whenever possible to follow its progress and technologies.

    Lablup Office

    This post was also published on the author's personal blog: https://gee05053.tistory.com/32

    This post is automatically translated from Korean

    11 March 2024

  • 23.09: September 2023 Update

    By Lablup

    In the second half of 2023, we shipped 23.09, a major release of Backend.AI. 23.09 significantly enhances the development, fine-tuning, and operational automation of generative AI: AI models are now automatically scaled and load-balanced based on workload, support for various GPUs/NPUs has been expanded, and stability has been improved whether you manage a single node or 100-2000+ nodes. The team is working hard to squeeze every last bit of performance out of it. Here are the main improvements since the last [23.03 July update](/posts/2023/07/31/Backend.AI-23.03-update).

    Backend.AI Core & UI

    • The Backend.AI Model Service feature has been officially released. You can now use Backend.AI to more efficiently prepare environments for inference services as well as for training large models such as LLMs. For more information, see the Backend.AI Model Service sneak peek blog post.
    • Added the ability to sign in to Backend.AI using OpenID single sign-on (SSO).
    • If your kernel image supports it, you can enable the sudo command without a password in your compute session.
    • Support for Redis Sentinel without HAProxy. To test this, we added a --configure-ha option to the install-dev.sh script.
    • Added the ability to use the RPC channel between Backend.AI Manager and Agent for authenticated and encrypted communication.
    • Improved the CLI logging feature of Backend.AI Manager.
    • Fixed an issue where Manager could not make an RPC connection when Backend.AI Agent was placed under a NAT environment.
    • The Raft algorithm library, riteraft-py, will be renamed and developed as raftify.
    • Support for the following new storage backends:
      • VAST Data
      • KT Cloud NAS (Enterprise only)

    Backend.AI FastTrack

    • Improved UI for supporting various heterogeneous accelerators.
    • Deleting a VFolder now uses an independent unique ID value instead of the storage name.
    • Upgraded Django version to 4.2.5 and Node.js version to 20.
    • Added pipeline template feature to create pipelines in a preset form.
    • If a folder dedicated to a pipeline is deleted, it will be marked as disabled on the FastTrack UI.
    • Improved the process of deleting pipelines.
    • Added a per-task (session) accessible BACKENDAI_PIPELINE_TASK_ID environment variable.
    • Actual execution time per task (session) is now displayed.

    Contribution Academy

    In particular, the following code contributions during this period were made by junior developer mentees through the 2023 Open Source Contribution Academy organized by NIPA.

    • Created a button to copy an SSH/SFTP connection example to the clipboard.
    • Refactored several Lit elements of the existing WebUI to React.
    • Wrote various test code.
    • Found and fixed environment variable and message errors that were not working properly.

    Backend.AI is constantly evolving to provide a more powerful and user-friendly experience while supporting various environments in the AI ecosystem. Stay tuned for more updates!
    Make your AI accessible with Backend.AI!

    This post is automatically translated from Korean

    26 September 2023
