We had previously developed a robust application ecosystem for our client over many years, working with cross-functional teams in their digital R&D department. Researchers and clinicians used the applications to carry out research-related tasks such as ordering biological material or automating high-throughput (HTP) data analysis. Our client had an internal-facing LLM, but it could not understand this application environment or answer users' questions about it.
The Arrayo team used the client's LLM capabilities to interpret user text input, infer the user's intent, and generate a structured response outlining the sequential steps required in any of the client's applications. The team delivered a versatile LLM module with contextual understanding of the actions available within each application.
This initiative established a universal framework for supplying contextual information about the client's application ecosystem to any language model. The framework produces a consistent, predictable response that can be parsed and routed to the relevant endpoints, so that user-requested actions execute smoothly. This LLM module allowed our client to significantly reduce the time and effort spent training new users on the digital research ecosystem.
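The case study does not disclose implementation details, but the pattern it describes (prompt the LLM with application context, require a fixed response schema, then parse that schema and dispatch to an endpoint) can be sketched roughly as follows. All names here (the intents, the endpoint registry, the stubbed LLM call) are hypothetical illustrations, not the client's actual API.

```python
import json

# Hypothetical endpoint registry: maps an intent name from the LLM's
# structured response to a callable that would invoke the real API.
ENDPOINTS = {
    "order_material": lambda params: f"POST /orders with {params}",
    "run_analysis": lambda params: f"POST /analyses with {params}",
}

def stub_llm(user_text: str) -> str:
    """Stand-in for the client's internal LLM. In the real system the
    prompt would include the application context, and the model would be
    instructed to answer only in this fixed JSON schema."""
    return json.dumps({
        "intent": "order_material",
        "steps": ["look up catalog item", "create order", "confirm"],
        "params": {"item": "plasmid XYZ", "quantity": 2},
    })

def handle_request(user_text: str) -> str:
    """Parse the LLM's predictable JSON response and route it to the
    matching endpoint; unknown intents fall through safely."""
    response = json.loads(stub_llm(user_text))
    action = ENDPOINTS.get(response["intent"])
    if action is None:
        return "No matching application action for this request."
    return action(response["params"])

print(handle_request("Please order two units of plasmid XYZ"))
```

Because the response schema is fixed, the parsing and routing layer stays simple and testable, independent of which underlying language model produces the response.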