Engineering Services Company Utilizes Amazon Bedrock to Build Q&A System for Materials Science Knowledge


A pioneer and market leader in Integrated Computational Materials Engineering (ICME) created a SaaS offering that packaged all of the company’s IP and expertise into a client-accessible software platform. The company’s insights have enabled new products and increased performance across a range of industries, including automotive, aerospace, and medical devices.


The client’s customer team wanted to build a generative AI Q&A platform that customers could query for knowledge about materials science and engineering literature. Questions could range from information about specific products to specific properties of materials. The AllCloud team worked with the client’s materials science SMEs to build a solution capable of answering questions based on materials science research papers. Because of the domain specificity of materials science, prompting a foundation model directly would yield poor results, so AllCloud used retrieval-augmented generation (RAG) to retrieve relevant contextual information and achieve more accurate, relevant results with direct answer traceability.


The engagement began with a cost estimate of the services to be used, ensuring the client understood the AWS technology involved and the expected cost of their use case. In addition, high-level architectural documentation was assembled and reviewed with the customer’s internal data scientists to align on a joint solution best suited to the specific nature of the questions they were attempting to answer. The application was built using LangChain to coordinate RAG, Amazon OpenSearch Service as the vector database, and Streamlit to provide a chatbot interface for the RAG implementation.

On the backend, the solution uses foundation models (e.g., Titan Text Embeddings) in Amazon Bedrock to create text embeddings of materials science papers, which are indexed in Amazon OpenSearch Service. AllCloud also leverages LLMs in Bedrock (Command, Claude v2, and Jurassic-2 Ultra) to generate answers to questions.
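Conceptually, embedding-based retrieval ranks document chunks by vector similarity to the query. The toy sketch below illustrates the idea with a crude stand-in `embed` function and cosine similarity; the actual solution uses Bedrock Titan embeddings and OpenSearch’s k-NN index, and none of the names here come from the client’s code:

```python
import math

def embed(text: str) -> list[float]:
    # Stand-in for a Bedrock Titan embedding call: a crude
    # bag-of-characters vector, used only to make the example runnable.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank document chunks by similarity to the query embedding,
    # as a k-NN vector store (e.g., OpenSearch) would.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]
```

In production, the same shape holds: papers are embedded once at index time, and each query is embedded and matched against the stored vectors.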

The process a customer uses to engage with the solution is as follows: 

  1. A natural language query by a user is entered through the Streamlit web UI. 
  2. A semantic similarity search is performed between the query and materials science literature to identify relevant contextual text information. 
  3. The query and the retrieved context are passed to Bedrock LLMs for a final answer.
  4. The answer and source information are both rendered back to the user in the web UI.
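The four steps above can be sketched as a single pipeline function. `retrieve_context` and `invoke_llm` are hypothetical stand-ins for the OpenSearch similarity search and the Bedrock model call (not the client’s actual code), but the flow of query, context, answer, and sources mirrors the steps listed:

```python
def retrieve_context(query: str) -> list[str]:
    # Stand-in for the OpenSearch semantic similarity search (step 2);
    # a real implementation would embed the query and run a k-NN lookup.
    corpus = {"creep resistance": "Paper A, section 3: creep data for Ni superalloys."}
    return [text for topic, text in corpus.items() if topic in query.lower()]

def invoke_llm(prompt: str) -> str:
    # Stand-in for a Bedrock model invocation (step 3), e.g. via the
    # bedrock-runtime client in a real deployment; here it just echoes
    # the context portion of the prompt.
    return "Answer grounded in: " + prompt.split("Context:\n", 1)[1]

def answer(query: str) -> dict:
    # Steps 2-4: retrieve context, call the LLM with query + context,
    # and return both the answer and its sources for traceability.
    context = retrieve_context(query)
    prompt = f"Question: {query}\nContext:\n" + "\n".join(context)
    return {"answer": invoke_llm(prompt), "sources": context}
```

Returning the sources alongside the answer is what gives the UI its direct answer traceability in step 4.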

Now that V1 of the solution has launched, the next version of the platform will be expanded to include a broader scope of documents and prompts, making the tool even more valuable.


The initial phase of the project was an overwhelming success for the customer team. Originally scoped to support five different types of documents and various questions, the project ended up supporting over 90 different styles of text files for vector embeddings and accurately answering a significantly higher number of questions beyond the three outlined in the engagement.

In addition, the AllCloud team assisted the customer in embedding the solution into the production deployment of their larger application, making it available for internal use. Because the solution was verified by customer SMEs as it was developed, it already has buy-in from key stakeholders, has gained usage in production, and a follow-on engagement has been agreed upon to extend the solution to the company’s end customers. This value, accuracy, adoption, and innovation were achieved at a services cost that remains low.

With this phase of the project closed, the AllCloud team feels confident in the joint path forward and is thrilled that the solution has been rolled out with interested and engaged users. This solution puts the knowledge base at the fingertips of employees, providing quick reference to data that previously was time-consuming to find or required SMEs’ involvement.