Driving Next-gen EV Experience through GenAI Technology
[The content of this article has been produced by our advertising partner.]
Empowering brands to translate technological prowess into market success, Amazon Web Services (AWS) has worked hand in hand with BMW Group to create a one-of-a-kind in-console cloud assistant solution that maximizes product performance and efficiency, identifying operational bottlenecks and opportunities for improvement at all times. By offering secure encryption and open access to high-performing large language models (LLMs) through Amazon Bedrock, companies now have the freedom to experiment with and switch between models depending on the use case, with the peace of mind that no data is transferred beyond the BMW Cloud Room.
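The flexibility to switch models per use case can be pictured as a thin routing layer in front of Amazon Bedrock. The sketch below is purely illustrative: the use-case names and the mapping are assumptions, not details from BMW's system, and the actual Bedrock invocation is shown only as a comment.

```python
# Illustrative sketch of per-use-case model routing on Amazon Bedrock.
# The use cases and model choices below are hypothetical examples only.

MODEL_ROUTES = {
    "in_car_assistant": "anthropic.claude-3-haiku-20240307-v1:0",   # low latency
    "document_summarization": "anthropic.claude-3-sonnet-20240229-v1:0",
    "default": "amazon.titan-text-express-v1",
}

def pick_model(use_case: str) -> str:
    """Return the model ID configured for a use case, with a default fallback."""
    return MODEL_ROUTES.get(use_case, MODEL_ROUTES["default"])

# In a real deployment the selected model would be invoked through the
# Bedrock runtime, e.g. with boto3:
#   client = boto3.client("bedrock-runtime")
#   client.converse(modelId=pick_model(use_case), messages=[...])
```

Because only the model ID changes, swapping models for a given workload becomes a configuration change rather than a code change.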

Furthermore, scalability is of the essence for a global brand as well known as BMW Group, which must ensure that new products and solutions are ready to hit the market the moment they become available. Enabling business transformation and a smooth data migration from the Cloud Data Hub, AWS has processed a record-breaking 14.3 billion requests and 145 terabytes of traffic in a single day, scaling the technology across 22.3 million BMW vehicles to help the brand meet fierce customer demand. “Using Amazon Bedrock, we’ve been able to scale our cloud governance, reduce costs and time to market, and provide a better service for our customers,” noted Dr. Jens Kohl, Head of Offboard Architecture at BMW Group.

Given the sheer quantity of internal documents and the complexity of their content, which spans images, videos and text, it had long been a challenge for employees to locate relevant company or project information among millions of sources in the database. Information was often lost after the person in charge left the company, and new hires spent extra time and effort tracking down answers, creating unwanted workflow friction across the board. AWS therefore built a convenient digital assistant and related user applications that let employees ask questions in natural language and receive precise answers within seconds, resolving issues such as ordering a company car or finding the corresponding project manager.
Implementing a serverless and secure approach, the combined use of AWS Lambda, an event-driven compute service, and AWS Glue, a data integration service, made it possible to automate data migration and comprehensive data cleaning from multiple sources. To convert text into numerical embeddings for natural language processing, Amazon SageMaker was employed for ML model development, monitoring and deployment. The embeddings were then indexed and analyzed by Amazon OpenSearch Service, whose near-real-time and vector search capabilities produce answers to the initial question. In testing, the top three results contained all the necessary information, with the first result providing the best answer to the initial question.
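The retrieval step described above, embedding the question and running a vector search over document embeddings to surface the top matches, can be sketched in miniature. This is a toy illustration using cosine similarity in pure Python; in the pipeline described, the embeddings would come from a SageMaker-hosted model and the search would run in Amazon OpenSearch Service rather than in memory, and the document names and vectors below are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query_vec, documents, k=3):
    """Return the IDs of the k documents whose embeddings best match the query.

    `documents` is a list of (doc_id, embedding) pairs; a production system
    would delegate this ranking to a vector index such as OpenSearch k-NN.
    """
    ranked = sorted(documents,
                    key=lambda d: cosine_similarity(query_vec, d[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy corpus: hand-made 3-d "embeddings" for illustration only.
docs = [
    ("company-car-policy", [0.9, 0.1, 0.0]),
    ("project-manager-directory", [0.1, 0.9, 0.0]),
    ("travel-guidelines", [0.5, 0.5, 0.1]),
    ("cafeteria-menu", [0.0, 0.1, 0.9]),
]
query = [0.85, 0.15, 0.05]  # pretend embedding of "How do I order a company car?"
print(top_k(query, docs))   # best match first
```

Mirroring the result reported in the article, the most relevant document ranks first and the top three together cover the answer.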
Harnessing the team’s know-how and taking the company’s internal knowledge to a whole new level, the project played a pivotal role in expanding skill development for the Mercedes-Benz Consulting team. It is no easy feat to develop all-round solutions that are flexible, scalable and secure, which makes it all the more exciting that the company is looking to extend these solutions to workshops and showrooms. Dr. Gavneet Singh Chadha, management consultant at Mercedes-Benz Consulting, remarked, “With this AWS engagement and everything we’ve developed in the past year, we have moved our focus from doing projects to creating products.”
