Automating candidate screening processes

Gojob, a French HR-tech platform specializing in temporary staffing, uses Mistral AI to automate candidate screening. By integrating Mistral Large into its recruitment platform on Azure, Gojob runs a GenAI recruiting assistant that holds real-time conversations with potential candidates, handling pre-qualification and automating initial screening. This streamlines hiring, raises recruiter productivity, and improves the efficiency of matching candidates with job opportunities.
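
For illustration, a minimal sketch of a single pre-qualification turn against an Azure-hosted Mistral Large deployment using the azure-ai-inference SDK; the endpoint, key, and prompts are placeholder assumptions, not Gojob's actual configuration:

```python
# Hedged sketch: one screening turn with Mistral Large served from Azure.
# Endpoint, key, and prompts are illustrative placeholders.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_MISTRAL_ENDPOINT"],  # serverless Mistral Large endpoint
    credential=AzureKeyCredential(os.environ["AZURE_MISTRAL_KEY"]),
)

response = client.complete(
    messages=[
        SystemMessage(content=(
            "You are a recruiting assistant for a temp-staffing platform. "
            "Ask one pre-qualification question at a time (availability, "
            "location, certifications) and keep replies short."
        )),
        UserMessage(content="Hi, I'm interested in the warehouse night-shift role in Lyon."),
    ],
    temperature=0.3,
    max_tokens=200,
)

print(response.choices[0].message.content)
```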
14 AI use cases in Human Resources

micro1 uses Anthropic's Claude to conduct AI-powered technical interviews, automating over 3,000 interviews per day. Claude evaluates technical skills through dynamic conversations, providing consistent, unbiased assessments and creating anxiety-free interview experiences.
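
As a hedged illustration (micro1's actual pipeline is not public), a minimal sketch of scoring one interview exchange with the Anthropic Python SDK; the model name, rubric, and transcript are assumptions:

```python
# Hedged sketch: rating a technical-interview answer with Claude.
# Model name, rubric, and transcript are illustrative, not micro1's pipeline.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

transcript = (
    "Interviewer: How would you find duplicate rows in a SQL table?\n"
    "Candidate: I'd GROUP BY the candidate key columns and keep groups with COUNT(*) > 1."
)

message = client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=400,
    system=(
        "You are a technical interviewer. Rate the candidate's answer from 1-5 "
        "on correctness and depth, then suggest one follow-up question. "
        "Apply the same rubric to every candidate."
    ),
    messages=[{"role": "user", "content": transcript}],
)

print(message.content[0].text)
```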

Gojob developed Aglae, a virtual assistant powered by Azure OpenAI Service to automate candidate engagement and prequalification via text messaging. They implemented a system that leverages large language models and customized prompts to search a database of 2 million profiles and conduct natural language conversations, while personalizing the assistant for each recruiter. This integration streamlined the recruitment workflow by reducing screening time and allowing recruiters to focus on high-value interactions.
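
A minimal sketch of how an Aglae-style outreach text could be drafted with the Azure OpenAI SDK; the deployment name, recruiter persona, and the search_profiles() helper are hypothetical stand-ins for Gojob's internal systems:

```python
# Hedged sketch: drafting a pre-qualification SMS with Azure OpenAI.
# Deployment name, persona, and search_profiles() are hypothetical.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

def search_profiles(job_query: str) -> list[dict]:
    """Hypothetical lookup into the candidate database; returns matching profiles."""
    return [{"name": "Sami", "skills": "forklift, night shifts", "city": "Lille"}]

job = "Warehouse order picker, Lille, starts Monday, night shift"
candidate = search_profiles(job)[0]

completion = client.chat.completions.create(
    model="aglae-gpt4o",  # Azure deployment name (placeholder)
    messages=[
        {
            "role": "system",
            "content": (
                "You are Aglae, a recruiting assistant texting on behalf of recruiter Marie. "
                "Write a short, friendly SMS that asks one pre-qualification question."
            ),
        },
        {"role": "user", "content": f"Job: {job}\nCandidate profile: {candidate}"},
    ],
    max_tokens=120,
)

print(completion.choices[0].message.content)
```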

Skillfully, a public benefit corporation dedicated to creating a more meritocratic job market, uses Anthropic's Claude to power AI-driven job simulations that let candidates demonstrate their actual skills in realistic workplace scenarios. By leveraging Claude without complex fine-tuning, Skillfully generates a large volume of AI-powered job simulations weekly, transforming traditional hiring practices.

172 companies using Customer Agents

NVIDIA partnered with Google Cloud to enable on-premises agentic AI by integrating Google Gemini models with NVIDIA Blackwell platforms and Confidential Computing, ensuring data sovereignty and regulatory compliance for sensitive enterprise operations. The solution further optimizes AI inference and observability by deploying a GKE Inference Gateway alongside NVIDIA Triton Inference Server, NVIDIA NeMo Guardrails, and NVIDIA Dynamo to enhance secure routing and load balancing for enterprise workloads.
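
Of the components named above, NeMo Guardrails is the one most application developers touch directly; a minimal sketch of wrapping a chat model with the nemoguardrails package (the ./guardrails_config directory with config.yml and rail definitions is assumed, and this is not the NVIDIA/Google reference deployment):

```python
# Hedged sketch: adding NeMo Guardrails in front of a chat model.
# The ./guardrails_config directory (config.yml plus rail definitions) is assumed.
from nemoguardrails import LLMRails, RailsConfig

config = RailsConfig.from_path("./guardrails_config")
rails = LLMRails(config)

response = rails.generate(messages=[
    {"role": "user", "content": "Summarize the customer's open support tickets."}
])
print(response["content"])
```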

Intuit integrated Google Cloud’s Document AI and Gemini models into its GenOS platform to automate the autofill of ten common U.S. tax forms, including complex 1099 and 1040 forms. The solution extracts and categorizes data from uploaded documents, drastically reducing manual data entry for TurboTax customers. This integration streamlines tax preparation workflows and improves speed and accuracy.
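
A hedged sketch of the document-extraction step with the Google Cloud Document AI Python client; the project, location, and processor IDs are placeholders, and the mapping of extracted fields into TurboTax autofill is not shown:

```python
# Hedged sketch: extracting key/value pairs from an uploaded tax form
# with Google Cloud Document AI. Processor name and input file are placeholders.
from google.cloud import documentai

PROCESSOR_NAME = "projects/my-project/locations/us/processors/my-form-parser"

client = documentai.DocumentProcessorServiceClient()

with open("w2_sample.pdf", "rb") as f:
    raw_document = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

result = client.process_document(
    request=documentai.ProcessRequest(name=PROCESSOR_NAME, raw_document=raw_document)
)

# Form-parser output exposes detected key/value pairs per page.
for page in result.document.pages:
    for field in page.form_fields:
        name = field.field_name.text_anchor.content
        value = field.field_value.text_anchor.content
        print(f"{name.strip()}: {value.strip()}")
```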

Capgemini partnered with Google Cloud to develop industry-specific agentic AI solutions that automate customer request handling across multiple channels such as web, social, and phone. The implementation integrates Google Agentspace, Customer Engagement Suite, and Agent2Agent interoperability protocol into existing customer service infrastructures to enhance personalized support, call routing, and workflow automation. This advanced solution transforms customer experience by streamlining communications and enabling proactive engagement.

19 solutions powered by Mistral

BigDataCorp, a data analytics and consulting firm from Brazil, uses Mistral AI models hosted on Amazon Bedrock to let client businesses dive deeper into their data using natural language.
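
A minimal sketch of this kind of natural-language query against a Mistral model on Amazon Bedrock, using boto3's Converse API; the model ID, region, and table schema are illustrative, not BigDataCorp's implementation:

```python
# Hedged sketch: asking a Bedrock-hosted Mistral model a question about a dataset.
# Model ID, region, and schema are illustrative placeholders.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

schema = "orders(order_id, customer_id, total_brl, created_at), customers(customer_id, state)"
question = "Which Brazilian states generated the most revenue last quarter?"

response = bedrock.converse(
    modelId="mistral.mistral-large-2407-v1:0",  # placeholder Bedrock model ID
    messages=[{
        "role": "user",
        "content": [{"text": f"Tables: {schema}\nWrite a SQL query to answer: {question}"}],
    }],
    inferenceConfig={"maxTokens": 300, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```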

Lamini, an all-in-one LLM platform that lets enterprises build open LLMs on proprietary data, offers Mistral-7B as one of its most popular open base models. Lamini enables Fortune 1000 customers across industries to tune and deploy models in production efficiently, even on AMD GPUs at performance parity with NVIDIA GPUs. Mistral-7B's ease of use and high-quality results help customers move from proof of concept to production and deploy proprietary LLMs effectively.

Hugging Face introduced HuggingChat and HuggingChat Assistants, platforms that let users try out different open-source models and create customized assistants with unique personalities. By default, both platforms are powered by Mixtral 8x7B, providing users with high-quality, contextually relevant responses. Mixtral 8x7B is currently the most popular model used on HuggingChat, enhancing user engagement.
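
HuggingChat itself is a web UI, but the same underlying open model can be queried programmatically; a minimal sketch with the huggingface_hub client (a Hugging Face API token may be required, and the model ID is the public Mixtral 8x7B Instruct checkpoint):

```python
# Hedged sketch: querying Mixtral 8x7B Instruct via the Hugging Face Inference API.
# This illustrates the open model behind HuggingChat, not HuggingChat itself.
from huggingface_hub import InferenceClient

client = InferenceClient("mistralai/Mixtral-8x7B-Instruct-v0.1")

response = client.chat_completion(
    messages=[{"role": "user", "content": "Explain mixture-of-experts models in two sentences."}],
    max_tokens=200,
)

print(response.choices[0].message.content)
```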
159 AI use cases in Europe

Deutsche Bank developed DB Lumina, an AI-powered research agent built on Gemini and Vertex AI through a partnership with Google Cloud. The solution automates the creation of financial research reports by rapidly condensing extensive market data—such as converting a 400-page report into a three-page summary—thereby streamlining analysis workflows while maintaining rigorous data privacy standards.
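
A hedged sketch of the core summarization call with Gemini on Vertex AI; the project, region, model version, and input file are placeholders, and DB Lumina's actual pipeline and data-privacy controls are not public:

```python
# Hedged sketch: condensing a long research document with Gemini on Vertex AI.
# Project, region, model, and input file are illustrative placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="europe-west3")

model = GenerativeModel("gemini-1.5-pro")

with open("market_report.txt", encoding="utf-8") as f:
    report_text = f.read()

response = model.generate_content(
    "Condense the following market research into a three-page executive summary, "
    "keeping all figures and cited sources:\n\n" + report_text
)

print(response.text)
```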

Capgemini partnered with Google Cloud to develop industry-specific agentic AI solutions that automate customer request handling across multiple channels such as web, social, and phone. The implementation integrates Google Agentspace, Customer Engagement Suite, and Agent2Agent interoperability protocol into existing customer service infrastructures to enhance personalized support, call routing, and workflow automation. This advanced solution transforms customer experience by streamlining communications and enabling proactive engagement.

wealthAPI implemented a next-gen contract detection solution by integrating DataStax Astra DB on Google Cloud and leveraging Google Gemini models for AI-powered analysis. They deployed DataStax's vector search and real-time insights capabilities to scale contract detection across millions of users in less than three months, streamlining wealth management workflows by dramatically reducing response times and efficiently handling massive data volumes.
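
A minimal sketch of the vector-search half of such a setup with the astrapy client for Astra DB; the collection name, embedding helper, and vector dimension are assumptions, and the Gemini analysis step is omitted:

```python
# Hedged sketch: nearest-neighbour lookup of contract snippets in Astra DB.
# Collection name, embedding helper, and dimension are assumptions.
import os

from astrapy import DataAPIClient

client = DataAPIClient(os.environ["ASTRA_DB_APPLICATION_TOKEN"])
db = client.get_database(os.environ["ASTRA_DB_API_ENDPOINT"])
contracts = db.get_collection("contract_snippets")  # created beforehand with a vector dimension

def embed(text: str) -> list[float]:
    """Placeholder for a real embedding model (e.g. a Gemini embedding endpoint)."""
    return [0.0] * 768  # must match the collection's vector dimension

query_vector = embed("monthly broadband subscription, 24-month minimum term")

# Vector similarity search over stored contract embeddings.
for doc in contracts.find(sort={"$vector": query_vector}, limit=5):
    print(doc["_id"], doc.get("contract_type"))
```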