AI integration on mobile
OPPO integrated Google Cloud’s Vertex AI, AutoML, and Gemini large language model into its mobile devices to automate user feedback analysis, power AI Recording Summary features, and enable AI Toolbox functionalities such as AI Writer and AI Reply. They re-engineered their hardware platform, operating system, and third-party ecosystem to embed AI agents that optimize power consumption and reduce computing latency, streamlining mobile development workflows and enhancing user experience.
30% reduction in labor workload; 27% reduction in power consumption; 40% reduction in computing latency
7 AI use cases in Consumer Electronics
Lenovo integrated AI-powered Copilot into its Dynamics 365 Contact Center and Customer Service platforms to automate inquiry response handling and routine tasks. They implemented a unified, multilingual chat service where customer inquiries are processed in natural language and historical service data is leveraged to provide real-time resolution suggestions and generate detailed post-call summaries. This integration streamlined support workflows and enabled agents to focus on more complex customer issues.
Devoteam, an AI-driven tech consulting company, is rolling out 4,000 Gemini for Google Workspace licenses across all its business units. By empowering employees with Gemini's generative AI features, Devoteam enhances internal productivity and collaboration, enabling staff to work smarter, faster, and more creatively.
Hitachi embedded Microsoft’s generative AI tools—including Azure OpenAI Service, Copilot for Microsoft 365, and GitHub Copilot—into its Lumada Solutions and JP1 Cloud Services to automate mission-critical system development, enhance customer service with faster alert response times, and improve predictive maintenance for rail infrastructure. They implemented these integrations across multiple business workflows and launched a comprehensive training program for over 50,000 GenAI Professionals to upskill 270,000 employees, driving measurable improvements in productivity and operational efficiency.
166 companies using Customer Agents
Tidio integrated Anthropic’s Claude model to develop their Lyro AI agent, automating customer support interactions across both live chat and email channels. They implemented a network of specialized AI agents for conversation rating and summarization, plus a dynamic routing system that selects between the native Anthropic API and Google Cloud Vertex AI based on performance metrics, streamlining support workflows and enabling personalized product recommendations.
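Performance-based routing of this kind can be sketched as a small health-aware selector. The provider names, metric fields, and error threshold below are illustrative assumptions, not Tidio's actual implementation:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProviderStats:
    """Rolling health metrics for one model API endpoint."""
    latencies_ms: List[float] = field(default_factory=list)
    errors: int = 0
    calls: int = 0

    @property
    def avg_latency(self) -> float:
        return sum(self.latencies_ms) / len(self.latencies_ms) if self.latencies_ms else float("inf")

    @property
    def error_rate(self) -> float:
        return self.errors / self.calls if self.calls else 0.0

def pick_provider(stats: Dict[str, ProviderStats], max_error_rate: float = 0.05) -> str:
    """Prefer healthy endpoints; among those, pick the lowest average latency."""
    healthy = {name: s for name, s in stats.items() if s.error_rate <= max_error_rate}
    pool = healthy or stats  # if every endpoint is unhealthy, fall back to all
    return min(pool, key=lambda name: pool[name].avg_latency)
```

A selector like this lets the error-rate threshold take precedence over raw speed, so traffic shifts away from a fast but failing endpoint.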
Zendesk integrated OpenAI's models to create adaptive AI service agents that autonomously manage customer conversations and execute resolution tasks. They implemented a multi-agent architecture featuring task identification, conversational RAG, procedure compilation, and procedure execution agents integrated with existing support workflows through API calls and natural language procedure definitions, while providing real-time chain-of-thought visibility. This solution transitions from traditional intent-based bots to a hybrid model of scripted and generative reasoning, streamlining customer service processes.
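The four agent roles can be mirrored in a minimal stubbed pipeline. This is a sketch of the pattern only; the keyword classifier and dictionary knowledge base stand in for the LLM calls and RAG index a production system would use:

```python
def identify_task(message: str) -> str:
    # In the real system an LLM classifies the request; a keyword stub here.
    return "refund" if "refund" in message.lower() else "general"

def retrieve_context(task: str, kb: dict) -> str:
    # Stand-in for conversational RAG over support articles.
    return kb.get(task, "no matching article")

def compile_procedure(task: str, context: str) -> list:
    # Turns a natural-language procedure definition into explicit steps.
    return [f"review policy: {context}",
            f"execute resolution for '{task}'",
            "confirm outcome with customer"]

def run_pipeline(message: str, kb: dict) -> dict:
    trace = []  # surfaced to agents as chain-of-thought visibility
    task = identify_task(message)
    trace.append(f"identified task: {task}")
    context = retrieve_context(task, kb)
    trace.append(f"retrieved context: {context}")
    steps = compile_procedure(task, context)
    trace += steps
    return {"task": task, "steps": steps, "trace": trace}
```

Keeping the trace as an explicit return value is what makes each stage's reasoning inspectable by a human agent.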
Hebbia built Matrix, a multi-agent AI platform that orchestrates OpenAI models including o3‑mini, o1, and GPT‑4o to automate complex financial and legal research tasks. The platform decomposes intricate queries into structured analytical steps and integrates modules like OCR, hallucination validation, and artifact generation to process complete documents, creating an infinite effective context window. This solution streamlines due diligence, contract review, and market research workflows, drastically reducing manual processing time.
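The "infinite effective context window" idea rests on a map-reduce pattern: split the document into chunks that each fit a model window, analyze chunks independently, then merge the partial findings. A hedged sketch, with a string-count extractor standing in for a model call:

```python
from typing import Callable, List

def map_reduce_analyze(document: str,
                       extract: Callable[[str], int],
                       merge: Callable[[List[int]], int],
                       window: int = 200) -> int:
    """Split a long document into window-sized chunks, analyze each
    independently, then merge the partial results."""
    chunks = [document[i:i + window] for i in range(0, len(document), window)]
    partials = [extract(chunk) for chunk in chunks]  # each call fits one window
    return merge(partials)
```

Note that naive fixed-size chunking can split a match across a boundary; production systems typically chunk on structural units (pages, clauses) for exactly this reason.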
255 solutions powered by
704 Apps implemented an AI solution using Vertex AI and Gemini 1.5 Pro to automate and accelerate driver identity verification and safety monitoring. They integrated these AI models into their existing cloud infrastructure built on Firebase and Google Kubernetes Engine, centralizing real-time data for document validation and audio sentiment analysis. The system alerts the central monitoring team when risk-related language is detected, streamlining operational decision-making and enhancing security.
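The alerting step can be sketched as a transcript scanner. The phrase list and exact matching below are illustrative; the production system scores transcribed audio with an LLM sentiment analysis rather than string matching:

```python
# Illustrative risk vocabulary — not 704 Apps' actual phrase list.
RISK_PHRASES = {"help me", "robbery", "weapon"}

def detect_risk(transcript: str, phrases: set = RISK_PHRASES) -> dict:
    """Flag a ride transcript for the central monitoring team."""
    text = transcript.lower()
    matches = sorted(p for p in phrases if p in text)
    return {"alert": bool(matches), "matches": matches}
```

Returning the matched phrases, not just a boolean, gives the monitoring team context to triage the alert.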
Dataïads built an AI-powered “Post-Click Experience” system that automatically generates personalized landing pages by analyzing user context such as ad origin, product type, and behavior. The solution integrates the Google Ads API with Google Cloud managed services (Cloud Run and BigQuery) for automated scaling and controlled cost management, with plans to incorporate Vertex AI for further optimization. This implementation directly enhances ad campaign management and improves ecommerce conversion processes.
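Personalizing a page from ad context usually starts with the click parameters. A minimal sketch, assuming hypothetical parameter names (`product`, standard `utm_*` tags) rather than Dataïads' actual schema:

```python
from urllib.parse import parse_qs, urlparse

def landing_config(click_url: str) -> dict:
    """Derive a landing-page configuration from ad-click parameters."""
    query = {k: v[0] for k, v in parse_qs(urlparse(click_url).query).items()}
    return {
        "headline": f"Top picks in {query.get('product', 'our catalog')}",
        "source": query.get("utm_source", "direct"),
        "show_promo": query.get("utm_campaign", "").startswith("sale"),
    }
```

In a fuller system this config would feed a template renderer on Cloud Run, with BigQuery logging the click context for later analysis.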
Wited revamped its learning platform by partnering with Google Cloud and Axmos Technologies to migrate from legacy systems to a robust infrastructure using Cloud SQL, Cloud Storage, Compute Engine, and Google Kubernetes Engine, ensuring reliable scalability and stability for high user demand. They then integrated generative AI by deploying Gemini and Vertex AI to power Max AI—a 24/7 virtual assistant that supports students through real-time assistance and educational guidance—thereby streamlining support processes and enabling the team to focus on innovation.
59 AI use cases in Asia
LY Corporation leveraged OpenAI’s API to integrate advanced generative AI into its flagship services, including a GPT‑4o-powered LINE AI Assistant and GPT‑4 enhancements in Yahoo! JAPAN Search for summarizing reviews and generating travel plans. They also deployed SeekAI, an in-house productivity tool using RAG to rapidly retrieve information from internal documentation, streamlining employee inquiries and operations.
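The retrieval half of a RAG tool like SeekAI can be sketched with a toy scorer. Term overlap stands in here for the embedding similarity a production system would use; the function names and prompt format are assumptions:

```python
def overlap_score(query: str, doc: str) -> float:
    """Fraction of query terms found in the document (a stand-in for
    embedding similarity)."""
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def retrieve(query: str, docs: list, k: int = 2) -> list:
    """Return the k most relevant internal documents."""
    return sorted(docs, key=lambda doc: overlap_score(query, doc), reverse=True)[:k]

def build_prompt(query: str, docs: list) -> str:
    """Ground the model's answer in retrieved internal documentation."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this internal documentation:\n{context}\n\nQuestion: {query}"
```

The grounding instruction in the prompt is what keeps answers tied to internal documentation instead of the model's general knowledge.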
SK Telecom integrated Anthropic's Claude on the Amazon Bedrock platform to power both in-call assist and post-call processing solutions. They implemented a custom in-house RAG model combined with real-time document search and automated summarization, classification, and sentiment analysis to augment call center operations and support culturally nuanced customer interactions.
Mercari integrated OpenAI’s API with a multi-model approach to optimize product listings. Initially, GPT‑4 analyzed top listings offline while GPT‑3.5 Turbo provided real-time suggestions for active listings. Later, they shifted to GPT‑4o mini to automatically generate complete titles, descriptions, and category suggestions from uploaded photos, streamlining the seller listing workflow.
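The photo-to-listing step splits naturally into two stages: a multimodal model extracts product attributes from the image, then a deterministic step assembles the listing draft. A sketch of the second stage, with the attribute names and formatting entirely assumed:

```python
def draft_listing(attrs: dict) -> dict:
    """Assemble a seller's listing draft from product attributes; in
    production the attributes come from a multimodal model (e.g. GPT-4o mini)
    analyzing the uploaded photo."""
    title = f"{attrs['brand']} {attrs['item']} ({attrs['condition']})"
    description = (f"{attrs['brand']} {attrs['item']} in {attrs['condition']} "
                   f"condition. {attrs.get('notes', '')}").strip()
    return {"title": title, "description": description, "category": attrs["category"]}
```

Keeping assembly outside the model call makes the listing format consistent across sellers and cheap to change without re-prompting.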