Expanding AI capabilities for Odia language
Odia Generative AI is an open-source project aimed at expanding LLM capabilities for the Odia language, spoken by 40 million people in India. Building on Meta's Llama 2, OdiaGenAI has developed Odia Llama, a fine-tuned LLM for Odia, and is exploring applications such as AI chatbots and AI tutors, enhancing accessibility and digital inclusion for Odia speakers.
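For illustration only, here is a minimal sketch of how such a fine-tuned checkpoint could be loaded and queried with Hugging Face transformers; the model ID below is a placeholder, not an official OdiaGenAI release name.

```python
# Minimal sketch: generating Odia text with a fine-tuned Llama checkpoint
# via Hugging Face transformers. The model ID is a placeholder, not an
# official OdiaGenAI release name.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "OdiaGenAI/odia-llama-example"  # hypothetical model ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

prompt = "Translate to Odia: Welcome to the library."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding keeps the example deterministic; tune max_new_tokens as needed.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```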
19 AI use cases in Non-profit

Kenya Red Cross worked with Pathways Technologies to develop an AI-powered mental health chatbot called Chat Care using Azure AI services. The chatbot initiates conversations on mental health, suggests ways to help such as breathing exercises and in-person services, operates in English and Swahili, and is available 24/7. It expands the reach of mental health services without overstretching human counselors and helps overcome stigma.

Qatar Charity partnered with Netways to implement INNOV8 for Copilot Studio on a unified cloud-based call center platform built on Microsoft Azure and Dynamics 365 Customer Service, consolidating disparate communication channels into one integrated interface. They leveraged low-code tools such as Microsoft Power Apps and Azure Logic Apps alongside AI-driven automation via Microsoft Copilot Studio and Azure AI Services to streamline call handling processes, consolidate data management, and reduce IT maintenance complexity.

British Heart Foundation, Europe's largest funder of heart and circulatory disease research, is testing Microsoft 365 Copilot with about 300 employees. Staff are using the AI assistant to stay up to date on office communication, craft emails and documents, and search across the nonprofit's tech platforms. By delegating routine tasks to Copilot, they save time and can focus on their mission.
172 companies using Customer Agents

NVIDIA partnered with Google Cloud to enable on-premises agentic AI by integrating Google Gemini models with NVIDIA Blackwell platforms and Confidential Computing, ensuring data sovereignty and regulatory compliance for sensitive enterprise operations. The solution further optimizes AI inference and observability by deploying a GKE Inference Gateway alongside NVIDIA Triton Inference Server, NVIDIA NeMo Guardrails, and NVIDIA Dynamo to enhance secure routing and load balancing for enterprise workloads.

Intuit integrated Google Cloud's Document AI and Gemini models into its GenOS platform to automate the autofill of ten common U.S. tax forms, including complex 1099 and 1040 forms. The solution extracts and categorizes data from uploaded documents, drastically reducing manual data entry for TurboTax customers. This integration streamlines tax preparation workflows and improves speed and accuracy.
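Intuit's internal pipeline is not detailed here, but the extraction step generally follows the pattern sketched below using the Google Cloud Document AI Python client; the project, location, and processor IDs are placeholders.

```python
# Hedged sketch: extracting labeled fields from an uploaded form with the
# Google Cloud Document AI Python client (google-cloud-documentai). The
# project, location, and processor IDs are placeholders; this is not
# Intuit's actual GenOS pipeline.
from google.cloud import documentai

PROJECT_ID = "my-project"         # placeholder
LOCATION = "us"                   # Document AI region
PROCESSOR_ID = "my-processor-id"  # e.g. a form-parsing processor (placeholder)

client = documentai.DocumentProcessorServiceClient()
name = client.processor_path(PROJECT_ID, LOCATION, PROCESSOR_ID)

with open("uploaded_1099.pdf", "rb") as f:
    raw_document = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

result = client.process_document(
    request=documentai.ProcessRequest(name=name, raw_document=raw_document)
)

# Extracted entities can then be mapped onto tax-return fields downstream.
for entity in result.document.entities:
    print(entity.type_, "->", entity.mention_text)
```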

Capgemini partnered with Google Cloud to develop industry-specific agentic AI solutions that automate customer request handling across channels such as web, social, and phone. The implementation integrates Google Agentspace, Customer Engagement Suite, and the Agent2Agent interoperability protocol into existing customer service infrastructures to enhance personalized support, call routing, and workflow automation. The solution transforms customer experience by streamlining communications and enabling proactive engagement.
49 solutions powered by Meta

Roboflow uses Meta's Segment Anything Model (SAM) to enable users to automatically segment objects in images and videos, significantly reducing the time required to create training datasets for computer vision models.
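As a rough illustration of the underlying workflow, the sketch below runs Meta's publicly released segment-anything library in automatic mask generation mode, the kind of step a labeling tool can wrap to pre-annotate images; the checkpoint and image paths are assumptions.

```python
# Sketch of automatic mask generation with Meta's segment-anything library.
# Assumes the public ViT-H SAM checkpoint has been downloaded locally.
import cv2
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
mask_generator = SamAutomaticMaskGenerator(sam)

# SAM expects an RGB uint8 array of shape (H, W, 3).
image = cv2.cvtColor(cv2.imread("example.jpg"), cv2.COLOR_BGR2RGB)
masks = mask_generator.generate(image)

# Each mask dict carries a boolean segmentation map plus bbox, area, and scores,
# which can be converted into dataset annotations (e.g. COCO polygons).
for m in masks[:5]:
    print(m["bbox"], m["area"], round(m["predicted_iou"], 3))
```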

Untukmu.AI, an online gifting site in Indonesia, uses Meta's Llama 3.1 8B model with split inference processing to protect customer privacy. By running part of the AI model on customers' devices and the rest on their servers, they deliver personalized gift recommendations without accessing or storing personal data. This ensures customer privacy while still providing high-quality, tailored suggestions, enhancing trust and satisfaction.
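Untukmu.AI's exact split of Llama 3.1 8B is not described here, so the following is only a toy PyTorch sketch of the split-inference pattern: early layers run on the device, only intermediate activations cross the network, and the server finishes the forward pass.

```python
# Toy PyTorch sketch of split inference: early layers run on the user's
# device, only intermediate activations are sent over the network, and the
# remaining layers run server-side. This illustrates the general pattern,
# not Untukmu.AI's actual Llama 3.1 8B deployment.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in model: a small MLP split into a device part and a server part.
full_model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),   # runs on the customer's device
    nn.Linear(64, 64), nn.ReLU(),   # runs on the server
    nn.Linear(64, 8),
)
device_part = full_model[:2]
server_part = full_model[2:]

def run_on_device(private_features: torch.Tensor) -> torch.Tensor:
    # Personal data never leaves the device; only this activation tensor does.
    with torch.no_grad():
        return device_part(private_features)

def run_on_server(activations: torch.Tensor) -> torch.Tensor:
    # The server completes the forward pass without seeing the raw inputs.
    with torch.no_grad():
        return server_part(activations)

private_features = torch.randn(1, 16)  # e.g. on-device user signals
recommendation_scores = run_on_server(run_on_device(private_features))
print(recommendation_scores.shape)  # torch.Size([1, 8])
```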

CodeGPT, a popular coding assistant with over 1.4 million downloads, integrates Meta's Llama models to enhance developer productivity. By using Llama 3.2 (90B), CodeGPT helps developers not just generate code but also answer questions about their codebase, debug code, and onboard new team members. It includes a codebase graph mechanism that lets Llama understand entire repositories, allowing developers to effectively "talk" with their code. This integration leads to at least a 30% increase in productivity and accelerates onboarding from months to days.
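CodeGPT's codebase graph mechanism is not specified in detail here; the sketch below shows one simple form such a graph could take, a module-level import graph built with Python's ast module, which an assistant could traverse to choose relevant files as LLM context.

```python
# Hedged illustration of a "codebase graph": a module-level import graph
# built with Python's ast module. This is a generic sketch, not CodeGPT's
# actual mechanism.
import ast
from collections import defaultdict
from pathlib import Path

def build_import_graph(repo_root: str) -> dict[str, set[str]]:
    root = Path(repo_root).resolve()
    graph: dict[str, set[str]] = defaultdict(set)
    for path in root.rglob("*.py"):
        # Module name derived from the file path, e.g. pkg/mod.py -> pkg.mod
        module = path.relative_to(root).with_suffix("").as_posix().replace("/", ".")
        try:
            tree = ast.parse(path.read_text(encoding="utf-8"), filename=str(path))
        except SyntaxError:
            continue  # skip files that do not parse
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                graph[module].update(alias.name for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                graph[module].add(node.module)
    return dict(graph)

if __name__ == "__main__":
    for module, imports in build_import_graph(".").items():
        print(module, "->", sorted(imports))
```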
78 AI use cases in Asia

TCS partnered with Google Cloud to integrate advanced AI and generative AI capabilities into its retail service offerings. They launched the Google Cloud Gemini Experience Center at their Retail Innovation Lab in Chennai, enabling retail clients to ideate, prototype, and co-develop tailored AI solutions that optimize supply chain, warehouse receiving, customer insights, and content creation. The approach uses Vertex AI Vision to automate warehouse receiving and combines Vertex AI, Gemini 1.5 Pro, and speech-to-text to transform service centers.
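As a hedged illustration of the Gemini piece, the sketch below calls Gemini 1.5 Pro through the Vertex AI Python SDK to summarize a service-center transcript; the project, location, transcript, and prompt are placeholders, and the actual solution also involves Vertex AI Vision and speech-to-text.

```python
# Minimal sketch of calling Gemini 1.5 Pro through the Vertex AI Python SDK
# (google-cloud-aiplatform). Project, location, and the transcript are
# placeholders, not details from the TCS deployment.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # placeholders

model = GenerativeModel("gemini-1.5-pro")
transcript = "Customer: My order arrived damaged and I need a replacement..."
response = model.generate_content(
    "Summarize this service-center call and suggest a next action:\n" + transcript
)
print(response.text)
```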

LY Corporation leveraged OpenAI's API to integrate advanced generative AI into its flagship services, including a GPT-4o-powered LINE AI Assistant and GPT-4 enhancements in Yahoo! JAPAN Search for summarizing reviews and generating travel plans. They also deployed SeekAI, an in-house productivity tool using RAG to rapidly retrieve information from internal documentation, streamlining employee inquiries and operations.
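These integrations are described at the feature level; the sketch below shows the kind of Chat Completions call that review summarization implies, using the official openai Python SDK with an illustrative model choice, prompt, and sample reviews.

```python
# Hedged sketch of a review-summarization call with the official openai
# Python SDK. The prompt, model, and reviews are illustrative; LY
# Corporation's actual integration details are not described here.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

reviews = [
    "The ramen was rich but the wait was 40 minutes.",
    "Friendly staff, generous portions, a bit noisy at lunch.",
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "Summarize user reviews in 2-3 sentences."},
        {"role": "user", "content": "\n".join(reviews)},
    ],
)
print(response.choices[0].message.content)
```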

Physics Wallah developed 'Gyan Guru', a hyperpersonalized conversational study companion to address the unique academic and support needs of its 2 million daily users. The system was implemented by indexing over one million Q&As and ten million solved doubts in a vector database, then leveraging a Retrieval-Augmented Generation (RAG) approach integrated with Azure OpenAI to deliver individualized, context-aware responses. This integration streamlined various student interactions including academic queries, product-related issues, and general support, reducing reliance on human subject matter experts.
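Physics Wallah's production stack is not detailed here, so the following is a minimal RAG sketch with the Azure OpenAI client: embed the question, retrieve the most similar indexed Q&A by cosine similarity, and answer from the retrieved context. The in-memory index stands in for their vector database, and the endpoint and deployment names are placeholders.

```python
# Minimal RAG sketch with Azure OpenAI: embed a query, retrieve the most
# similar indexed Q&A by cosine similarity, and answer with the retrieved
# context. The in-memory index stands in for a real vector database, and the
# endpoint/deployment names are placeholders, not Physics Wallah's setup.
import numpy as np
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder
    api_key="...",                                          # placeholder
    api_version="2024-02-01",
)

EMBED_DEPLOYMENT = "text-embedding-3-small"  # placeholder deployment name
CHAT_DEPLOYMENT = "gpt-4o"                   # placeholder deployment name

corpus = [
    "Q: What is Ohm's law? A: V = I * R relates voltage, current, and resistance.",
    "Q: How do I reset my course password? A: Use the 'Forgot password' link.",
]

def embed(texts):
    resp = client.embeddings.create(model=EMBED_DEPLOYMENT, input=texts)
    return np.array([d.embedding for d in resp.data])

corpus_vectors = embed(corpus)

def answer(question: str) -> str:
    q_vec = embed([question])[0]
    sims = corpus_vectors @ q_vec / (
        np.linalg.norm(corpus_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    context = corpus[int(np.argmax(sims))]  # top-1 retrieval for brevity
    chat = client.chat.completions.create(
        model=CHAT_DEPLOYMENT,
        messages=[
            {"role": "system", "content": "Answer using only the given context."},
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return chat.choices[0].message.content

print(answer("What formula relates voltage and current?"))
```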