Explore how companies use AI agents
Turing is customizing Gemini Code Assist on their private codebase, giving their developers highly personalized, contextually relevant coding suggestions that have increased productivity by around 30 percent and made day-to-day coding more enjoyable.
Commerzbank is enhancing developer efficiency through Code Assist's robust security and compliance features.
Capgemini has been using Code Assist to improve software engineering productivity, quality, security, and developer experience, with early results showing productivity gains on coding tasks and more consistent code quality.
Replit developers will get access to Google Cloud infrastructure, services, and foundation models via Ghostwriter, Replit’s software development AI, while Google Cloud and Workspace developers will get access to Replit’s collaborative code editing platform.
ClearPoint integrated Meta's Llama 3 with Google's lint code scanner to automatically refactor more than 500 files in a large Kotlin codebase. With fewer than 500 lines of automation code, they produced a migration that runs from a single build command without human intervention, saving an estimated 11 engineering weeks and improving efficiency without risking IP exposure or vendor lock-in.
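To make the shape of such a pipeline concrete, here is a minimal sketch of lint-driven, LLM-assisted refactoring. It is not ClearPoint's actual code: it assumes a self-hosted Llama 3 server exposing an OpenAI-compatible chat endpoint at LLAMA_URL, and a hypothetical lint_report.json produced by an earlier lint run that maps file paths to lists of findings.

```python
# Hypothetical sketch: drive an LLM-assisted refactor from lint findings.
# Assumes a Llama 3 server with an OpenAI-compatible /v1/chat/completions
# endpoint; endpoint, model name, and report format are illustrative.
import json
import pathlib
import requests

LLAMA_URL = "http://localhost:8000/v1/chat/completions"  # assumed endpoint


def refactor_file(kt_path: pathlib.Path, lint_findings: list[str]) -> None:
    source = kt_path.read_text()
    findings_text = "\n".join(f"- {f}" for f in lint_findings)
    prompt = (
        "Rewrite this Kotlin file so that the following lint findings are resolved. "
        "Return only the complete updated file.\n\n"
        f"Findings:\n{findings_text}\n\nFile:\n{source}"
    )
    resp = requests.post(
        LLAMA_URL,
        json={
            "model": "llama-3-70b-instruct",  # assumed model name
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0,
        },
        timeout=300,
    )
    resp.raise_for_status()
    kt_path.write_text(resp.json()["choices"][0]["message"]["content"])


if __name__ == "__main__":
    # lint_report.json (assumed): {"app/src/Foo.kt": ["finding 1", ...], ...}
    report = json.loads(pathlib.Path("lint_report.json").read_text())
    for path, findings in report.items():
        refactor_file(pathlib.Path(path), findings)
```

In a build-integrated setup like the one described, a script of this kind would be wired into a single Gradle task so the whole migration runs from one command.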
Ironclad, a contract lifecycle management platform, uses OpenAI’s technology to power its AI assistant, which identifies and redlines contract irregularities. The integration allows legal teams to reduce contract redlining time from 40 minutes to 2 minutes, improving efficiency without replacing human oversight.
Harvey, an AI platform for law and professional services firms, partnered with OpenAI to build a custom-trained case law model. The model enables Harvey's platform to assist with complex legal reasoning, contract review, drafting legal documents, and case law research.
For legal professionals, the custom model has proven significantly more accurate and thorough than a general-purpose GPT model, reducing errors such as hallucinations and providing detailed, source-cited responses.
Robin AI uses Anthropic’s Claude to streamline contract review, accelerating legal workflows by as much as 8 to 10 times. By integrating Claude via Amazon Bedrock, Robin AI ensures the data security and trust that are critical to its legal users, allowing lawyers to work with AI as a co-pilot in contract negotiation and review.
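For orientation, the snippet below is a minimal sketch of calling Claude through Amazon Bedrock with the Converse API, the integration path mentioned above. The model ID, region, prompt, and input file are illustrative assumptions, not Robin AI's production setup.

```python
# Minimal sketch: contract-review prompt sent to Claude via Amazon Bedrock.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # assumed region

contract_text = open("contract.txt").read()  # placeholder input document

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example Claude model ID
    messages=[{
        "role": "user",
        "content": [{"text": (
            "Review the contract below and flag any unusual indemnification "
            "or liability clauses.\n\n" + contract_text
        )}],
    }],
    inferenceConfig={"maxTokens": 1024, "temperature": 0.0},
)

print(response["output"]["message"]["content"][0]["text"])
```

Routing the call through Bedrock keeps the contract data inside the customer's AWS environment, which is the security and trust point the case study emphasizes.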
JetBrains integrates OpenAI's API into its AI Assistant product, enhancing developer productivity by providing intelligent code suggestions, writing tests, generating documentation, and refactoring code. This AI Assistant has become JetBrains' fastest-growing product, with 77% of developers reporting increased productivity and 55% finding more time for engaging tasks.
Retool uses GPT-4 to help businesses quickly build AI-powered applications for various business processes, such as inventory management, customer support, and custom product experiences. The platform provides pre-built AI actions like text generation and chatbots, making it easier for companies to integrate AI with secure data access and work more efficiently.
GitLab uses Claude to power AI-driven features across their DevSecOps platform, including code generation, chat, and summarization tools. By integrating Claude, GitLab accelerates development while maintaining security and stability, ensuring privacy-first AI solutions across the software lifecycle.
MongoDB extends Mistral's performance and efficiency from the model layer to the data layer by securely unifying application data, metadata, and vector embeddings in a single platform. Developers can create rich, real-time AI applications across various use cases, from conversational AI to complex reasoning, gaining a competitive edge by building with Mistral AI and MongoDB.
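As a rough illustration of the pattern described, the sketch below stores a document, its metadata, and its vector embedding together in MongoDB and then queries them with an Atlas Vector Search aggregation stage. The embed() helper is a placeholder standing in for a call to Mistral's embedding model (e.g. mistral-embed, which returns 1024-dimensional vectors); the cluster URI, collection, and index names are assumptions.

```python
# Minimal sketch: unified document + embedding storage with vector search.
from pymongo import MongoClient


def embed(text: str) -> list[float]:
    # Placeholder for a Mistral embedding call (e.g. the mistral-embed model);
    # returns a dummy vector here so the sketch stays self-contained.
    return [0.0] * 1024


client = MongoClient("mongodb+srv://<user>:<password>@<cluster>/")  # assumed Atlas URI
articles = client["support"]["articles"]

# Store the document text, its metadata, and its embedding side by side.
body = "To reset your password, open Settings and choose Security..."
articles.insert_one({
    "title": "Resetting a password",
    "body": body,
    "embedding": embed(body),
})

# Semantic lookup via the $vectorSearch stage; assumes an Atlas vector index
# named "vector_index" has been created on the "embedding" field.
results = articles.aggregate([
    {"$vectorSearch": {
        "index": "vector_index",
        "path": "embedding",
        "queryVector": embed("How do I change my password?"),
        "numCandidates": 100,
        "limit": 3,
    }},
    {"$project": {"title": 1, "body": 1, "_id": 0}},
])

for doc in results:
    print(doc["title"])
```

Keeping embeddings next to the operational data means a conversational or reasoning application can retrieve context with one query instead of synchronizing a separate vector store.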
Researchers at UC Berkeley introduced RAFT (Retrieval-Augmented Fine-Tuning), applying it to Meta Llama 2 on Azure AI Studio. RAFT improves domain adaptation by fine-tuning the model on questions paired with retrieved documents, teaching it to draw on the relevant documents and ignore distractors. This approach benefits specialized applications by making Meta Llama 2 more versatile and adaptable to domain-specific tasks.
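To make the idea concrete, here is a minimal sketch of assembling RAFT-style training records: each question is paired with the "oracle" document that answers it plus distractor documents, and the target answer reasons over the oracle. Field names, the prompt layout, and the oracle-dropout rate are illustrative assumptions; the exact recipe and the Llama 2 fine-tuning setup on Azure AI Studio are not shown.

```python
# Sketch: build RAFT-style fine-tuning data (question + oracle/distractor
# documents + answer grounded in the oracle) as a JSONL file.
import json
import random


def make_raft_record(question, oracle_doc, distractor_docs, answer, p_drop_oracle=0.2):
    docs = list(distractor_docs)
    # Occasionally omit the oracle so the model also learns to cope when
    # retrieval misses the right document (dropout rate is illustrative).
    if random.random() > p_drop_oracle:
        docs.append(oracle_doc)
    random.shuffle(docs)
    context = "\n\n".join(f"[Doc {i + 1}] {d}" for i, d in enumerate(docs))
    return {
        "prompt": f"{context}\n\nQuestion: {question}",
        "completion": answer,  # reasoning-style answer that quotes the oracle doc
    }


records = [
    make_raft_record(
        question="What does clause 4.2 require?",
        oracle_doc="Clause 4.2 requires written notice within 30 days of breach.",
        distractor_docs=[
            "Clause 7 covers termination for convenience.",
            "Appendix B lists the fee schedule.",
        ],
        answer="Per [the clause 4.2 document], written notice is required within 30 days of breach.",
    ),
]

with open("raft_train.jsonl", "w") as f:
    for r in records:
        f.write(json.dumps(r) + "\n")
```

Fine-tuning on records like these is what teaches the model to cite the relevant retrieved document and disregard the distractors at inference time.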