Docker unveiled several new AI initiatives at its annual DockerCon event, aimed at streamlining the integration of AI and machine learning into developer workflows. The highlight was the launch of the GenAI Stack, which lets developers deploy a full-stack generative AI application in just a few clicks.
The GenAI Stack bundles preconfigured large language models and management tooling to get AI projects off the ground quickly. Neo4j acts as the default database, using graphs to uncover patterns and improve model accuracy, and the LangChain orchestration framework handles communication between models and applications. Together, these components give developers a preassembled foundation for generative AI development.
Here’s what’s included in the new GenAI Stack:
1. Preconfigured LLMs: The stack ships with preconfigured large language models (LLMs), such as Llama2, GPT-3.5, and GPT-4, to jumpstart AI projects.
2. Ollama management: Ollama simplifies running and managing open source LLMs locally, smoothing out the development workflow.
3. Neo4j as the default database: Neo4j serves as the default database, offering graph and native vector search capabilities. These help uncover data patterns and relationships, improving the speed and accuracy of AI/ML models. Neo4j also acts as long-term memory for these models.
4. Neo4j knowledge graphs: Neo4j knowledge graphs ground LLMs, yielding more precise GenAI predictions and outcomes.
5. LangChain orchestration: LangChain, a framework for developing applications powered by LLMs, coordinates communication between the LLM, your application, and the database, including its vector index (see the sketch after this list). This includes LangSmith, a new way to debug, test, evaluate, and monitor your LLM applications.
6. Comprehensive support: A range of tools, code templates, how-to guides, and GenAI best practices rounds out the stack, giving developers the guidance they need to get started.
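To make these pieces concrete, here is a minimal sketch of how the components typically connect: an Ollama-hosted Llama2 model, Neo4j acting as the vector store, and LangChain orchestrating retrieval-augmented question answering. The endpoints, credentials, and sample texts below are placeholders, and the exact class names depend on your LangChain version, so treat this as an illustration of the pattern rather than the stack's actual wiring.

```python
# Minimal sketch of the GenAI Stack pattern (illustrative, not the stack's own code):
# an Ollama-hosted LLM, Neo4j as a vector store, and LangChain gluing them
# together for retrieval-augmented question answering.
# Assumes the Ollama and Neo4j containers are running locally on their
# default ports; URLs, credentials, and model names are placeholders.
from langchain.llms import Ollama
from langchain.embeddings import OllamaEmbeddings
from langchain.vectorstores import Neo4jVector
from langchain.chains import RetrievalQA

llm = Ollama(base_url="http://localhost:11434", model="llama2")
embeddings = OllamaEmbeddings(base_url="http://localhost:11434", model="llama2")

# Store a few example documents in Neo4j's native vector index.
vector_store = Neo4jVector.from_texts(
    [
        "Docker Compose files describe multi-container applications.",
        "Ollama manages open source LLMs on your local machine.",
    ],
    embedding=embeddings,
    url="bolt://localhost:7687",
    username="neo4j",
    password="password",  # placeholder credential
)

# The retriever grounds the LLM's answers in the documents stored in Neo4j.
qa = RetrievalQA.from_chain_type(llm=llm, retriever=vector_store.as_retriever())
print(qa.run("What do Docker Compose files describe?"))
```

In the stack itself, these services run as preconfigured containers, so the wiring shown above is largely handled for you out of the box.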
The GenAI Stack is currently available in Early Access through the Docker Desktop Learning Center or on GitHub. Docker also encourages developers to showcase the AI/ML solutions they build on Docker by participating in the Docker AI/ML Hackathon.
Additionally, Docker announced Docker AI, its first AI-powered product, which draws on the collective wisdom of millions of Docker developers accumulated over the past decade. Docker AI generates context-specific recommendations to boost productivity when building applications; for example, it can suggest best practices when editing Dockerfiles or point to the most up-to-date container images.
Docker AI aims to meet developers where they are by integrating AI into existing workflows. According to Katie Norton of IDC, these tools mirror the 10x productivity gains of AI code generation by automating infrastructure configuration, letting developers spend more time on core application functionality and less on tooling.
Docker's announcements signal its intention to further democratize AI development as models rapidly proliferate and reach more developers. The GenAI Stack abstracts away infrastructure requirements so developers can focus on creating, while Docker AI draws on collective wisdom to guide them toward best practices. By removing this complexity, Docker is lowering the barriers to integrating AI models into everyday applications.
Docker CEO Scott Johnston emphasized the company's commitment to fostering innovation and making emerging technologies accessible. The early access launch of these AI products gives developers an opportunity to help shape the platform's future. With developer-friendly solutions, Docker aims to unleash creativity as AI becomes an essential part of modern applications, empowering developers of all skill levels to take advantage of its possibilities.