Imagine having the power to build AI applications exactly the way you envision them, without being boxed in by platform limitations or vendor restrictions. That’s exactly what bolt.diy brings to the table. Unlike its cloud-based sibling bolt.new, this revolutionary open-source platform puts you in the driver’s seat of your AI development journey.
Think of bolt.diy as your personal AI development workshop. Just as a master craftsperson carefully selects their tools and materials, bolt.diy lets you choose and configure any Large Language Model (LLM) that fits your needs. Whether you’re drawn to the powerful capabilities of OpenAI’s models or prefer the control and privacy of running your own local instances, bolt.diy adapts to your vision rather than forcing you to adapt to it.
What truly sets bolt.diy apart is its foundation built on four essential freedoms that every developer craves:
First, there’s extensibility—imagine having a universal adapter that connects to any AI model you choose. That’s what bolt.diy offers, letting you seamlessly integrate with any LLM provider, from industry giants to your own custom solutions.
Second comes portability, which works like a development passport. Your projects can freely move between your local machine and remote servers, making the transition from development to production as smooth as silk.
The third pillar is its open-source nature. Just as a thriving garden grows through collective care, bolt.diy flourishes through community contributions. You can peek under the hood, customize the engine, and even contribute improvements that benefit the entire developer community.
Finally, there’s the freedom from vendor lock-in. No more golden handcuffs—you’re free to switch between different AI providers or use multiple ones simultaneously, ensuring your project’s future remains in your hands.
Setting up your development environment with bolt.diy is like building with sophisticated yet straightforward building blocks. Let’s walk through each step:
1. Clone the repository and install its dependencies:

```shell
git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy
npm install
```

2. Create a `.env` file in the project root with the credentials for your chosen providers:

```shell
OPENAI_API_KEY=YOUR_OPENAI_API_KEY
MY_CUSTOM_LLM_ENDPOINT=https://api.your-custom-llm.com
```

3. Start the development server:

```shell
npm run dev
```
bolt.diy’s architecture is like a well-designed power grid, capable of connecting to multiple power sources simultaneously. This versatility manifests in several powerful ways:
Consider how you might connect your home to both solar panels and the city grid. Similarly, bolt.diy lets you seamlessly integrate with cloud providers while maintaining your local processing capabilities. You could be running preliminary tests on a local model while using OpenAI’s GPT-4 for production—all within the same application.
The platform’s flexibility shines when you need to create custom integrations. For instance, you might have a specialized model running on your company’s servers that needs to work alongside public APIs. bolt.diy handles these complex scenarios with grace.
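To make this concrete, here is a minimal sketch of routing requests between a local and a cloud backend. The `providers` registry and `routeRequest` helper are illustrative assumptions, not part of bolt.diy's actual API:

```javascript
// Hypothetical provider registry: each entry describes one backend.
const providers = {
  openai: { endpoint: 'https://api.openai.com/v1', model: 'gpt-4' },
  local: { endpoint: 'http://localhost:8080/v1', model: 'gpt-j' }
};

// Pick a backend per request: a cheap local model for experiments,
// a cloud model for production traffic.
function routeRequest(stage) {
  return stage === 'production' ? providers.openai : providers.local;
}

console.log(routeRequest('development').model); // gpt-j
console.log(routeRequest('production').model);  // gpt-4
```

The same pattern extends naturally to more backends, such as a specialized in-house model alongside a public API.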
The configuration system in bolt.diy is like a sophisticated control panel, giving you precise control over your AI infrastructure:
```javascript
export const config = {
  defaultProvider: 'openai',
  customProvider: {
    endpoint: process.env.MY_CUSTOM_LLM_ENDPOINT,
    apiKey: process.env.MY_CUSTOM_LLM_KEY
  },
  // Add custom parameters for your specific needs
  modelParameters: {
    temperature: 0.7,
    maxTokens: 2048
  },
  // Define fallback behavior
  fallback: {
    provider: 'local',
    modelPath: './models/fallback-model'
  }
}
```
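To show how the `fallback` field might come into play, here is a hedged sketch of provider resolution. The `resolveProvider` helper and the availability probe are illustrative, not bolt.diy's own code:

```javascript
// A trimmed-down config mirroring the shape shown above.
const config = {
  defaultProvider: 'openai',
  fallback: { provider: 'local', modelPath: './models/fallback-model' }
};

// Try the default provider first; fall back to the local model when the
// probe reports the default as unreachable.
function resolveProvider(cfg, isAvailable) {
  return isAvailable(cfg.defaultProvider) ? cfg.defaultProvider : cfg.fallback.provider;
}

console.log(resolveProvider(config, () => true));  // openai
console.log(resolveProvider(config, () => false)); // local
```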
Developing locally with bolt.diy is like having a fully equipped laboratory right at your fingertips. Here’s how to make the most of it:
1. Begin by selecting and installing your preferred model, whether it's an efficient open model like GPT-J or a specialized one trained on your own data.
2. Configure the model to communicate through a standardized endpoint, much like setting up a universal translator for different AI languages.
3. Point bolt.diy at your local setup, allowing rapid prototyping and experimentation without any external dependencies.
This approach gives you the freedom of a private workshop combined with the power of professional-grade tools. You can experiment freely, iterate quickly, and perfect your application before deploying it to the world.
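Many local model servers speak the OpenAI-compatible chat format, which is one way to realize that "standardized endpoint." The sketch below builds such a request payload; the `buildChatRequest` helper is an illustrative assumption, not a bolt.diy API:

```javascript
// Build a request body in the OpenAI-compatible chat completions format,
// which many local servers (e.g. llama.cpp's server, Ollama) also accept.
function buildChatRequest(model, prompt, params = {}) {
  return {
    model,
    messages: [{ role: 'user', content: prompt }],
    temperature: params.temperature ?? 0.7,
    max_tokens: params.maxTokens ?? 2048
  };
}

const req = buildChatRequest('gpt-j', 'Summarize this document.');
console.log(req.model); // gpt-j
```

Because the payload shape is identical for local and cloud backends, switching between them is a matter of changing the endpoint, not rewriting application code.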
When your creation is ready to meet the world, bolt.diy ensures a smooth transition to production:
1. Choose your hosting environment, whether it's a cloud provider like AWS or your own infrastructure.
2. Set up your production endpoints, configuring security and performance parameters carefully.
3. Deploy the application using familiar Node.js practices, with bolt.diy handling the complex AI interactions behind the scenes.
The platform’s architecture acts like a well-oiled machine, ensuring that what worked perfectly in development continues to shine in production.
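As one way to picture that development-to-production transition, the sketch below derives environment-specific settings from `NODE_ENV`. The `productionConfig` helper, its endpoint URLs, and its parameter choices are assumptions for illustration, not bolt.diy defaults:

```javascript
// Derive provider settings from the runtime environment — the same code
// path serves both local development and production deployment.
function productionConfig(env) {
  const isProd = env.NODE_ENV === 'production';
  return {
    defaultProvider: isProd ? 'openai' : 'local',
    endpoint: isProd
      ? 'https://api.openai.com/v1'
      : 'http://localhost:8080/v1',
    // Tighter token limits in production to control cost and latency.
    modelParameters: { temperature: 0.7, maxTokens: isProd ? 1024 : 2048 }
  };
}

console.log(productionConfig({ NODE_ENV: 'production' }).defaultProvider); // openai
```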
bolt.diy isn’t just a development platform—it’s a gateway to the future of AI application development. It stands as a testament to what’s possible when we combine the power of modern AI with the flexibility developers need to innovate.
Whether you’re building a groundbreaking AI application, experimenting with different language models, or scaling an existing solution, bolt.diy provides the perfect blend of power and freedom. It’s more than just a tool—it’s your partner in pushing the boundaries of what’s possible with AI.
The future of AI development is open, flexible, and powerful. With bolt.diy, that future is in your hands. What will you build?