
How to Create Your Chatbot Using the Free Ollama Model in Langflow: A Complete Guide

Wednesday, 6 August 2025

Welcome back to the Learn Tech Tips blog! Today, we'll explore how to use the free Ollama model in Langflow. If you're not familiar with Langflow, don't worry; we'll cover that too. By the end of this tutorial, you'll be able to build a complete chatbot using the Ollama model.


What is Langflow?

Langflow is a powerful tool designed to simplify the development of applications that utilize large language models (LLMs). It offers a user-friendly interface for integrating LLMs into your projects, enabling you to create chatbots, content generators, and more without extensive coding knowledge.


What is the Ollama Model?

The Ollama model is an open-source language model that can be easily integrated into applications for various tasks like text generation, conversation, and more. It's lightweight and free to use, making it an excellent choice for developers looking to incorporate AI capabilities into their projects.

Here is the Docker Compose file for Langflow. If you don't know how to start Docker, you can check the reference on my blog via the link below.

Check out the Langflow source code on GitHub: https://github.com/langflow-ai/langflow

services:
  langflow:
    image: langflowai/langflow:latest # or another version tag from https://hub.docker.com/r/langflowai/langflow
    pull_policy: always # set to 'always' when using the 'latest' image
    ports:
      - "7860:7860"
    depends_on:
      - postgres
    environment:
      - LANGFLOW_DATABASE_URL=postgresql://langflow:langflow@postgres:5432/langflow
      # This variable defines where the logs, file storage, monitor data and secret keys are stored.
      - LANGFLOW_CONFIG_DIR=app/langflow
    volumes:
      - langflow-data:/app/langflow

  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: langflow
      POSTGRES_PASSWORD: langflow
      POSTGRES_DB: langflow
    ports:
      - "5433:5432"
    volumes:
      - langflow-postgres:/var/lib/postgresql/data

volumes:
  langflow-postgres:
  langflow-data:
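The LANGFLOW_DATABASE_URL above is a standard PostgreSQL connection string (DSN). If Langflow ever fails to reach the database, a quick sanity check of the DSN's shape can be done with Python's standard library; this is just an illustrative snippet, not part of Langflow itself:

```python
from urllib.parse import urlparse

# The DSN from the compose file above. Note that "postgres" is the compose
# service name, so it only resolves inside the compose network.
dsn = "postgresql://langflow:langflow@postgres:5432/langflow"

parts = urlparse(dsn)
print(parts.scheme)    # postgresql
print(parts.username)  # langflow
print(parts.hostname)  # postgres
print(parts.port)      # 5432
print(parts.path)      # /langflow  (the database name, with a leading slash)
```

If any of these pieces come out wrong (for example the host is still "postgres" when you run Langflow outside Docker), that is usually the cause of a connection failure.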


Dockerfile for Langflow

FROM langflowai/langflow:latest

CMD ["python", "-m", "langflow", "run", "--host", "0.0.0.0", "--port", "7860"]


After starting it with the command docker compose up -d, you can access Langflow at http://localhost:7860. There you can build a flow for your agent; in this tutorial I will build a simple chatbot flow.

Next, go to Ollama, grab its Docker Compose file, and run it on your localhost:

services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
    restart: unless-stopped

volumes:
  ollama_data:
    name: ollama_data

Then start it with the command docker compose up -d.

Open http://localhost:11434 in your browser; if you see "Ollama is running", it has started successfully.

Now go inside the Ollama container with the command docker exec -it <container_id> bash and run ollama run qwen. (qwen is just one of the free models; see https://ollama.com/library for more model details. In this tutorial I will use qwen for the demo.)

Then open http://localhost:11434/api/tags. If you get a JSON response listing the qwen model, the download succeeded.
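The /api/tags endpoint returns JSON with a "models" list. As a quick sketch, here is how you could pull the model names out of that response in Python; the sample payload below is a trimmed, hypothetical example of the real shape:

```python
import json

# Trimmed example of what http://localhost:11434/api/tags returns
# after `ollama run qwen` has pulled the model (fields abbreviated).
sample = json.loads("""
{
  "models": [
    {"name": "qwen:latest", "size": 2345678901}
  ]
}
""")

def model_names(tags_response: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response."""
    return [m["name"] for m in tags_response.get("models", [])]

print(model_names(sample))  # ['qwen:latest']
```

If the list comes back empty, the model has not finished pulling yet; run ollama run qwen again inside the container.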

Now switch back to Langflow to build the chatbot demo.

First, drag an Ollama component onto the dashboard. You will see the component, but it cannot load the qwen model yet. To make it load smoothly, switch to the Python code view; I will show you an easy way to do it.


Click on the Ollama component and choose Code.

In the DropdownInput, update the options as shown below; this makes the component contain qwen as its default model:

DropdownInput(
            name="model_name",
            display_name="Model Name",
            options=[
                "qwen"
            ],
            info="Refer to https://ollama.com/library for more models.",
            refresh_button=True,
            real_time_refresh=True,
        ), 

Then, in the update_build_config function, comment out the section shown in my source code below. That section validates the model list by loading it from the local server; since we only use the qwen model and define it directly, we don't need it.


async def update_build_config(self, build_config: dict, field_value: Any, field_name: str | None = None):
    if field_name == "mirostat":
        if field_value == "Disabled":
            build_config["mirostat_eta"]["advanced"] = True
            build_config["mirostat_tau"]["advanced"] = True
            build_config["mirostat_eta"]["value"] = None
            build_config["mirostat_tau"]["value"] = None
        else:
            build_config["mirostat_eta"]["advanced"] = False
            build_config["mirostat_tau"]["advanced"] = False

            if field_value == "Mirostat 2.0":
                build_config["mirostat_eta"]["value"] = 0.2
                build_config["mirostat_tau"]["value"] = 10
            else:
                build_config["mirostat_eta"]["value"] = 0.1
                build_config["mirostat_tau"]["value"] = 5

    if field_name in {"base_url", "model_name"}:
        if build_config["base_url"].get("load_from_db", False):
            base_url_value = await self.get_variables(build_config["base_url"].get("value", ""), "base_url")
        else:
            base_url_value = build_config["base_url"].get("value", "")

        if not await self.is_valid_ollama_url(base_url_value):
            # Check if any URL in the list is valid
            valid_url = ""
            check_urls = URL_LIST
            if self.base_url:
                check_urls = [self.base_url, *URL_LIST]
            for url in check_urls:
                if await self.is_valid_ollama_url(url):
                    valid_url = url
                    break
            if valid_url != "":
                build_config["base_url"]["value"] = valid_url
            else:
                msg = "No valid Ollama URL found."
                raise ValueError(msg)

    # Commented out: this block loaded and validated the model list from the
    # local Ollama server. We hard-code "qwen" in the DropdownInput above,
    # so it is not needed here.
    '''
    if field_name in {"model_name", "base_url", "tool_model_enabled"}:
        if await self.is_valid_ollama_url(self.base_url):
            tool_model_enabled = build_config["tool_model_enabled"].get("value", False) or self.tool_model_enabled
            build_config["model_name"]["options"] = await self.get_models(self.base_url, tool_model_enabled)
        elif await self.is_valid_ollama_url(build_config["base_url"].get("value", "")):
            tool_model_enabled = build_config["tool_model_enabled"].get("value", False) or self.tool_model_enabled
            build_config["model_name"]["options"] = await self.get_models(
                build_config["base_url"].get("value", ""), tool_model_enabled
            )
        else:
            build_config["model_name"]["options"] = []
    if field_name == "keep_alive_flag":
        if field_value == "Keep":
            build_config["keep_alive"]["value"] = "-1"
            build_config["keep_alive"]["advanced"] = True
        elif field_value == "Immediately":
            build_config["keep_alive"]["value"] = "0"
            build_config["keep_alive"]["advanced"] = True
        else:
            build_config["keep_alive"]["advanced"] = False
    '''

    return build_config

After updating the code above, you will see qwen in the Ollama component. Perfect! Now you can build the flow.
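For context, the Ollama component is ultimately just calling Ollama's HTTP API. Here is a minimal hand-rolled sketch of the same request using only the standard library; the payload shape matches Ollama's /api/generate endpoint, and the actual network call is commented out so the snippet does not require a running server:

```python
import json
from urllib import request

def build_generate_payload(prompt: str, model: str = "qwen") -> dict:
    # Body expected by Ollama's /api/generate endpoint;
    # stream=False requests a single JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_generate_payload("Say hello in one sentence.")
req = request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Uncomment with Ollama running locally:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

This is useful for debugging: if this raw request works but the Langflow component does not, the problem is in the component configuration, not in Ollama.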

Click Share, then Embed into site, and you will get JavaScript code like this:



<script src="https://cdn.jsdelivr.net/gh/logspace-ai/langflow-embedded-chat@v1.0.7/dist/build/static/js/bundle.min.js">
</script>
<langflow-chat flow_id="3784e580-ac42-4328-b9f9-fc9528eca508" host_url="http://localhost:7860" window_title="Basic Prompting">
</langflow-chat>
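To try the widget without wiring it into an existing site, you can wrap it in a bare HTML page and serve it with Python's built-in server. This is just a local testing sketch; the flow_id below is the one from the snippet above, and yours will differ:

```python
from pathlib import Path

# Minimal page wrapping the embed snippet. flow_id must match your own flow.
page = """<!DOCTYPE html>
<html>
  <body>
    <script src="https://cdn.jsdelivr.net/gh/logspace-ai/langflow-embedded-chat@v1.0.7/dist/build/static/js/bundle.min.js"></script>
    <langflow-chat flow_id="3784e580-ac42-4328-b9f9-fc9528eca508"
                   host_url="http://localhost:7860"
                   window_title="Basic Prompting"></langflow-chat>
  </body>
</html>
"""

Path("index.html").write_text(page)
# Then serve it from the same folder:
#   python -m http.server 8000
# and open http://localhost:8000 in your browser.
```

Make sure Langflow is still running on port 7860, because the widget talks to host_url from the browser.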

Ta-da! Here is your free chat box built with Ollama and Langflow.


Thanks for reading. If you have any feedback or questions, leave a comment below and we can discuss them.
✋✋✋✋ Learn Tech Tips - I am Zidane, see you next time!

Copyright @2022(November) Version 1.0.0 - By Webzone, all things Tech Tips for Web Development Zidane
https://learn-tech-tips.blogspot.com