ServiceNow

Interested in integrating Generative AI or other AI use cases into ServiceNow? The guides below walk through implementing various use cases step by step.

Docker LLM Ollama

How to run an LLM on Docker using Ollama.

When I first considered adding a large language model (LLM) to a Docker image, I encountered a significant challenge: LLMs are enormous, and including them in a Docker container can drastically increase the container’s size. Moreover, running such a Docker image requires considerable system resources. After extensive research, I discovered a few key strategies to […]
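As a minimal sketch of what that setup can look like (assuming the official ollama/ollama image, Ollama's default port 11434, and a llama3 model, none of which are fixed requirements), the container is started and a model pulled with plain Docker commands, and the running server can then be queried from Python:

import json
import urllib.request

# Assumed setup (adjust image, name, and model to your environment):
#   docker run -d -p 11434:11434 --name ollama ollama/ollama
#   docker exec ollama ollama pull llama3

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default REST endpoint

def ask(prompt: str, model: str = "llama3") -> str:
    # Send a non-streaming generate request and return the model's text.
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    print(ask("Summarize why container image size matters for LLM deployments."))

Keeping the model weights in a mounted volume rather than baked into the image is one of the strategies that keeps the image itself small.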

Prompt Engineering

Mastering the Accuracy of ServiceNow LLM with Prompt Engineering.

“The quality of your output is only as good as the prompt that guides it.” While working on a multi-agent project using ServiceNow LLM, I realized that, even today, the accuracy of outputs from Large Language Models (LLMs) heavily depends on prompt engineering. In the paper arXiv:2404.11584, researchers evaluated various multi-agent architectures and […]
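As a small, hedged sketch of the kind of prompt structure this refers to (the role, output schema, and worked example below are illustrative assumptions, not a ServiceNow API), pinning down the role, the output format, and one example makes the model's answers far more predictable:

# Illustrative prompt-engineering sketch: the template wording is an assumption,
# not ServiceNow-specific. It constrains the model with a role, an explicit
# output format, and a single worked example (one-shot prompting).
PROMPT_TEMPLATE = """You are an ITSM assistant. Classify the incident below.

Return ONLY valid JSON with the keys "category" and "urgency" (1-3).

Example:
Incident: "VPN drops every few minutes when working remotely."
Answer: {{"category": "Network", "urgency": 2}}

Incident: "{incident}"
Answer:"""

def build_prompt(incident: str) -> str:
    # Replace double quotes so the incident text cannot break the JSON-style example.
    return PROMPT_TEMPLATE.format(incident=incident.replace('"', "'"))

print(build_prompt("Email attachments fail to upload for the whole finance team."))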
