Getting Splunk Data into AI using n8n

By Jay Young, Senior Splunk Consultant 

This blog covers integrating Splunk with n8n and using AI agents to extract and transform the data received from Splunk. I will be using a self-hosted n8n instance. n8n is a low-to-no-code environment that offers extensive connectivity to third-party applications. I will demonstrate how to set up the n8n credentials to connect to Ollama, OpenAI, and Splunk, and then explain the nodes needed to complete the workflow. This straightforward process can help automate daily tasks and has become a popular way to schedule AI agents for work in security, marketing, and information technology. 
 
Required software for this blog: 
1. A Splunk instance; preferably, you should use a Splunk test instance. 
2. n8n: You can choose between the cloud version and the self-hosted version. I am using the (free) self-hosted version. 
3. An AI connection via API using OpenAI ($) or a local Ollama instance (free).  
 
I suggest using Docker to install self-hosted n8n. 
n8n installation: https://docs.n8n.io/hosting/installation/docker/ 
 
Example: n8n Docker run command. 

docker run -d --name n8n-nodes -p 5678:5678 -e N8N_HOST=(Your_HOST) -e WEBHOOK_URL=(Your_HOST) -e GENERIC_TIMEZONE=America/Chicago -e N8N_SECURE_COOKIE=false -v /path/to/n8n_data_storage/n8n-data:/home/node/.n8n docker.n8n.io/n8nio/n8n 

 
Notice: Please make sure you set up a -v volume for your self-hosted n8n instance. Failure to do this will cause data loss when the container is restarted or upgraded. 

 
Example: Volumes 
-v /path/to/n8n_data_storage/n8n-data:/home/node/.n8n 
   (local directory : directory inside the Docker container) 

This maps a local directory outside the Docker container to a directory inside the container, so data persists and files can be exchanged between them. 


You can get the n8n workflow for this project from my GitHub Repo: 
n8n_splunk_work_flow

1. Create a new workflow in n8n. 
2. Download and import the JSON file into the workflow. If you copy the JSON text, you can right-click the workflow area and paste it into n8n. 
 
The OpenAI API can be used by purchasing credits with the API key. It is best to use a model like GPT-4o-mini or the newer GPT-4.1-mini; these models cost very little to use, and for this blog, $5 provides plenty of usage.   
ref: https://openai.com/index/openai-api/ 
ref: https://openai.com/api/pricing/ 

Ollama is free to use, but its models are not as powerful as the OpenAI models. If you choose this option, you will want to use models from Ollama that have tool capabilities for the best results.  
ref: https://ollama.com/download 
ref: https://ollama.com/search?c=tools 
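As a quick sanity check before wiring Ollama into n8n, you can talk to the local Ollama API directly. The sketch below (my own helper; it assumes the default port 11434 and uses an example model name) only builds the request body for Ollama's /api/chat endpoint; the actual call is commented out since it requires a running server:

```python
# Sketch of a direct request to a local Ollama server (default port 11434).
# The model name "llama3.2" is an example -- any tool-capable model works.

OLLAMA_BASE = "http://localhost:11434"

def chat_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return one complete response instead of a stream
    }

# With Ollama running (requires network access):
# import json, urllib.request
# body = json.dumps(chat_payload("llama3.2", "Hello")).encode()
# req = urllib.request.Request(f"{OLLAMA_BASE}/api/chat", data=body,
#                              headers={"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read())
```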
 
You can switch these model nodes in n8n simply by changing which one is connected to the AI Agent node. 


n8n Credentials: 
n8n Credentials are secure ways to store and manage authentication details (like API keys, usernames, passwords, tokens) that workflows need to connect to external services. Instead of embedding sensitive information directly in workflows, credentials are saved centrally and referenced when needed, enhancing security and reusability across multiple workflows. 
 
OpenAI Credentials Setup: 

1. Enter the API key you created with your OpenAI account. 
2. The Base URL is the OpenAI API endpoint. 
3. Click Save and test the connection.  

Ollama Credential Setup: 

1. Base URL: The host address for the Ollama Server. 
2. You may need to set OLLAMA_HOST=SERVER_IP:11434 as an environment variable in your shell profile or the Ollama service definition. 
 
Splunk Credentials: 

1. The Auth Token is the token you created under Splunk > Settings > Tokens. 
2. Base URL: The host address of the Splunk server, using port 8089. 
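Those two credential fields map directly onto a plain REST call: every request to the Splunk management port carries the token as a Bearer Authorization header. A minimal Python sketch (the host and token values below are placeholders, not real credentials):

```python
# Sketch of how the Splunk credential is used: every REST request to the
# management port (8089) carries the token in an Authorization header.
# SPLUNK_BASE and SPLUNK_TOKEN are placeholders -- substitute your own.

SPLUNK_BASE = "https://your-splunk-host:8089"   # management port, not 8000
SPLUNK_TOKEN = "YOUR_TOKEN_HERE"

def auth_headers(token: str) -> dict:
    """Build the Authorization header Splunk expects for token auth."""
    return {"Authorization": f"Bearer {token}"}

# Example use with the standard library (requires network access):
# import urllib.request
# req = urllib.request.Request(
#     f"{SPLUNK_BASE}/services/server/info?output_mode=json",
#     headers=auth_headers(SPLUNK_TOKEN),
# )
# print(urllib.request.urlopen(req).read())
```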

Gmail Credentials: https://cloud.google.com/endpoints/docs/openapi/enable-api 

  1. To use Gmail in a test environment, you will need a Gmail account and a connection to the Google Cloud Console to set up the (free) API connection and obtain the Client ID and Client Secret. The link above points to instructions for setting this up. These Google APIs can be used to connect to nearly any Google service, e.g., Sheets, Docs, Calendar, etc.  
  2. The OAuth redirect URL will populate with the N8N_HOST=(Your_HOST) value from the Docker run command. 
  3. The OAuth redirect URL will need to be added to the Google Cloud Console. 

Setting up the Splunk Token:

  1. Go to Splunk > Settings > Tokens. 
  2. Click “New Token” in the upper right corner. 
  3. Fill out the information on the new token panel. 
    • User: The Splunk user that will use the token. *Required* 
    • Audience: Purpose of the token. *Required* 
    • Expiration: Determines when the token expires. 
    • Not Before: Determines the token activation date.
  4. Click Create, then copy the token and save it. 

n8n node explanations:

OpenAI Model Node:
 

  1. Select the OpenAI credentials created earlier. 
  2. Select the OpenAI model. I suggest using the low-cost models (gpt-4o-mini or gpt-4.1-mini). 

Ollama Model Node: 

  1. Select the Ollama credentials created earlier. 
  2. Select the AI model you wish to use. I suggest the (llama3) models. 

 
Splunk Search Node: 

  1. Select the Splunk credentials created earlier. 
  2. Resource: Search 
  3. Operation: Create 
  4. Query: Your search query. *You must add "search" before "index=" or the search will error out.* 

Example Search: 
search index=WinEventLogs EventCode=* earliest=-2h  
| stats count as total by EventCode 
| where total < 5 
| table EventCode total 
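Since the node errors out without the leading search command, it can help to enforce that requirement programmatically before submitting a query. A small sketch (the helper name is my own) that prepends it when missing:

```python
def ensure_search_prefix(query: str) -> str:
    """Splunk's search endpoint expects a generating command first.
    Prepend 'search' unless the query already starts with one
    (an explicit 'search' or a '|' pipeline such as tstats)."""
    q = query.strip()
    if q.startswith("|") or q.startswith("search "):
        return q
    return "search " + q

print(ensure_search_prefix("index=WinEventLogs EventCode=* earliest=-2h"))
# search index=WinEventLogs EventCode=* earliest=-2h
```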

Splunk Search GetID Node: 

1. This node uses the search ID from the previous node to retrieve the search results. 
2. You can click and drag the ID tag on the left to the Search Job Field 

3. Select “By ID” and Drag the ID tag to the text box and it will add {{ $json.id }} 

Notice: The search ID number listed below the box will be identical to the ID tag you added to the text box. 

n8n wait node: set to 5 seconds.  

1. The wait node is set to 5 seconds here but may need to be adjusted depending on how long the query from the previous node takes to run. This gives the search job time to complete before its results are fetched.  
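An alternative to a fixed wait is to poll the job status until Splunk reports it finished. The sketch below shows only the status check itself; the sample JSON mirrors the (abridged) shape of Splunk's /services/search/jobs/&lt;sid&gt;?output_mode=json response:

```python
import json

def job_is_done(status_json: str) -> bool:
    """Return True when a Splunk search job has finished.
    Expects the JSON body of GET /services/search/jobs/<sid>?output_mode=json."""
    entry = json.loads(status_json)["entry"][0]
    return entry["content"]["dispatchState"] == "DONE"

# Abridged sample of the status payload Splunk returns:
sample = '{"entry": [{"content": {"dispatchState": "DONE"}}]}'
print(job_is_done(sample))  # True
```

In a real loop you would re-request the status every few seconds until this returns True, then fetch the results.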

Edit Fields Node: 

1. This node gets the results field from the previous Splunk Search GetID node. 
2. Mode: Set to Manual Mapping. 
3. You can click and drag the results tag on the left to the “fields to set>results”  

Notice: The array listed below the {{ $json.results }} box is a sample event from the results tag on the left. 
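Under the hood, this step is simply lifting the results array out of the job's results response. In plain Python it looks roughly like this (the sample event mirrors the stats output of the example search earlier):

```python
import json

def extract_results(body: str) -> list:
    """Pull the 'results' array out of a Splunk
    GET /services/search/jobs/<sid>/results?output_mode=json response."""
    return json.loads(body).get("results", [])

# Abridged sample matching the example search's stats output:
sample = '{"results": [{"EventCode": "4625", "total": "3"}]}'
events = extract_results(sample)
print(events[0]["EventCode"])  # 4625
```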

AI Agent Parse Events: 

1. The AI Agent Parses events of the {{ $json.results }} passed from the Edit Fields Node.  
2. The AI Agent uses a system prompt to process the events to the user’s specifications. 
3. You can write your own prompt or add the simple system prompt I provide below. 

System Prompt Used: 

You are a Splunk Administrator who reviews Windows EventCodes from log files. Please include the EventCode Number and write a brief one-sentence explanation of the EventCode.  

The severity Choices are as follows:

  • High
  • Medium
  • Low

Please order the severities from High to Medium to Low. 

Please highlight the High Severity in Red. 

Please highlight the Medium Severity in Orange. 

Please highlight the Low Severity in Green. 

Below the severity list, please provide more information on fixes or possible causes of the High-Severity Event Codes. 

Please use a list format and an HTML output with a black background. 
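Outside n8n, this prompt pattern reduces to a system/user message pair. A hedged sketch of assembling the chat request body the AI Agent node effectively sends (the model name and helper are illustrative, not the node's internals):

```python
import json

def build_chat_payload(system_prompt: str, events: list,
                       model: str = "gpt-4o-mini") -> dict:
    """Assemble an OpenAI-style chat request: the system prompt carries the
    reviewer instructions, the user message carries the Splunk events."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": json.dumps(events)},
        ],
    }

payload = build_chat_payload(
    "You are a Splunk Administrator who reviews Windows EventCodes...",
    [{"EventCode": "4625", "total": "3"}],
)
print(payload["messages"][0]["role"])  # system
```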

Gmail Node:  

1. To use an email service like Gmail, you must first set up the Google Cloud Console API for Gmail. 

Received Email From n8n: OpenAI Model

In the email above, the OpenAI agent color-coded the severity with red, orange, and green. It also provided a brief explanation of each event code, with a more detailed message about the High-Severity event code. 
 
Received Email From n8n: Ollama llama 3.2 Model 

In the email above, the llama 3.2 model color-coded the severity words with red, orange, and green, not the event IDs. It also briefly explained each event code, with a slightly less detailed message about the High-Severity event code. 
 
 
In closing, this same process can be applied to an IP address to show the location, country, and owner of the IP block using n8n. This tool is widely used in organizations for automation and repetitive tasks. 
Below is an example of the Open-WebUI front end and ChatGPT using the GPT-4.1-mini model. 


My next blog will cover sending IP addresses from Splunk to n8n to check location and IP Reputation. I will also cover the Model Context Protocol (MCP) and the different services you can use to integrate Splunk and its events with other third-party providers using AI. 

Follow TekStream for more technical walkthroughs like this.
Stay ahead of the curve with Splunk automation, AI integration, and real-world SOC modernization insights; read more on our blog page.

About the Author

Jay Young has been involved in Information Technology for 31 years, working with Internet and Agriculture technologies. For 15 years, he worked with and designed Oracle Databases used nationally and internationally. Jay has held IT management and development roles. For the last six years, he has focused his expertise on Splunk, Splunk Cloud, AWS, Data Onboarding, and Enterprise Security. Jay holds a bachelor’s in computer information systems and earned top Splunk certifications: Splunk Core Consultant (Recertified Oct 2023), Admin, and Architect. He also has accreditations in Enterprise Security. Jay currently resides in Abilene, Texas.