GETTING STARTED WITH SPLUNK
By Sanjib Dhar
Jan 09, 2014
http://techniqpatch.blogspot.in
Splunk overview
Splunk Enterprise is the industry-leading platform for operational intelligence.
Collect and index any machine data from virtually any source in real time.
Search, monitor, analyze, and visualize data to gain new insights and intelligence.
Why Splunk?
Operational Intelligence for IT and the Business
Powerful Analytics for Every User
Can be used for troubleshooting, security incident investigations, network monitoring, compliance reporting, business analytics, etc.
Data Sources for Splunk
Splunk collects and indexes any machine-generated data from virtually any source or location in real time.
Data can come from custom applications, application servers, web servers, databases, networks, virtual machines, telecoms equipment, operating systems, sensors, and much more.
Splunk Has Two Primary Components
Indexing and Search Services (Indexer)
Data Collection and Forwarding (Forwarder)
Installation
Component System IP
Splunk Indexer 192.168.***.***
Splunk Forwarder(1) 192.168.***.***
Splunk Forwarder(2) 192.168.***.***
Installation Steps (Indexer on CentOS)
RPM file download link: http://www.splunk.com/download
Command to install the package: rpm -i splunk-6.0.1-189883-linux-2.6-x86_64.rpm
This installs Splunk in the /opt/splunk directory.
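After installing the RPM, Splunk has to be started once before it can be used. A typical first-start sequence with the Splunk CLI looks like the following (the --accept-license flag is optional and simply skips the interactive license prompt):

```shell
# Start Splunk for the first time; --accept-license skips the interactive prompt
/opt/splunk/bin/splunk start --accept-license

# Optionally configure Splunk to start automatically at boot
/opt/splunk/bin/splunk enable boot-start
```

Once started, Splunk Web is reachable on port 8000 of the indexer by default.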
Indexer configuration
Add Data: The following local log files are added for indexing and monitoring from Splunk Web:
> monitoring.log
> engine.log
> ingest.log
> alarms.log
Config example, /opt/splunk/etc/system/local/inputs.conf:
[monitor:///sboss/log/monitoring.log]
disabled = false
index = main
sourcetype = monitoring
Indexer configuration (cont.)
Receiver: The receiver is configured on the indexer through Splunk Web. A receiving port (e.g. 9997) is specified; the indexer listens on that port and indexes the events received through it.
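The same receiving port can also be enabled from the command line on the indexer instead of through Splunk Web; a minimal sketch using the Splunk CLI:

```shell
# Enable receiving of forwarded data on TCP port 9997 (run on the indexer)
/opt/splunk/bin/splunk enable listen 9997
```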
Installation Steps (Universal Forwarder)
For Windows, installer file download link:
http://www.splunk.com/download/universalforwarder (splunkforwarder-6.0.1-189883-x86-release.msi)
Double-click the installer and follow the steps.
For CentOS, RPM file download link:
http://www.splunk.com/download/universalforwarder
The following command is used for installation: rpm -i splunkforwarder-6.0.1-189883-linux-2.6-x86_64.rpm
Forwarder configuration
Data is added to the forwarder by modifying inputs.conf:
[monitor://C:\\sboss\\log\\tb.log]
index = main
disabled = false
sourcetype = TB_LOG
The outputs.conf of the forwarder is configured to forward data to the indexer:
[tcpout-server://192.168.***.***:9998]
sendCookedData = false
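The forward destination can also be registered with the forwarder's own CLI instead of editing outputs.conf by hand; a sketch (the IP is masked as in the slides, and /opt/splunkforwarder is the default universal forwarder install directory):

```shell
# Point the universal forwarder at the indexer's receiving port
/opt/splunkforwarder/bin/splunk add forward-server 192.168.***.***:9998

# Verify the configured destination
/opt/splunkforwarder/bin/splunk list forward-server
```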
Index
The index is the repository for Splunk Enterprise data. Splunk Enterprise transforms incoming data into events, which it stores in indexes.
By default, all events are stored in the main index. Additional indexes can be created to hold specific events defined in the configuration.
How to create Index
Log in to Splunk Web. Click the Settings menu. Select ‘Indexes’ from the ‘Data’ section. Click the New button. Enter the index name. Click Save.
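An index can also be created from the command line rather than through Splunk Web; a minimal sketch (the index name app_logs is a hypothetical example):

```shell
# Create a new index named "app_logs" from the CLI
/opt/splunk/bin/splunk add index app_logs
```

Events can then be routed into it by setting index = app_logs in the relevant inputs.conf stanza.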
How to create Index (cont.)
Searching Events
Log in to Splunk Web. Click the ‘Search’ link on the Splunk Home page. Write the search query. Click the ‘All time’ button that appears beside the search timeline and set the time range. Click the Search button.
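As an example, a search over the log files indexed earlier might look like the following (the sourcetype comes from the inputs.conf stanza shown before; the ERROR keyword filter is only an illustration):

```spl
index=main sourcetype=monitoring ERROR
```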
Searching Events (cont.)
Fields In Events
Fields are searchable name/value pairs in event data. All fields have names and can be searched by those names. Example:
host=foo
Field Extraction
Field extraction is done in one of two ways: automatic field extraction or manual field extraction.
Manual Field Extraction
Click the right arrow on a search result event. Click Event Actions. Select Extract Fields. Enter example field values in the text box, one per line. Click the Generate button. Click Save.
Manual Field Extraction (cont.)
Reports
Whenever a search is performed, it can be saved as a report for later use.
Steps to create a Report
Log in to Splunk Web. Click the ‘Search’ link on the Splunk Home page. Write the query in the search field. Click ‘All time’ to apply a time frame. Click the search icon. Click ‘Save As’. Select ‘Report’.
Steps to create a Report (cont.)
Alerts
Splunk alerts are based on reports that run on a regular interval over a set historical time range or in real time.
Types:
> Alerts based on real-time searches that trigger every time the base search returns a result.
> Alerts based on historical searches that run on a regular schedule.
> Alerts based on real-time searches that monitor events within a rolling time "window".
Steps to Create Alert
Run a search. Click the Save As button that appears above the search timeline. Select Alert to open the Save As Alert dialog. Give the alert a Name and, optionally, a Description. Select the Alert Type you want to configure: Real-time or Scheduled. Click Next. Configure Actions. Click Save.
Steps to Create Alert (cont.)
Alerts Example
Error in Monitoring log: This alert is scheduled to run every day at 1:00 AM. It checks the count of ERROR entries in monitoring.log for the last 24 hours. If the count is more than 10, it sends a mail to the specified user with the search result attached as a PDF.
The query used: index=main sourcetype="monitoring" LOG_LEVEL=ERROR
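An alert like this one can also be expressed directly as a savedsearches.conf stanza instead of being created through Splunk Web. The following is only a sketch: the stanza name and e-mail address are hypothetical, and the exact attribute set may vary between Splunk versions:

```ini
# /opt/splunk/etc/system/local/savedsearches.conf (illustrative sketch)
[Error in Monitoring log]
search = index=main sourcetype="monitoring" LOG_LEVEL=ERROR
cron_schedule = 0 1 * * *
dispatch.earliest_time = -24h
dispatch.latest_time = now
alert_type = number of events
alert_comparator = greater than
alert_threshold = 10
action.email = 1
action.email.to = user@example.com
action.email.sendpdf = 1
```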
Generated Alert
When the alert runs at the specified time, it generates the following PDF.
Integration with Other app
A standalone application can also be created using the Splunk SDK for JavaScript and Node.js. It fetches the required data from Splunk and displays it in its own UI; the SDK can also be integrated with other applications.
http://192.168.***.***:6969/examples/browser/testApp.html
Thank you!