Channel: Questions in topic: "universal-forwarder"
Viewing all 1551 articles
Browse latest View live

Will universal forwarder installs only work on specific OS versions?

I was told that it didn't matter what version of the Universal Forwarder I installed on my servers. Does it matter that much? If I have Server 2003, 2008, or 2012, can they all use the same version of the forwarder?

Why does uninstalling a deployment app from a server class uninstall the app from all server classes?

I was doing some testing where I set up a UF on a test machine. I created a test endpoint server class and added my test host to it. Then I deployed some existing apps (via the deployment server) to the test UF host. Some of the apps I deployed to the test host were also installed on existing servers. When I uninstalled an app from my test server class, that caused the app (also deployed to clients in other server classes) to become disabled. Is this a known issue or some sort of glitch? Where is this documented? Thank you
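For reference, an app can legitimately be mapped to several server classes in serverclass.conf, and removing the mapping from one class should leave the other mappings intact. A minimal sketch with hypothetical class, host, and app names, to illustrate the intended layout:

```ini
# serverclass.conf on the deployment server (hypothetical names):
# appA is mapped to two server classes independently; deleting only
# the [serverClass:test_endpoints:app:appA] stanza should not affect
# the prod_endpoints mapping
[serverClass:prod_endpoints]
whitelist.0 = prodhost*

[serverClass:prod_endpoints:app:appA]
restartSplunkd = true

[serverClass:test_endpoints]
whitelist.0 = testhost01

[serverClass:test_endpoints:app:appA]
restartSplunkd = true
```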

Is there any advantage to sending data from UFs to an intermediate HF instead of directly to indexers?

Is there any advantage to sending data from UFs to an intermediate HF instead of directly to indexers? I recall reading that by relaying data UF > HF > indexer there are certain advantages (e.g. running scripts on the HF) before indexing, but with regard to DR, will the HF store events for delayed transmission if the indexers go down or the connection to the indexers goes down? I know there is a link on this topic but unfortunately I cannot find it. Thank you
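On the buffering part of the question: a forwarder's output queue is bounded, so an intermediate HF can absorb short indexer outages, but it is not a long-term store unless persistent queues are configured on its inputs. A hedged outputs.conf sketch for the HF (host names and queue size are placeholders, not recommendations):

```ini
# outputs.conf on the intermediate heavy forwarder — a sketch:
# useACK plus a larger queue lets the HF buffer briefly while the
# indexers are unreachable; once the queue fills, the HF blocks
[tcpout:indexers]
server = idx1.example.com:9997, idx2.example.com:9997
useACK = true
maxQueueSize = 500MB
autoLBFrequency = 30
```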

Do TLS/SSL and CipherSuite configs on the Indexer force autonegotiation with the Universal Forwarders?

If a Splunk forwarder is configured with the default TLS/SSL settings in its .conf files as below, and the indexer/intermediate forwarder is configured as below, will the universal forwarder be forced to auto-negotiate the TLS version and cipher suite based on the more limited TLS versions and cipher suites available in the indexer/intermediate forwarder's settings? Also, do the inputs.conf or server.conf SSL/cipher-suite configurations on the indexer/intermediate forwarder control the version/cipher that is used for accepting data from the universal forwarders?

Universal forwarder's configs:

outputs.conf:
[tcpout:my_group]
sslVersions = *,-ssl2

server.conf:
[sslConfig]
sslVersions = *,-ssl2

Indexer/intermediate forwarder's configs:

inputs.conf:
[SSL]
sslVersions = *,-tls1.1,-tls1.0,-sslv3,-sslv2
cipherSuite = ALL:!aNULL:!eNULL:!LOW:!EXP:RC4+RSA:+HIGH:+MEDIUM

server.conf:
[sslConfig]
cipherSuite = TLSv1.2:!eNULL:!aNULL
sslVersions = *,-tls1.1,-tls1.0,-sslv3,-sslv2
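Generally speaking, TLS version and cipher are negotiated per connection, so the intersection of what both sides allow is what gets used, and the receiving side's inputs.conf [SSL] settings govern the forwarding port. One way to check what a receiving port actually accepts (a sketch; the host name is a placeholder and 9997 is the conventional forwarding port):

```shell
# Should succeed if the input still allows TLS 1.2
openssl s_client -connect indexer.example.com:9997 -tls1_2

# Should be rejected if tls1.0 is disabled on the input
openssl s_client -connect indexer.example.com:9997 -tls1
```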

Input & Outputs file conf for SSL encryption

Hi, can someone share a recent inputs.conf and outputs.conf example for SSL encryption? I am having some trouble securing the connection between the forwarder and the indexer.
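A minimal sketch of the two files, assuming certificates already exist under $SPLUNK_HOME/etc/auth and port 9997 (paths and passwords are placeholders, not a drop-in config):

```ini
# outputs.conf on the forwarder — a sketch
[tcpout:ssl_group]
server = indexer.example.com:9997
sslCertPath = $SPLUNK_HOME/etc/auth/client.pem
sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
sslPassword = changeme
sslVerifyServerCert = false

# inputs.conf on the indexer — a sketch
[splunktcp-ssl:9997]
disabled = 0

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/server.pem
sslPassword = changeme
requireClientCert = false
```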

Deploy server.conf via deployment server and point to individual SSL certs

Hi all, I am a Splunk noob. I have created individual server.pem files, SHA-256 compliant, from my Windows CA. My deployment server and clients are mostly Windows, and the clients (servers with the universal forwarder installed) were installed individually. My environment is on 6.5.3. I am trying to push a hardened server.conf, with sslConfig pointing to each individual server's .pem file, from the deployment server. Is there any way I can do that, e.g. in deploymentserver\etc\deployment-apps\appname\local\server.conf? Can I put the server.conf file together this way?

[general]
serverName = server1

[sslConfig]
allowSslCompression = false
sslVersions = tls1.1, tls1.2
requireClientCert = false
enableSplunkdSSL = true
sslKeysfile = server1.pem
sslKeysfilePassword = password
caPath = $SPLUNK_HOME\etc\auth
sslPassword = password

[general]
serverName = server2

[sslConfig]
allowSslCompression = false
sslVersions = tls1.1, tls1.2
requireClientCert = false
enableSplunkdSSL = true
sslKeysfile = server2.pem
sslKeysfilePassword = password
caPath = $SPLUNK_HOME\etc\auth
sslPassword = password
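Worth noting: a single server.conf cannot carry settings for two different hosts, since the second [general]/[sslConfig] pair would simply override the first. One commonly used pattern (a sketch with hypothetical app and host names) is one small deployment app per server, each carrying only that server's server.conf, mapped via serverclass.conf:

```ini
# serverclass.conf on the deployment server — a sketch:
# each host gets its own app containing only its server.conf
[serverClass:server1_ssl]
whitelist.0 = server1

[serverClass:server1_ssl:app:ssl_server1]
restartSplunkd = true

[serverClass:server2_ssl]
whitelist.0 = server2

[serverClass:server2_ssl:app:ssl_server2]
restartSplunkd = true
```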

6.6.2 universal forwarder on Windows - Splunk/Windows compatibility?

I am trying to install the universal forwarder on a Windows 2008 R1 server. Since there are potentially other splunkd services running, I have to use a scripted process that unzips a pre-installed copy to a folder, pokes the registry, and then runs splunk clone-prep-clear-config to clear the cloning. When I do this I get an error indicating that the version of Splunk is not compatible with the version of Windows. I think this is x64, and that is what I am using. Is there a compatibility issue between 6.6.2 and Windows 2008 R1?

Universal forwarder Windows installation (x86 and x64) fails when being pushed by GPO; missing next button?

Testing this out on two separate machines in our environment, as we need to get Splunk up and running on all servers by this Friday. The installations process just fine when done manually, and uninstall without errors as well, but when we try to push them via GPO (auto-install or assign via Computer Policy; publish, assign, or auto-install via User Policy), the application either fails to show up completely, or shows up in Network Applications but fails to install because the Next button is just... not there (screenshot attached). I've gone through nearly every option on the AD side, including recreating the policies wholesale and re-downloading the .MSI files, and confirmed this occurs on multiple servers within differing layers of GPO application, including some with nearly zero other policies applied. Would it be worth a shot to modify/transform the MSI, or to write a script to have a GPO run the installers silently? Or does this potentially point to a larger issue in my environment between GP and these specific packages? We also have PDQ available to deploy these if necessary, but I wasn't able to find an EXE version of the installer, just ZIP files. Environment: Windows Server 2008 R2 DCs, with 2008 non-R2 and Windows 7 SP1 member servers that we're trying to install the forwarders on. Splunk Enterprise is installed and working, but potentially not fully configured. Thanks in advance! ![alt text][1] [1]: https://i.imgur.com/ykOzb63.png
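If GPO keeps misbehaving, the MSI can be driven silently from a script (e.g. via PDQ or a machine startup script) instead. A sketch with placeholder file name and deployment server; AGREETOLICENSE and DEPLOYMENT_SERVER are documented Splunk UF MSI properties:

```shell
msiexec.exe /i splunkuniversalforwarder-x64.msi ^
  AGREETOLICENSE=Yes ^
  DEPLOYMENT_SERVER="deploy.example.com:8089" ^
  /quiet /l*v uf_install.log
```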

WARN message when configuring universal forwarder to send data to Splunk Cloud free trial

I already configured my Splunk universal forwarder to send data to my Splunk Cloud trial and I am getting this error:

10-24-2017 21:22:27.533 -0500 WARN TcpOutputProc - Tcpout Processor: The TCP output processor has paused the data flow. Forwarding to output group default-autolb-group has been blocked for 800 seconds. This will probably stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.

Does anybody know what I am doing wrong?
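That warning usually means the receiver is unreachable or not accepting data rather than a local parsing problem. Some checks worth running on the forwarder (a sketch; the cloud host name is a placeholder from your Splunk Cloud credentials package):

```shell
# Show the configured forwarding targets and whether they are active
$SPLUNK_HOME/bin/splunk list forward-server

# Dump the effective outputs.conf the forwarder is actually using
$SPLUNK_HOME/bin/splunk btool outputs list --debug

# Verify TCP/TLS reachability of the cloud receiver
openssl s_client -connect inputs.yourstack.splunkcloud.com:9997
```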

Splunk universal forwarder upgrade from 4.3.x to 7.0

Hi, I'm looking for information about upgrading UFs from version 4.3.x to 7.0. I checked the Splunk docs (Forwarder Manual), but there are no version dependency requirements for the upgrade. So the question: is a one-step upgrade from 4.3.x to 7.0 supported (and safe)? I conjecture that it is; can someone confirm? Regards, István

Splunk Universal Forwarder requirements on Windows 8 r1 x86

I am trying to install the 6.6.2 version of the universal forwarder and I am getting an error indicating that the minimum requirements have not been met. What are the minimum requirements (I can't seem to find them)? Should I be running an earlier version?

JSON parsing error in the universal forwarder

Hi, I'm getting errors with the parsing of JSON files in the universal forwarder. I'm generating JSON outputs; a new file is generated every time I run a routine. The output contains the following:

[
  {
    "datetime":"2017-10-25 14:33:16+01:00",
    "user":"",
    "category":"ST",
    "type":"ABC",
    "frontend":"3.0",
    "backend":"",
    "r_version":"",
    "b_version":"",
    "status":"R",
    "next_planned_r_version":"",
    "next_planned_b_version":"",
    "comment":""
  }
]

The Splunk forwarder gives me the following log entries in *splunkd.log*:

10-25-2017 14:33:16.273 +0100 ERROR JsonLineBreaker - JSON StreamId:16742053991537090041 had parsing error:Unexpected character: ':' - data_source="/root/status-update/environment_health_status_50.json", data_host="hostxyz", data_sourcetype="_json"

The line above repeats about as many times as there are lines with ":" in the output, followed by:

10-25-2017 14:33:16.273 +0100 ERROR JsonLineBreaker - JSON StreamId:16742053991537090041 had parsing error:Unexpected character: '}' - data_source="/root/status-update/environment_health_status_50.json", data_host="hostxyz", data_sourcetype="_json"
10-25-2017 14:33:16.273 +0100 ERROR JsonLineBreaker - JSON StreamId:16742053991537090041 had parsing error:Unexpected character: ']' - data_source="/root/status-update/environment_health_status_50.json", data_host="hostxyz", data_sourcetype="_json"

I've tried universal forwarder versions 7.0 and 6.5.3. I've been trying to isolate the root cause but have had no luck, even without changing anything: sometimes it goes fine, but mostly it doesn't. If I stop Splunk, erase the fishbucket, and start it again, it will ingest all existing files just fine. However, when I then run my test that creates new files, it fails (or not, as I explained). The monitor stanza in inputs.conf:

[monitor:///root/status-update/environment_health_status_*.json]
index=dev_test
sourcetype=_json

The _json stanza on the forwarder, via btool (*PS:* I haven't made any config in props.conf, only inputs):

[_json]
ANNOTATE_PUNCT = True
AUTO_KV_JSON = true
BREAK_ONLY_BEFORE =
BREAK_ONLY_BEFORE_DATE = True
CHARSET = UTF-8
DATETIME_CONFIG = /etc/datetime.xml
HEADER_MODE =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LEARN_MODEL = true
LEARN_SOURCETYPE = true
LINE_BREAKER_LOOKBEHIND = 100
MATCH_LIMIT = 100000
MAX_DAYS_AGO = 2000
MAX_DAYS_HENCE = 2
MAX_DIFF_SECS_AGO = 3600
MAX_DIFF_SECS_HENCE = 604800
MAX_EVENTS = 256
MAX_TIMESTAMP_LOOKAHEAD = 128
MUST_BREAK_AFTER =
MUST_NOT_BREAK_AFTER =
MUST_NOT_BREAK_BEFORE =
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
SHOULD_LINEMERGE = True
TRANSFORMS =
TRUNCATE = 10000
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
detect_trailing_nulls = false
maxDist = 100
priority =
pulldown_type = true
sourcetype =
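For context: the default _json sourcetype uses INDEXED_EXTRACTIONS = json, which is parsed on the forwarder itself and expects one JSON object per event, so a pretty-printed array wrapped in [ ... ] can trip JsonLineBreaker. One possible workaround (a sketch with a hypothetical sourcetype name my_json, avoiding indexed extractions) is to break events at the object boundaries on the indexer and extract fields at search time; the regexes and time format here are assumptions tied to the sample above:

```ini
# props.conf on the indexer — hypothetical my_json sourcetype:
# break before each object (which starts with "datetime"), consuming
# the surrounding [ , ] punctuation; KV_MODE=json extracts fields at
# search time instead of on the forwarder
[my_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n,\[\]\s]+)(?=\{\s*"datetime")
TRUNCATE = 0
KV_MODE = json
TIME_PREFIX = "datetime":"
TIME_FORMAT = %Y-%m-%d %H:%M:%S%:z
```

The monitor stanza on the forwarder would then use sourcetype=my_json instead of _json.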

Questions about various steps for network device integration with Splunk

Hello Splunk experts, I'm working on integrating networking devices with Splunk. I'm considering using a one-box universal forwarder to receive the application deployment from the Splunk server. Here are the steps I have in mind, and I have a question for every step; I'd appreciate you sharing your experience/insights with me.
1. Install a Splunk app (GUI and backend scripts/libs) on the Splunk server and, from wizard input by the user, generate configuration files for the Splunk forwarders and backend scripts. QUESTION: Based on user input, individual endpoints may have completely different configs. Is there any example or app I can reference to achieve that?
2. Copy what I need for the forwarder from apps/ to deployment-apps/ so everything is ready on the selected forwarders. QUESTION: Can I add a link or button on the application's frontend GUI to trigger this backend operation?
3. The forwarder monitors the data output and sends it to the Splunk server, but I need it to trigger the backend scripts first to start collecting the data. QUESTION: How can I start or schedule an external script at forwarder boot?
Best regards, Yanyu
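For step 3, a scripted input in the deployment app's inputs.conf can launch a script on an interval, or once at startup. A sketch in which the app name, script path, index, and sourcetype are all hypothetical:

```ini
# inputs.conf inside the deployment app; collect.sh is a hypothetical
# script shipped in the app's bin/ directory, run every 300 seconds.
# An interval of -1 would instead run it once when splunkd starts.
[script://$SPLUNK_HOME/etc/apps/netdevice_app/bin/collect.sh]
interval = 300
index = network
sourcetype = device_stats
disabled = false
```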

Is it possible to combine these two search results to create 1 alert?

I have two very different search queries that I am having a hard time combining into one search.

Search 1 yields results if the indexer hasn't received any data from the server's universal forwarder in over 5 minutes:

| metadata type=hosts index=* | search host=WinServer1 | where now()-lastTime>=300 | table host lastTime | eval lastTime=strftime(lastTime, "%c")

Search 2 ingests the Windows Update log (C:\Windows\WindowsUpdate.log) and searches for the entry "AU initiates service shutdown", which is generated when the server is shut down gracefully:

host=WinServer1 "AU initiates service shutdown"

The purpose of combining these searches is to create two alerts: one indicating the server has been shut down gracefully, and another if the server has experienced a hard shutdown. For example, for a graceful shutdown the search terms would be combined as Search 1 AND Search 2; for a hard shutdown, Search 1 NOT Search 2. I am unable to find the right way to use boolean operators to combine these two searches, and I am not sure it is even possible, considering they are looking for very different data. Any help is greatly appreciated.
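One way to sketch the hard-shutdown case is to append the second search as a subsearch and test both conditions with stats; the 15-minute window and field names here are assumptions, not a tested alert:

```
| metadata type=hosts index=*
| search host=WinServer1
| where now() - lastTime >= 300
| eval down=1
| append
    [ search host=WinServer1 "AU initiates service shutdown" earliest=-15m
    | stats count AS graceful ]
| stats max(down) AS down, max(graceful) AS graceful
| where down=1 AND coalesce(graceful, 0)=0
```

Flipping the final condition to graceful>0 would give the graceful-shutdown alert from the same skeleton.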

Universal Forwarder Disk Usage

Hi fellow Splunkers, I need some help here. What is the minimum disk space required when installing a universal forwarder? Or is there an ideal disk space for a universal forwarder? I just want to make sure the forwarder itself doesn't use much disk space once installed. Thanks!

Universal forwarder deprecated for Windows 2008 on Splunk 7.0.0?

We are in the process of rolling Splunk out to production very soon, and we are going with Splunk Enterprise 6.6.3, as we stood up some of the infrastructure before the 7.0 release. Looking at the deprecated features, support for Windows 2008 is removed. What is the alternative? Our process for deploying to endpoints has 45-day lead times, so a January deployment had a mid-October deadline for submission. This is why we went with the 6.6.2 UF. From the 7.0.0 deprecation notes:
• Windows 8 x86_64: Enterprise and Free/Trial support is removed. Universal Forwarder support is deprecated and might be removed entirely in a future release.
• Windows 8 x86_32: All support (Enterprise, Free/Trial, and Universal Forwarder) is removed.
• Windows Server 2008 R2 SP1 x86_64: Enterprise and Free/Trial support is removed. Universal Forwarder support is deprecated and might be removed entirely in a future release.

How to skip header in CSV files before indexing?

My input files are in the following format (CSV):

Icon Statistics
Time;26.10.2017 00:00 - 27.10.2017 04:40
Service;Servicename
Statistic;Report_servicename
Date;Time;IncomingRequest;InternalSystemDBError;InternalSystemDataError;InternalSystemErrorOther;OK;SDUPTimeout;SDUPError;InvalidIncomingRequest;counter8;counter9;counter10;counter11;counter12;counter13;counter14;counter15;counter16;counter17;counter18;counter19
26.10.2017;00:00;4;0;0;0;4;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
26.10.2017;00:10;2;0;0;0;2;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
26.10.2017;00:20;5;0;0;0;5;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
Total;;1,234;0;0;0;1,224;0;10;0;0;0;0;0;0;0;0;0;0;0;0;0

Before indexing these files, the "header" should be removed. I configured the Splunk universal forwarder to monitor these files as follows:

[monitor:///opt/ect/data/sdp/mail/statistics/*SDUP*.csv]
index=csdp_prod_stats
source=statistics
sourcetype=csv
crcSalt =
ignoreOlderThan=14d

On the main Splunk instance, I configured props.conf:

[csv]
TRANSFORMS-eliminate_header = eliminate_header
INDEXED_EXTRACTIONS = CSV
FIELD_DELIMITER = ;
TIMESTAMP_FIELDS = Date,Time
HEADER_FIELD_LINE_NUMBER = 7

And transforms.conf as follows:

[eliminate_header]
REGEX = ^(?:Icon|Time|Service|Statistic|Total)
DEST_KEY = queue
FORMAT = nullQueue

When I search in Splunk, it seems the removal of the header is not working; the complete file is being indexed. What am I doing wrong? I also want to use the column names in the CSV as field names in Splunk, taken from the line I did not remove from the file. Is "HEADER_FIELD_LINE_NUMBER = 7" (as seen above in props.conf) the correct way of specifying this automatic extraction of fields in Splunk? Thank you in advance!
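One caveat worth knowing: with INDEXED_EXTRACTIONS, structured parsing happens on the universal forwarder, so those props.conf settings generally need to live on the forwarder rather than on the indexer, and index-time TRANSFORMS routing to nullQueue does not apply to data the forwarder has already parsed. A forwarder-side sketch, assuming the column header ("Date;Time;...") is the 5th physical line of the file (adjust to the real file; blank lines count), with the Total row still needing separate handling:

```ini
# props.conf on the *universal forwarder* — structured (indexed)
# extractions for monitored CSVs are performed at the forwarder
[csv]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ;
HEADER_FIELD_LINE_NUMBER = 5
TIMESTAMP_FIELDS = Date,Time
TIME_FORMAT = %d.%m.%Y %H:%M
```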

