Channel: Questions in topic: "universal-forwarder"

Can you index a certain sourcetype and forward the remaining?

Hi, I am new to Splunk and am trying to forward a specific sourcetype of data out. That part is successful, but now I am having trouble with the next part: indexing the remaining sourcetypes. I am using a Windows Universal Forwarder to forward all logs to a Splunk Enterprise instance. I want to index the Perfmon logs but forward the Security and Application logs to a third-party destination. How can I achieve this? So far all the documentation seems to indicate using selective indexing, but the information there suggests setting the entire log to be either indexed, forwarded, or both, not just the specific sourcetype.
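One way to get per-input routing on a UF is _TCP_ROUTING in inputs.conf combined with multiple tcpout groups in outputs.conf. A minimal sketch, with hypothetical group names, host names, and ports, and assuming the third-party destination accepts a raw TCP feed:

# outputs.conf on the UF -- group names and addresses are placeholders
[tcpout]
defaultGroup = splunk_indexers

[tcpout:splunk_indexers]
server = splunk-enterprise.example.com:9997

[tcpout:third_party]
server = third-party.example.com:514
sendCookedData = false

# inputs.conf on the UF -- route each input to the group it belongs to
[perfmon://Processor]
counters = *
_TCP_ROUTING = splunk_indexers

[WinEventLog://Security]
_TCP_ROUTING = third_party

[WinEventLog://Application]
_TCP_ROUTING = third_party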

Universal Forwarder Credentials

What is the correct way to upgrade the credentials on a universal forwarder? Ours will expire soon. When I run splunk install app -update, I get the following error: Cannot perform action "POST" without a target name to act on.
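For reference, the install command needs the path to the credentials package as its target; that error appears when the package argument is missing. A minimal sketch of the usual form, assuming the new credentials package sits at /tmp/splunkclouduf.spl (a placeholder path) and the command is run from $SPLUNK_HOME/bin on the forwarder:

# Package path and admin credentials below are placeholders
./splunk install app /tmp/splunkclouduf.spl -update 1 -auth admin:changeme
./splunk restart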

Replaced old index with new one in inputs.conf

I have changed the index name for a log ingestion to a new one, but the logs are still being ingested into the old index. I cannot understand why the logs are not going into the new index. Please let me know if anyone has any ideas. Thanks.
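One way to check which inputs.conf copy actually wins is btool on the forwarder, followed by a restart so the change takes effect; a minimal sketch:

# Show effective monitor stanzas and the file each setting comes from
$SPLUNK_HOME/bin/splunk btool inputs list --debug

# Restart the forwarder after editing inputs.conf so the new index applies
$SPLUNK_HOME/bin/splunk restart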

DNS Server NOT Forwarding Windows Security Events

One of our DNS servers running a universal forwarder suddenly stopped sending Windows event logs to our indexers. DNS events are still being forwarded.

Universal Forwarder DNS resolution

Good day to all. Since I didn't find any search results on this topic: does the UF do any DNS resolution for the events (Windows or otherwise) that it reads? I believe it does not, but I would like a second opinion. Thanks!

Preferred distro for UF & Syslog-NG instance

We have a requirement to run a Universal Forwarder that will act as an intermediate forwarder for our domain controllers and will also run syslog-ng to receive logs from our firewalls before sending them up to Splunk Cloud. We are looking to run this on Linux. The following documentation lists the supported Linux kernel versions: https://docs.splunk.com/Documentation/Splunk/7.3.2/Installation/Systemrequirements#Supported_OSes I'm assuming from that document that Splunk only supports distros with a 2.6 kernel. I've seen a few posts/articles stating that people are running RHEL 6.x. Is anyone running a more recent version? Are people running other distributions? Thanks in advance.

Set up log-to-metrics from Universal Forwarder to Splunk Enterprise

I've followed the docs for setting up log-to-metrics but I haven't been able to get it to work as intended. I have a CSV file being monitored by a universal forwarder that then gets sent to Splunk Enterprise. I want every value in the CSV (except for the date and time) to be saved as a metric in Splunk, with a metric_name matching the field name and a _value from the CSV file. According to the documentation on https://docs.splunk.com/Documentation/Splunk/7.3.2/Metrics/L2MConfiguration, for structured data like a CSV, the configuration files should be located on the forwarder, which is what I have done.

Here is what I have in SplunkForwarder/etc/apps/search/local/props.conf:

[csv-logtometrics]
FIELD_DELIMITER = ,
FIELD_NAMES = Date,Time,Field1,Field2,Field3
INDEXED_EXTRACTIONS = csv
METRIC-SCHEMA-TRANSFORMS = metric-schema:csv-logtometrics

In SplunkForwarder/etc/apps/search/local/transforms.conf:

[metric-schema:csv-logtometrics]
METRIC-SCHEMA-MEASURES = _ALLNUMS_

And in SplunkForwarder/etc/apps/search/local/inputs.conf:

[monitor:///path/to/stats.csv]
sourcetype = csv-logtometrics
disabled = false

With this config, when I search in Splunk I can get results for metric_name and _value, but they are only for the first CSV column (Field1 in this case). How do I get values for the other CSV columns to show up as metrics as well? My understanding was that using _ALLNUMS_ should cause each individual field in the CSV to be read as a metric, but it appears that it is only applying to the first field. I also haven't figured out how to get these searchable results into a metrics index, rather than just being searchable like a normal event log. I tried creating a matching metrics sourcetype on the Splunk Enterprise end, but that didn't seem to work. I get no results when running | mcatalog values(metric_name)
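The part about landing the results in a metrics index usually comes down to pointing the input at an index whose datatype is metric. A minimal sketch, assuming a hypothetical metrics index named csv_metrics:

# indexes.conf on the Splunk Enterprise instance -- index name is a placeholder
[csv_metrics]
datatype = metric
homePath = $SPLUNK_DB/csv_metrics/db
coldPath = $SPLUNK_DB/csv_metrics/colddb
thawedPath = $SPLUNK_DB/csv_metrics/thaweddb

# inputs.conf on the forwarder -- send the monitored CSV to that metrics index
[monitor:///path/to/stats.csv]
sourcetype = csv-logtometrics
index = csv_metrics
disabled = false

With that in place, a search such as | mcatalog values(metric_name) WHERE index=csv_metrics should show the converted metrics, assuming the log-to-metrics transform itself is emitting them.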

How to fetch Windows Services details using Splunk App For Infrastructure?

Dear Splunkers, I have the Splunk App for Infrastructure installed on Splunk Cloud and have already onboarded Windows hosts using the easy install script, but nowhere can I see Services data to perform real-time monitoring of services.msc. Could you please guide me here?

How to configure Splunk to read a csv file from a universal forwarder?

Hi, I have one CSV file at location /apps/data_splunk/.csv and this CSV file has data like below:

JAN-18 | 31-JAN-2018 | -1 | 1 | 31-JAN-18 | 01-FEB-18 | 727
JAN-18 | 01-FEB-2018 | 1 | 1 | 01-FEB-18 | 02-FEB-18 | 751
JAN-18 | 02-FEB-2018 | 2 | 1 | 02-FEB-18 | 02-FEB-18 | 342
JAN-18 | 06-FEB-2018 | 4 | 1 | 06-FEB-18 | 06-FEB-18 | 323

I want to forward this data to my Splunk instance. Here is what I have done, but it's not working. These are set up on the Splunk UF server.

inputs.conf:

[monitor://data_splunk/.csv]
disabled = false
index = _idx2
sourcetype = mycsvfileData

props.conf:

[mycsvfileData]
INDEXED_EXTRACTIONS = csv
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true
KV_MODE = none
category = Structured
FIELD_DELIMITER = |

Please let me know what I am doing wrong, and please suggest a better way. Thanks in advance.
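One gap worth noting: the sample rows have no header line, and INDEXED_EXTRACTIONS needs FIELD_NAMES when the file itself does not supply headers. A hedged sketch of the props.conf on the UF; the field names and timestamp settings below are invented placeholders:

[mycsvfileData]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = |
# Hypothetical field names -- replace with whatever the columns actually mean
FIELD_NAMES = month,report_date,day_offset,flag,start_date,end_date,count
TIMESTAMP_FIELDS = report_date
TIME_FORMAT = %d-%b-%Y
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true
category = Structured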

FormatMessage was unable to decode error (193), (0xc1)

10-07-2019 13:33:23.696 -0700 ERROR ExecProcessor - Couldn't start command ""C:\Program Files\SplunkUniversalForwarder\etc\apps\test\bin\abc.ps1"": FormatMessage was unable to decode error (193), (0xc1)
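Windows error 193 generally means the OS was asked to execute something that is not a valid Win32 binary, which is what happens when a .ps1 file is launched directly as a command. One documented alternative is the built-in PowerShell input stanza; a minimal sketch, with a hypothetical stanza name, schedule, and sourcetype:

# inputs.conf on the Windows UF
[powershell://test_abc]
script = . "$SplunkHome\etc\apps\test\bin\abc.ps1"
schedule = */5 * * * *
sourcetype = Powershell:abc
disabled = 0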

Intermediate Forwarder Not Sending Data

I have a UF sending to a UF sending to Splunk. The intermediate UF is sending data, but only from that host; the first UF's data is not getting to Splunk.

Intermediate UF IP: 10.0.1.18
Splunk IP: 10.0.1.65

Here is the conf file info.

First UF, inputs.conf:

[default]
host = SP-DB

First UF, outputs.conf:

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = 10.0.1.18:9997

[tcpout-server:// 10.0.1.18:9997]

Intermediate UF, inputs.conf:

[default]
host = SPLUNK2

[splunktcp://:9997]
compressed = true
disabled = 0

Intermediate UF, outputs.conf:

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = 10.0.1.65:9001

[tcpout-server://10.0.1.65:9001]
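A quick way to see whether the first UF ever reaches the intermediate one is to check its forward-server status and its splunkd.log; a minimal sketch, assuming a default install path:

# On the first UF: active vs. configured forward-servers
$SPLUNK_HOME/bin/splunk list forward-server

# Look for connection errors toward 10.0.1.18:9997
grep -i "TcpOutputProc" $SPLUNK_HOME/var/log/splunk/splunkd.log | tail -20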

Does Splunk ingest files that existed before the remote folder monitor was created?

I have a client server with a universal forwarder configured to forward data to an index server. On the client server, I have a folder "X" full of CSV files. If I create a remote folder monitor for the client server's folder "X" on my deployment server and deploy it to the client server, will Splunk process the CSV files that are already there, or will Splunk not do anything until the folder contents change?
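For reference, a monitor input reads files that already exist in the directory when it is first deployed; ignoreOlderThan is the optional setting that skips files whose modification time falls outside a window. A minimal sketch of such a deployed input, with a placeholder path and index:

# inputs.conf pushed from the deployment server -- path and index are placeholders
[monitor:///path/to/X]
whitelist = \.csv$
index = main
# Optional: skip files not modified within the last 30 days
ignoreOlderThan = 30d
disabled = 0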

Receiving error after restarting docker-splunk, proceeds to add forward-server

Hi, I am setting up a Splunk universal forwarder by pulling the universalforwarder Docker image from Docker Hub, and as part of the docker run command I also add a forward-server like below:

docker run -e SPLUNK_START_ARGS="--accept-license --answer-yes --no-prompt" -e SPLUNK_ADD="monitor , forward-server $INDEXER:$PORT" splunk/universalforwarder:latest

This works fine the first time. However, if I restart the Docker container, it tries to add the forward-server again and throws the exception `"TCPOut - forwarded-server already present"` in splunkd.log. I tried -e "SPLUNK_FORWARD_SERVER" but it didn't add the forward-server at all. This seems to me like an idempotency issue, where splunk-ansible should not have tried to add the forward-server again when it already exists. Any help or thoughts appreciated. Thanks, Chinmaya

Forwarder Resend Data After Connect To Indexer

Hi Splunkers, I have a forwarder that was targeting an incorrect indexer, and it paused sending data for 3700 seconds. Now I have configured the correct indexer URI; how can I make the forwarder start sending data to the indexer again?
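A minimal sketch of the usual sequence: fix the target in outputs.conf, then restart the forwarder so it reconnects and resumes reading where it left off (the address below is a placeholder):

# outputs.conf on the forwarder
[tcpout:default-autolb-group]
server = indexer.example.com:9997

# Restart so the forwarder reconnects to the corrected target
$SPLUNK_HOME/bin/splunk restart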

Recommended way to ingest files from remote server into clustered indexers?

We have a clustered search head and indexer environment with 16 indexers and a deployment server. On a remote Windows server we have a PowerShell script that runs a Microsoft API call every hour to pull alerts from Azure and then dumps the output into a .csv file on that Windows server. This server is not running a UF. I'm not seeing any of the four or five Azure add-ons pulling the Azure AD-related alerts, so I would like assistance in pulling those .csv files into an index on Splunk. Is the best way to get the files from the remote Windows server a UF that is set to monitor the .csv files in the directory? Thx
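If installing a UF on that Windows server is acceptable, a minimal sketch of the two pieces involved; the directory, sourcetype, index name, and indexer addresses below are placeholders:

# inputs.conf on the Windows UF
[monitor://D:\azure_alerts]
whitelist = \.csv$
sourcetype = azure:alerts:csv
index = azure_alerts
disabled = 0

# outputs.conf on the Windows UF -- list the clustered indexers so the UF
# load-balances across them
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997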

SAML cert db registration with KVStore failed

After upgrading from 7.1.2 to 7.3.2, I am seeing the error below: INFO loader - SAML cert db registration with KVStore failed

After upgrade Splunk Universal Forwarder is not sending logs to Indexer tier

After upgrading the universal forwarder from 7.1.2 to 7.3.1, the universal forwarder stopped sending logs to Splunk.

How to configure outputs.conf to forward data in a fail-over method

We have HF1 and HF2, located in DC1 and DC2 respectively. How can we configure outputs.conf in the following way?

- All servers in DC1 should forward data to HF1 primarily, and only send data to HF2 in case of a failover (i.e., HF1 goes down).
- All servers in DC2 should forward data to HF2 primarily, and only send data to HF1 in case of a failover (i.e., HF2 goes down).

Please note that we are not looking for the autoLB feature, just a failover kind of configuration in outputs.conf.
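For reference, the closest built-in behavior is a single tcpout group containing both heavy forwarders: traffic keeps flowing to whichever member is still reachable when the other goes down, though this is load-balanced rather than the strict primary/secondary ordering asked for here. A sketch for a DC1 server, with placeholder host names and port:

# outputs.conf on a DC1 server -- autoLB across both HFs, so HF2 picks up
# the traffic automatically if HF1 becomes unreachable
[tcpout]
defaultGroup = dc_heavy_forwarders

[tcpout:dc_heavy_forwarders]
server = hf1.dc1.example.com:9997, hf2.dc2.example.com:9997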

Universal Forwarder inputs.conf perfmon stanza: Why are counters with "-" in their name not selectable?

## Initial case (working)

On a UF, add the following to an inputs.conf (depending on whether you are using an app, creating a local conf, the default one, etc.):

[perfmon://<any performance monitoring input>]
counters = *
<other parameters tested and correctly set>

=> All counters are forwarded.

## Objective

I want to select a list of counters to limit my Splunk license usage, and to forward and index only the required ones.

## Failed cases

Counter selection fails when the counter names are of the form <1st word - 2nd word - etc.>; worse, ***no counters are forwarded and/or indexed*** at all. Each of the following was tried:

counters = *
#Host Queue - Instance State Msg Refs - Length;Host Queue - Length;Host Queue - Number of Instances;Host Queue - Suspended Msgs - Length

OR

counters = Host Queue - Instance State Msg Refs - Length;Host Queue - Length;Host Queue - Number of Instances;Host Queue - Suspended Msgs - Length

OR

counters = Host Queue \\- Instance State Msg Refs \\- Length;Host Queue \\- Length;Host Queue \\- Number of Instances;Host Queue \\- Suspended Msgs \\- Length

OR

counters = "Host Queue \\- Instance State Msg Refs \\- Length";"Host Queue \\- Length";"Host Queue \\- Number of Instances";"Host Queue \\- Suspended Msgs \\- Length"

OR

counters = "Host Queue - Instance State Msg Refs - Length";"Host Queue - Length";"Host Queue - Number of Instances";"Host Queue - Suspended Msgs - Length"

OR

counters = \"Host Queue \\- Instance State Msg Refs \\- Length\";\"Host Queue \\- Length\";\"Host Queue \\- Number of Instances\";\"Host Queue \\- Suspended Msgs \\- Length\"

=> No counters are forwarded.

