logshark's Introduction

LogShark

Community Supported

LogShark is a tool for analyzing and troubleshooting Tableau Server and Tableau Desktop. LogShark extracts data from log files and builds workbooks that can help you identify and understand error conditions, performance bottlenecks, and background activity. LogShark works by running a set of targeted plugins that pull specific data out of the log files, build data sources, and generate Tableau workbooks you can use for analysis.

LogShark can help you:

  • Troubleshoot issues that are recorded in the logs.
  • Analyze system metrics from log data.
  • Solve problems in Tableau without exposing sensitive corporate information.
  • Validate Tableau Server application behavior against historical data when taking a new build or making a system change.

See the releases page for a full list of updates.

Sample Apache Workbook Screenshot

Getting Started

There are 3 ways you can use LogShark.

Self-Contained Application

Download and unzip the precompiled self-contained application from the following links:

Download LogShark for Win

Setup LogShark

Note that LogShark is configured by the LogSharkConfig.json file in the Config directory. If you are replacing an existing copy of LogShark, be mindful of any changes made to this configuration file.
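
If you want to keep your settings when overwriting an existing install, one simple approach (a sketch; the backup location is arbitrary) is to copy the file aside first and re-apply your changes afterwards:

copy Config\LogSharkConfig.json %USERPROFILE%\LogSharkConfig.backup.json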

Compile It Yourself

LogShark is a .NET Core 3.1 application. To compile it yourself:

  1. Make sure you have .NET Core 3.1 SDK installed
  2. Clone or download the repository
  3. Run the following command from the directory containing the LogShark.sln file. Make sure to replace <insert_version> with the actual version, e.g. 4.2.1:

Windows

dotnet publish LogShark -c Release -r win-x64 /p:Version=<insert_version> --self-contained true 

Linux

dotnet publish LogShark -c Release -r linux-x64 /p:Version=<insert_version> --self-contained true 
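
For example, building version 4.2.1 (the version mentioned above) as a self-contained Windows executable would look like this:

dotnet publish LogShark -c Release -r win-x64 /p:Version=4.2.1 --self-contained true

The published output typically lands under LogShark/bin/Release/netcoreapp3.1/win-x64/publish/.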

Build and Run It Using Docker

Below are instructions on how to build and run LogShark using Docker on your own machine.

To build Docker image

  1. Install Docker Desktop (if you don’t have it already)
  2. Clone or download LogShark source code from this repository
  3. Build the LogShark container image by running the following command from the directory containing the LogShark.sln file:
docker build -f LogShark/Dockerfile -t logshark .
  • Note the . at the end of the command - it is required
  • The -t parameter specifies the Docker image name and tag. Use whatever makes sense for your environment, or leave it as logshark (see the example below).
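
For instance, with a namespaced image name and an explicit version tag (both hypothetical values), the same build command might look like:

docker build -f LogShark/Dockerfile -t mycompany/logshark:4.2 .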

To process a log set

Use the docker run command to run LogShark in a container. For example:

docker run -v ~/TestLogSets/logs_clean_tsm.zip:/app/logs.zip -v ~/TestLogSets/LogSharkDocker/Output:/app/Output -v ~/TestLogSets/LogSharkDocker/ProdConfig.json:/app/Config/LogSharkConfig.json logshark:latest logs.zip --plugins "Apache;Config" -p

Let’s break down this command part by part:

  • The -v parameter maps a file or directory on the local machine to a file/directory within the container. This way LogShark inside the container can read files from the local machine (the log set to process and the config) and save output so it is available even after the container is done and destroyed.
    • -v ~/TestLogSets/logs_clean_tsm.zip:/app/logs.zip maps ~/TestLogSets/logs_clean_tsm.zip file on host OS to /app/logs.zip file in container.
    • -v ~/TestLogSets/LogSharkDocker/Output:/app/Output maps Output directory of LogShark within container to a directory on host OS.
    • -v ~/TestLogSets/LogSharkDocker/ProdConfig.json:/app/Config/LogSharkConfig.json maps a JSON config on the host OS to the default LogShark configuration file within the container (and hides the original file included with LogShark). This way there is no need to explicitly tell LogShark which config file to use.
  • logshark:latest is the name/tag of the container image to use. If you used a different name/tag while building the Docker image, use it here.
  • logs.zip --plugins "Apache;Config" -p: the rest of the line is passed directly to LogShark as arguments. See the full list of available arguments here.
    • The first parameter is the name of the log set to process, and it is required. It points to the file we mapped with the first -v statement.

OS Compatibility

  • The instructions above are for macOS and Linux host OSes running Linux containers.
  • Windows host OS running Linux containers: The same instructions apply, but the path format for mapping local files and directories is different when running the container. For example, -v ~/TestLogSets/logs_clean_tsm.zip:/app/logs.zip becomes -v C:\tmp\logs_clean_tsm.zip:/app/logs.zip (a full example follows this list).
  • Windows host OS running Windows containers: LogShark can run on Windows so this should be doable, but we do not use/test/support this scenario currently.
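
Putting that together, a full invocation on a Windows host running Linux containers might look like the following sketch (the C:\tmp paths are hypothetical; point them at wherever your log set, output directory, and config actually live):

docker run -v C:\tmp\logs_clean_tsm.zip:/app/logs.zip -v C:\tmp\Output:/app/Output -v C:\tmp\ProdConfig.json:/app/Config/LogSharkConfig.json logshark:latest logs.zip --plugins "Apache;Config" -p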

Analysis

The best way to analyze results is to run LogShark on your own logset and explore the generated workbooks via Tableau. Beyond what is included, you can configure LogShark to output your own custom workbooks. See the installation guide for more details on how to do this.

Support

LogShark is released as a Community-Supported tool. It is intended to be a self-service tool and includes this user guide. Any bugs discovered should be filed in the LogShark Git issue tracker.

Contributing

Code contributions & improvements by the community are welcomed and encouraged! See the LICENSE file for current open-source licensing & use information. Before we can accept pull requests from contributors, we do require a Contributor License Agreement. See http://tableau.github.io for more details.

logshark's People

Contributors

benlower, bermetj, bjamankulova, d45, jdomingu, jmangue, pendulin, rahulmotwani, sammmmm, scoots, sijingshensalesforce, tableaushopp, xantrul, zhuoyanggao


logshark's Issues

Logshark run fails due to GuidRepresentation error in LogsetValidator

I've tried running Logshark on my computer's Tableau logs, but I am receiving the following exception:

The GuidRepresentation for the reader is CSharpLegacy, which requires the binary sub type to be UuidLegacy, not UuidStandard

According to the Logshark logs, this exception causes it to think Mongo contains invalid log data, and the run fails.

Logshark ResourceManager plugin fails when run against Tableau Server 10.4 logs

Hello

I have installed Logshark 2.0, and when I try to pull server logs to visualize them, it throws the following error:

2017-10-25 01:16:33,469 wsjcit1043566_17102508145683_srmlogszip (null) [1] ERROR Logshark.Core.Controller.Plugin.PluginExecutor - Encountered uncaught exception while executing plugin 'ResourceManager': No total memory limit found for { "_id" : "worker2/vizqlserver/Logs/backgrounder_2-1_2017_10_25_00_44_38.txt-572", "ts" : ISODate("2017-10-25T00:44:43.424Z"), "pid" : 601136, "tid" : "93f9c", "sev" : "info", "k" : "msg", "v" : "Resource Manager: listening on port 2233", "file_path" : "worker2\vizqlserver\Logs", "file" : "backgrounder_2-1_2017_10_25_00_44_38.txt", "worker" : 2, "line" : 572 }
2017-10-25 01:16:33,471 wsjcit1043566_17102508145683_srmlogszip (null) [1] DEBUG Logshark.Core.Controller.Plugin.PluginExecutor - <log4net.Error>Exception during StringFormat: Input string was not in a correct format. System.Exception: No total memory limit found for { "_id" : "worker2/vizqlserver/Logs/backgrounder_2-1_2017_10_25_00_44_38.txt-572", "ts" : ISODate("2017-10-25T00:44:43.424Z"), "pid" : 601136, "tid" : "93f9c", "sev" : "info", "k" : "msg", "v" : "Resource Manager: listening on port 2233", "file_path" : "worker2\vizqlserver\Logs", "file" : "backgrounder_2-1_2017_10_25_00_44_38.txt", "worker" : 2, "line" : 572 }
at Logshark.Plugins.ResourceManager.Helpers.MongoQueryHelper.GetTotalMemoryLimit(BsonDocument srmStartEvent, IMongoCollection`1 collection)
at Logshark.Plugins.ResourceManager.Helpers.MongoQueryHelper.GetThreshold(BsonDocument srmStartEvent, IMongoCollection`1 collection)
at Logshark.Plugins.ResourceManager.ResourceManager.PersistThresholds(Int32 workerId, IMongoCollection`1 collection)
at Logshark.Plugins.ResourceManager.ResourceManager.ProcessSrmEvents()
at Logshark.Plugins.ResourceManager.ResourceManager.Execute(IPluginRequest pluginRequest)
at Logshark.Core.Controller.Plugin.PluginExecutor.ExecutePlugin(Type pluginType){}</log4net.Error>
2017-10-25 01:16:33,481 wsjcit1043566_17102508145683_srmlogszip (null) [1] ERROR Logshark.Core.Controller.Plugin.PluginExecutor - ResourceManager failed to execute successfully.
2017-10-25 01:16:33,482 wsjcit1043566_17102508145683_srmlogszip (null) [1] INFO Logshark.Core.Controller.Plugin.PluginExecutor - Initializing SearchServer plugin..
2017-10-25 01:16:33,484 wsjcit1043566_17102508145683_srmlogszip (null) [1] INFO Logshark.Core.Controller.Plugin.PluginExecutor - Execution of SearchServer plugin started at 1:16 AM..
Logshark.log.txt

Not able to successfully run the Logshark

Hi,

I am unable to run Logshark successfully after following all the steps in the documentation.
When running:
C:\Program Files\Logshark>logshark C:\Users\Alex\Downloads\mylogs10262017 --startlocalmongo
Loading Logshark user configuration..
Target must be a valid file, directory or MD5 hash!
C:\Program Files\Logshark>

Please see below and let me know if there is anything wrong with my settings:

(two screenshots of the Logshark.config settings were attached)

Thank you

Tableau 2018.2 Linux

We moved to Tableau 2018.2 on Linux and it looks like the log structure has changed significantly. Is 2018.2/Linux supported? I get the errors below on all except the Topology and Full Config dashboards.

Initializing VizqlServer plugin..
Execution of VizqlServer plugin started at 12:00 PM..
0/{ItemsExpected} VizqlServerSession items have been persisted to the database. [{PercentComplete}]
Failed to persist any data from Vizqlserver logs!
Skipped saving workbooks for VizqlServer because the plugin did not generate any backing data.

Feature Request: Switch to Disable Query field getting truncated to 1024 characters

The Query column in the Vizql_End_Query table, and the Query_Abstract and Query_Compiled columns in the vizql_qp_batch_summary_job table, are being truncated to 1024 characters.

We'd like the complete query to be logged, as a partial query doesn't have much utility. An option to disable the truncation or set a new limit would be wonderful.

Thank you!
Ray

    public VizqlEndQuery(BsonDocument document)
    {
        ValidateArguments("end-query", document);

        SetEventMetadata(document);
        BsonDocument values = BsonDocumentHelper.GetValuesStruct(document);

        // The query text is truncated to 1024 characters here.
        Query = BsonDocumentHelper.TruncateString(BsonDocumentHelper.GetString("query", values), 1024);
        ProtocolId = BsonDocumentHelper.GetNullableLong("protocol-id", values);
        Cols = BsonDocumentHelper.GetInt("cols", values);
        Rows = BsonDocumentHelper.GetInt("rows", values);
        QueryHash = BsonDocumentHelper.GetNullableLong("query-hash", values);
        Elapsed = BsonDocumentHelper.GetDouble("elapsed", values);
    }

Logshark not working with Tableau Server 2018.1

Since we've upgraded to 2018.1, Logshark has been unable (or rather, "unwilling") to process the log files for our Tableau Server. The error thrown is:

[2018-05-14T02-20-30] Failed to initialize Logshark: No compatible artifact processor found for payload! Is this a valid logset?

I've looked into the issue and seem to have found that this is due to the checks being done in TableauServerLogProcessor.cs. This file is, if I'm not mistaken, looking for the buildversion.txt file to confirm that the logs are coming from server. As of 2018.1, it seems like this little guy isn't included in the ziplogs anymore.

And effectively, if I manually copy a buildversion.txt from an older (10.5) zip file into the new one, then run Logshark, it agrees to proceed normally.

After that, it doesn't finish correctly. It seems to get stuck on log files that have lines of over 100K characters. But perhaps it's more suitable to open a different issue for this?

System.AccessViolationException: Attempted to read or write protected memory with Logshark 3.0.1 and Tableau 2018.2

Hello,

New to Tableau/Logshark. The directions have mostly worked; however, after processing the Apache logs I get the following error (I think this is the relevant part; there are about 1,300 lines between these messages that look like things are working):

Initializing Backgrounder plugin..\r\nExecution of Backgrounder plugin started at 5:32 PM..\r\nProcessing Backgrounder job events..\r\nBuilding new extract 'BackgrounderJobs.hyper'..\r\nBuilding new extract 'BackgrounderJobErrors.hyper'..\r\nBuilding new extract 'BackgrounderExtractJobDetails.hyper'..\r\nBuilding new extract 'BackgrounderSubscriptionJobDetails.hyper'..\r\n2307 BackgrounderJob records have been persisted.\r\n4371 BackgrounderJob records have been persisted.\r\n6445 BackgrounderJob records have been persisted.\r\nFinished processing Backgrounder job events!\r\n*** exception in native code \r\nSystem.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.\r\n at com.tableausoftware.hyperextract.Extract.TabExtractClose(Pointer )\r\n",
"stdout_lines": [
"Loading Logshark user configuration..",
"Purging existing MongoDB data..",
"Starting local MongoDB instance on port 27017..",
"MongoDB started successfully!",
......lots of messages such as.......
"Completed processing of node1\backuprestore_0.20182.18.0627.2230523418254334175489\config\backuprestore_0.20182.18.0627.2230\topology.yml (4Kb) [00.10]",
...............................................................
"Apache execution completed! [23.38]",
"Initializing Backgrounder plugin..",
"Execution of Backgrounder plugin started at 5:32 PM..",
"Processing Backgrounder job events..",
"Building new extract 'BackgrounderJobs.hyper'..",
"Building new extract 'BackgrounderJobErrors.hyper'..",
"Building new extract 'BackgrounderExtractJobDetails.hyper'..",
"Building new extract 'BackgrounderSubscriptionJobDetails.hyper'..",
"2307 BackgrounderJob records have been persisted.",
"4371 BackgrounderJob records have been persisted.",
"6445 BackgrounderJob records have been persisted.",
"Finished processing Backgrounder job events!",
"
exception in native code ***",
"System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.",
" at com.tableausoftware.hyperextract.Extract.TabExtractClose(Pointer )"
]
}

Box this is being installed on is Windows 2016
Command used to install:
Setup_Logshark_v3.0.1.exe /install /quiet

Command used to run Logshark (happens with or without -p)
logshark C:\Users\bill\downloads\logs.zip -p --startlocalmongo

Logs were generated using tsm as part of nightly backup process

The apache logs/workbook published to the tableau server defined in Logshark.config

Logshark always tries to extract to C drive

Logshark is installed in the D drive. When I run it, it gives me "not enough free disk space available to unpack logs.zip (52 Gb available, 235 Gb required)."

I have 52 G available in the C drive, but well over 235 in the D drive. How can I get Logshark to unpack into the D drive instead?

Plug-ins running inconsistently

I'm seeing inconsistencies in the plug-in functionality when running Logshark against the same set of logs with the same arguments. For example the first time I ran Logshark against a particular set of logs the Apache plug-in ran properly and a workbook was generated. The second time I ran the same exact command immediately after the first run finished the Apache plug-in failed with the error "Encountered uncaught exception while executing plugin 'Apache': The operation has timed out". I'm seeing similar issues with the Config and Filestore plug-ins while the Backgrounder plug-in never runs successfully. The logs were generated from Tableau Server 10.5.1 and I'm using Logshark v2.0. Any guidance would be appreciated. Thank you.

Installation issue.

Hi,

When I try to install, it's prompting me to open a file. I'm not sure which file I need to open.
Please advise.

[FB24:F184][2017-06-20T18:43:05]e054: Failed to resolve source for file: C:\Program Files\Logshark v1.1.0.4.exe, error: 0x80070642.
[FB24:F184][2017-06-20T18:43:05]e000: Error 0x80070642: Failed while prompting for source (original path 'C:\Program Files\Logshark v1.1.0.4.exe').

Thanks,
Jagruthi

Logshark 2.0 workbooks use <hostname> in datasource connection, instead of <localhost>

As the title describes, I've just upgraded to Logshark 2.0 and noticed different behaviour. When opening a workbook generated by any of the default plugins, the data source connection to the Postgres DB now points at my computer's hostname rather than localhost. This is despite my logshark postgresconnection config looking like this:

<PostgresConnection tcpKeepalive="60">
  <Server address="localhost" port="5432" />
  <User username="logshark" password="logshark" />
</PostgresConnection>

Now, because the connection is using my machine's hostname, it's trying to resolve the hostname and giving me a pg_hba.conf error.


When I change the server name in the data source connection back to localhost, the connection works as expected.

Logshark 2.1 fails to load and run the replayer.dll plugin

Since I upgraded to Logshark 2.1, the Replayer plugin fails.

I copied the ReplayerCreation.dll into the logshark plugins folder as documented.

Running logshark produces the following messages:
Failed to load assembly 'D:\Program Files\Logshark\Plugins\ReplayCreation.dll': Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.
Loaded requested Logshark plugins: Apache, Backgrounder, ClusterController, Config, Filestore, Hyper, Netstat, Postgres, ResourceManager, SearchServer, Tabadmin, Vizportal, VizqlServer, VizqlServerPerformance
Loaded requested post-execution Logshark plugins: CustomWorkbooks

I will post this in the Replayer issue tracker as well.

Resource Manager plugin missing for Tableau 9.x versions

I am not able to see the Resource Manager plugin output in the Logshark Output folder for Tableau 9.x versions, but I can see it for 10.x versions.
I am using the normal "tabadmin ziplogs file.zip" command on both 9.x and 10.x versions. Is there any additional switch we need to use in order to generate the Resource Manager plugin output on 9.x versions?

Thank you

Failed publishing workbooks to Tableau Server v2018.3.2

Hello,

I am using Logshark. I can see the workbook on the local PC.
However, it cannot be published to Tableau Server (-p option).

This is an error message.

Publishing 1 workbook to Tableau Server..
Failed to publish workbooks: Unable to initialize Tableau Server for publishing: Failed to retrieve site ID for site 'default': Failed to retrieve successful response for GET request to 'http://xxxxxxxxxx/api/2.2/sites/default?key=name'

The setting is as follows.

Logshark.config

  • User is Server Administrator
  • Site is default

Tableau Server

  • REST API is enabled.
  • It is not SAML.

What is the cause of this error?

Logshark fails to persist any data from Backgrounder logs

Hello,

Since April we have been having problems with the Backgrounder workbook. We run Logshark every day and upload the results to the server. However, the Backgrounder workbook has not updated since April. The following error message is repeated regularly in the Backgrounder logs:

2018-05-31 02:13:31,516 fr02513vma_18053100100395_tabservprodlogzip (null) [1] INFO Logshark.Plugins.Backgrounder.Backgrounder - Failed to persist any data from Backgrounder logs!

The rest of the workbooks we check on a weekly basis are working fine, and this issue only affects the Backgrounder. We are currently using Tableau 10.5.

Please let me know if you need any additional information.

Thanks!

Cannot process logs with external MongoDB

Hi,

I just configured an external MongoDB. The Logshark run goes fine until the end, and then it shows the following error:
Logshark run failed: MongoDB database c1436dec9a9af2ea0380388b37872e26 contains no valid log data!

Then I re-ran it one more time; it found the hash (and was quicker), and here's what it says:

Logshark run complete! [01:07.86]
Logset hash for this run was 'c1436dec9a9af2ea0380388b37872e26'.
Logshark run failed: Unable to determine status of logset. Aborting..

Please share your thoughts on how to fix the issue.
Thanks

MongoDB fail

I have installed Logshark on my local laptop and am unable to process any logs at all. When I run Logshark I get the error message:
Unable to retrieve logset metadata from MongoDB: An exception occurred while opening a connection to the server. ---> System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it 127.0.0.1:27017

Is there anything I have done wrong or need to change?

Logshark - Not creating the workbook

I set up Logshark and PostgreSQL as described. When I try to run the command, I see the database in Postgres but no tables in it; an (empty) folder is created in Output\, but the workbooks are not created in the Output folder.

Please help. Below are the commands I run; I am running them on my local machine against the downloaded log files.

1.) logshark Target\logs.zip --plugins VizqlDesktop --startlocalmongo
2.) logshark Target\logs.zip --plugins VizqlDesktop

Also, I am trying to run only the VizqlDesktop plugin, as my scope of work is around that, and the size of the log files is pretty huge as well.

Any help is greatly appreciated.

Thanks,
Bala

Cluster Controller Postgres Commands Viz not populating

Hello all,

I am using Logshark and loving it! What a great tool for us "accidental" Tableau admins. In the ClusterController workbook, the Cluster Controller Postgres Commands viz is not populating. I think it should, because I got an email from the server: "Tableau Server: Status change for multiple services on "myserver"". When I look at the Cluster Controller Errors and Zookeeper Errors dashboards, they show errors that correspond to the time of the Tableau Server notification email. But nothing shows in the Cluster Controller Postgres Commands viz. Can anyone explain why?

btw, this is the error as displayed in the Cluster Controller Errors dashboard that must have caused the email:
Exception from ZooKeeper ruok: java.io.IOException: Exception while executing four letter word: ruok

Thanks,
Darren

Feature Request: Save output in PostgreSQL

Logshark v2.1 and below had the ability to save output in a PostgreSQL database. This made it possible to connect the Tableau reports to the PostgreSQL database, so it was easier to have a single-pane view of data for a range of dates in a single report.
Now, with Logshark v3.0 and v3.0.1, the outputs are stored in Hyper extracts instead of PostgreSQL. With this, I don't know if there's an option to see all the data for a given date range in a single report.

Not able to configure logshark with external Mongodb

Hi,

I am in the process of configuring Logshark with an external MongoDB. I have edited the Logshark.config as mentioned in the Guide. However, when I tried to run the following, I have been receiving an authentication error.

C:\Program Files\Logshark>logshark D:\Logs\logs.zip
Loading Logshark user configuration..
Failed to open connection to MongoDB database 'XX.XXX.XXX.XXX:27017': Command listDatabases failed: command listDatabases requires authentication.
Logshark run failed: Failed to open connection to MongoDB database 'XX.XXX.XXX.XXX:27017': Command listDatabases failed: command listDatabases requires authentication.
(IP hidden due to security reasons)

I did specify the credentials, but for some reason Logshark does not seem to be able to read them. Please let me know if I am missing anything here.

Appreciate your help.

Thanks,
Shruthi

Serialization error with hyper?

We have Tableau 10.5.3 and are using the new Logshark 2.1. We are getting an error; it's not a fatal error, but we're wondering if we should be worried, or if there is a configuration setting somewhere to change this.

Failed to process file 'hyper\hyper_2018_06_28_02_56_49-part2.log': Maximum serialization depth exceeded (does the object being serialized have a circular reference?).

Here's the logfile and the config file, stripped of server names, usernames, and passwords.
Logshark.config.txt
tableau_logshark.log.20180628.152256.txt

Logshark using APPDATA folder

Logshark 3.0.1 by default uses the local AppData folder to unpack files, with no option to change this. In my scenario, C: is a very small OS drive, with program files installed on D:.

Logshark fails in this instance as there is not enough space on C: to complete the process.


Fatal error - 1 is not a supported code page

Hi.
I just installed Logshark as described here: https://tableau.github.io/Logshark/docs/logshark_prefunc
When I run Logshark I get this:
D:\Program files\Logshark>logshark "d:\logs\11.zip" --startlocalmongo
Loading Logshark user configuration..
Purging existing MongoDB data..
Starting local MongoDB instance on port 27017..
MongoDB started successfully!
Preparing logset target 'd:\logs\hp.zip' for processing..
Encountered a fatal error while extracting logset: 1 is not a supported code page.
Parameter name: codepage
Shutting down local MongoDB process..
Logshark run complete! [00.16]

I tried reinstalling everything.
What am I doing wrong?

thanks.

Workbook names coming up as book1 rather than actual names

When I look at the Logshark workbook for the VizqlServerPerformance plugin and try to filter down to a workbook, I see that almost all of the workbooks are named book1. It is really weird, and I have nothing in the logs that says there is a book1 on my server.

My process was: I pointed TabJolt at my server to run some testing, then took a ziplogs, then ran Logshark, and now I only see book1 in the workbook. I have attached that output as an extract so I can share it here.

I have also provided the ziplogs for your review.

LogShark takes 4 days to process a 6GB log file

Hi,

Our server log is a 6GB zip file, and it takes Logshark 4 days to process it. The Logshark process is running on an m4.16xl EC2 instance and writing to an external Postgres database. It is currently using the built-in MongoDB. Can you advise on how I can speed up the processing time?

Thanks in advance for your suggestions.
Douglas

Logshark on AWS

We are looking into installing Logshark in an AWS environment. Do you have any recommendations? We plan on using RDS for the PostgreSQL database, but it looks like we will have to install MongoDB on a virtual server, right?

Failed to initialize Logshark request processor

Hi there

Even though I have created the logshark account and granted the relevant permission levels on the Postgres database, I am getting the error "Failed to initialize Logshark request processor! Please check to make sure that the results database was configured correctly."
An excerpt from the Logshark.log file is given below.


2017-06-05 11:56:21,064 (null) [1] DEBUG Logshark.CLI.Program - Logshark execution arguments: D:\Tableau\ServerLogs\NonProdServer_5-06-17.zip
2017-06-05 11:56:21,110 (null) [1] INFO Logshark.Config.LogsharkConfigReader - Loading Logshark user configuration..
2017-06-05 11:56:21,224 (null) [1] DEBUG Logshark.Connections.PostgresConnectionInfo - Attempting to open connection to Postgres database 'localhost:5432\postgres' using user account 'logshark'..
2017-06-05 11:56:21,293 (null) [1] FATAL Logshark.CLI.LogsharkCLI - Failed to initialize Logshark request processor!
Please check to make sure that the results database was configured correctly. (Exception has been thrown by the target of an invocation.)
2017-06-05 11:56:21,293 (null) [1] DEBUG Logshark.CLI.LogsharkCLI - Logshark.Exceptions.DatabaseInitializationException: Failed to initialize Logshark request processor!
Please check to make sure that the results database was configured correctly. ---> System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.InvalidOperationException: This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms.
at System.Security.Cryptography.MD5CryptoServiceProvider..ctor()
--- End of inner exception stack trace ---


Any help would be greatly appreciated.

Thanks
Samantha

Failed publishing workbooks to server

I'm able to run logshark and view the workbooks on the VM local machine. None of the workbooks were published to the server although a new project was created.

PostgreSQL and MongoDB maintenance

I have been using Logshark for a while now and have noticed that there are a number of databases in both PostgreSQL and MongoDB from each time I've run it. What would be the proper way to clean these up?

Not able to run Logshark to process the log files

Hi,

I am in the process of testing Logshark on my machine. I have edited Logshark.config as mentioned in the guide. However, when I try to run the following, I don't get anything:

C:\Program Files\Logshark>logshark C:\Users\Alex\Downloads\mylogs10262017 --startlocalmongo
Loading Logshark user configuration..
Target must be a valid file, directory or MD5 hash!
C:\Program Files\Logshark>

Please let me know if I am missing anything.

Thank you
Alex

Encountered exception while executing plugin 'Apache'... Input string was not in a correct format.

Recently, I have been having lots of issues with the Apache plugin not being able to complete successfully due to "Input string was not in a correct format."

2018-11-08 10:53:33,190 se140323_18110815075081_saigam20181108zip (null) [1] ERROR Logshark.Core.Controller.Plugin.PluginExecutor - Encountered uncaught exception while executing plugin 'Apache': An error occurred while deserializing the ContentLength property of class Logshark.Plugins.Apache.Model.HttpdRequest: Input string was not in a correct format. (Input string was not in a correct format.)
2018-11-08 10:53:33,210 se140323_18110815075081_saigam20181108zip (null) [1] DEBUG Logshark.Core.Controller.Plugin.PluginExecutor - System.FormatException: An error occurred while deserializing the ContentLength property of class Logshark.Plugins.Apache.Model.HttpdRequest: Input string was not in a correct format. ---> System.FormatException: Input string was not in a correct format.

Is this due to an improper log file? Doesn't Tableau generate these log files with default settings, which would impact all log files? Why only the httpd access log?

Anyway, if it is erroneous characters, could the process not skip over them and continue on, to at least complete the creation of the Hyper extracts and workbook?

Running Logshark in command line

Hi,

I recently came across this tool and wanted to get started. I have successfully installed the standalone Postgres instance and am trying to run Logshark from the command line with full admin rights. After opening cmd, I am using "C:\Program Files\Logshark\Logshark.exe", but this says No logset target specified.

I tried several combinations like C:\Program Files\Logshark>Logshark C:\Users\User1\Desktop\Tableau Ziplog\UAT Logs.zip -- plugin Apache.

But this doesn't give me any output file. I am sure that I am missing something here, but I couldn't get it working even after reading the user guide.

I would really appreciate it if you could help me with this trivial thing.

Thanks!

Realtime

Wondering if it is possible to ingest data into Logshark in near real time for monitoring and alerting?

Issues creating Replayer JSON file using Tableau v10.1.1 ziplogs and LogShark v3.0.1

Hi there,

We are upgrading our Tableau VM from v10.1.1 to v2018.3.3 and wish to do some performance testing. I am using the logs from Tableau v10.1.1 and running LogShark v3.0.1, with the following command:

logshark C:\Admin\logs\Logs_030419.zip --startlocalmongo --plugins ReplayCreation

I get the following error that I haven't seen before:

C:\Users\jholling>logshark C:\Admin\logs\Logs_030419.zip --startlocalmongo --plugins ReplayCreation
Loading Logshark user configuration..
A MongoDB process is already running. Attempting to shut it down..
Purging existing MongoDB data..
Starting local MongoDB instance on port 27017..
MongoDB started successfully!
Preparing logset target 'C:\Admin\logs\Logs_030419.zip' for processing..
Extracting contents of archive 'Logs_030419.zip' to 'C:\Users\jholling\AppData\Local\Temp\Logshark\19040413465583'.. (656.5Mb)
Extracted 464 files from 'Logs_030419.zip'. (10.8Gb unpacked)
Finished extracting required files from logset! Unpacked 11Gb out of 657Mb. [01:16.15]
Loaded 3 artifact processors: DesktopLogProcessor, ServerClassicLogProcessor, ServerTsmLogProcessor
Found matching artifact processor: ServerClassicLogProcessor
Computing logset hash..
Logset hash is '99f877ce9b726d908de3be06ef9de297'.
Loaded requested Logshark plugin: ReplayCreation
Found existing logset matching hash! Skipping extraction and parsing.
Initializing ReplayCreation plugin..
Execution of ReplayCreation plugin started at 1:48 PM..
Executing default 5 concurrent queries to get browser sessions
No path specified for Replay json file, it will be saved to default directory - C:\Program Files\Logshark\Output\tts-kirkby060_19040412465431_logs030419zip
No file name specified for Replay json file, the replay file will be saved as - Playback_04_04_-13-48-13.json
Executing ReplayCreation plugin
Querying Access logs for GET requests
Number of Query Results : 237480
Encountered uncaught exception while executing plugin 'ReplayCreation': Input string was not in a correct format.
ReplayCreation failed to execute successfully.
Finished executing plugins! [1 failure]
Logshark run complete! [01:26.04]
Logset hash for this run was '99f877ce9b726d908de3be06ef9de297'.

It worked a few months ago using the same setup, so I'm not sure why it isn't working now.

Also, we run a cleanup routine that exports a week's worth of ziplogs to a network location, and when we try to use these logs we get the following error:

C:\Users\jholling>logshark C:\Admin\logs\Logs_310319.zip --startlocalmongo --plugins ReplayCreation
Loading Logshark user configuration..
A MongoDB process is already running. Attempting to shut it down..
Purging existing MongoDB data..
Starting local MongoDB instance on port 27017..
MongoDB started successfully!
Preparing logset target 'C:\Admin\logs\Logs_310319.zip' for processing..
Extracting contents of archive 'Logs_310319.zip' to 'C:\Users\jholling\AppData\Local\Temp\Logshark\19040414250630'.. (2.2Gb)
Extracting nested archive 'remote_logs.zip' from 'Logs_310319.zip'.. (1039.8Mb)
Extracting nested archive 'inprogress_remote_logs.zip' from 'remote_logs.zip'.. (9.5Mb)
Extracted 9 files from 'inprogress_remote_logs.zip'. (45.4Mb unpacked)
Extracted 599 files from 'remote_logs.zip'. (17Gb unpacked)
Extracted 1324 files from 'Logs_310319.zip'. (36.6Gb unpacked)
Finished extracting required files from logset! Unpacked 37Gb out of 2Gb. [05:01.51]
Loaded 3 artifact processors: DesktopLogProcessor, ServerClassicLogProcessor, ServerTsmLogProcessor
Logshark run complete! [05:04.11]
Logshark run failed: No compatible artifact processor found for payload! Is this a valid logset?

If someone could help, I would be most appreciative.

Many thanks,

Error publishing to an SSL Tableau site

Error during publishing of the workbooks (none of the workbooks gets published).

If I have to publish a workbook using the TABADMIN CLI, I have to use the --no-certcheck option for it to publish. I am assuming this has something to do with that particular workaround.

2018-09-07 15:18:11,295 se140323_18090719173330_se120017tablogbackup1zip (null) [1] INFO Logshark.Core.Controller.Plugin.PluginExecutor - Saved workbook to 'F:\Logshark\Output\se140323_18090719173330_se120017tablogbackup1zip\Config.twb'!
2018-09-07 15:18:11,306 se140323_18090719173330_se120017tablogbackup1zip (null) [1] INFO Logshark.Core.Controller.Workbook.WorkbookPublisher - Publishing 1 workbook to Tableau Server..
2018-09-07 15:18:11,464 se140323_18090719173330_se120017tablogbackup1zip (null) [1] ERROR Logshark.Core.Controller.Plugin.PluginExecutor - Failed to publish workbooks: Unable to initialize Tableau Server for publishing: The underlying connection was closed: An unexpected error occurred on a receive.
2018-09-07 15:18:11,476 se140323_18090719173330_se120017tablogbackup1zip (null) [1] DEBUG Logshark.Core.Controller.Plugin.PluginExecutor - Logshark.Core.Exceptions.PublishingException: Unable to initialize Tableau Server for publishing: The underlying connection was closed: An unexpected error occurred on a receive. ---> Logshark.Core.Exceptions.PublishingException: Unable to initialize Tableau Server for publishing: The underlying connection was closed: An unexpected error occurred on a receive. ---> System.Net.WebException: The underlying connection was closed: An unexpected error occurred on a receive. ---> System.ComponentModel.Win32Exception: The client and server cannot communicate, because they do not possess a common algorithm
at System.Net.SSPIWrapper.AcquireCredentialsHandle(SSPIInterface SecModule, String package, CredentialUse intent, SecureCredential scc)
at System.Net.Security.SecureChannel.AcquireCredentialsHandle(CredentialUse credUsage, SecureCredential& secureCredential)
at System.Net.Security.SecureChannel.AcquireClientCredentials(Byte[]& thumbPrint)
at System.Net.Security.SecureChannel.GenerateToken(Byte[] input, Int32 offset, Int32 count, Byte[]& output)
at System.Net.Security.SecureChannel.NextMessage(Byte[] incoming, Int32 offset, Int32 count)
at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest)
at System.Net.Security.SslState.ForceAuthentication(Boolean receiveFirst, Byte[] buffer, AsyncProtocolRequest asyncRequest)
at System.Net.Security.SslState.ProcessAuthentication(LazyAsyncResult lazyResult)
at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Net.TlsStream.ProcessAuthentication(LazyAsyncResult result)
at System.Net.TlsStream.Write(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.ConnectStream.WriteHeaders(Boolean async)
--- End of inner exception stack trace ---
at System.Net.HttpWebRequest.GetRequestStream(TransportContext& context)
at System.Net.HttpWebRequest.GetRequestStream()
at Tableau.RestApi.ApiRequest.ToHttpWebRequest()
at Tableau.RestApi.ApiRequest.GetResponse(Int32 maxAttempts)
at Tableau.RestApi.ApiRequest.IssueRequest(String requestFailureMessage, Int32 maxAttempts)
at Tableau.RestApi.RestApiRequestor.Authenticate()
at Tableau.RestApi.RestApiRequestor.GetAuthToken()
at Tableau.RestApi.RestApiRequestor.GetSiteId()
at Logshark.Core.Controller.Workbook.WorkbookPublisher.InitializeProject(IRestApiRequestor requestor)
at Logshark.Core.Controller.Workbook.WorkbookPublisher.PublishWorkbooks(String pluginName, IEnumerable`1 workbooksToPublish)
--- End of inner exception stack trace ---
at Logshark.Core.Controller.Workbook.WorkbookPublisher.PublishWorkbooks(String pluginName, IEnumerable`1 workbooksToPublish)
at Logshark.Core.Controller.Workbook.WorkbookPublisher.PublishWorkbooks(IPluginResponse pluginResponse)
--- End of inner exception stack trace ---
at Logshark.Core.Controller.Workbook.WorkbookPublisher.PublishWorkbooks(IPluginResponse pluginResponse)
at Logshark.Core.Controller.Plugin.PluginExecutor.ExecutePlugin(Type pluginType, PluginExecutionRequest pluginExecutionRequest, IEnumerable`1 previousPluginResponses)
2018-09-07 15:18:11,478 se140323_18090719173330_se120017tablogbackup1zip (null) [1] ERROR Logshark.Core.Controller.Plugin.PluginExecutor - Config failed to execute successfully.
2018-09-07 15:18:11,478 se140323_18090719173330_se120017tablogbackup1zip (null) [1] INFO Logshark.Core.Controller.Plugin.PluginExecutor - Finished executing plugins! [1 failure]

__MACOSX folder breaking Logshark?

I've used Logshark successfully on several different sets of log files in the past, but I got an error on a recent set. I have read the other similar issue raised on here and confirmed the logs contained both a buildversion.txt and a workgroup.yml file.

There is, however, a difference: there is a folder called "__MACOSX" in the zip file. When I remove it, Logshark processes the logs correctly.

Not sure what that folder is doing there though? Thought I'd raise it here in case someone else sees this behaviour.

Feature Request: Installation as stand-alone or without Admin rights

Many of us managing Tableau Server installations are not admins of our own client machines (and sometimes not even of the server machines).

Unfortunately, I'm unable to install Logshark from my client machine, and am concerned about installing on my Server machine (because of performance, server stability, etc).

For that reason, I'd like to suggest a stand-alone or admin-less installation. (The same goes for Tableau Desktop!) I understand that it may not be technically practical, but it is important for the business.


Logshark 2.1 failed to run VizqlServerPerformance plugin

I found it very useful to run the VizqlServerPerformance plugin for more performance info.
To enable VizqlServerPerformance, you have to explicitly enable it in the logshark.config.

After upgrading to Logshark 2.1, this fails. Please check the attached partial logfile.

Please advise what to do to resolve this. Or is it a bug?

The top of the logfile shows that 'VizqlServer' ran OK, but 'VizqlServerPerformance' after it did not.
Execution of VizqlServer plugin started at 11:10 AM..
1070/1622 VizqlServerSession items have been persisted to the database. [65%]
1622/1622 VizqlServerSession items have been persisted to the database. [100%]
Saved workbook to 'D:\Program Files\Logshark\Output\itsmon01_18100208585203_wkplogs20180923zip\VizqlServer.twb'!
VizqlServer execution completed! [42.35]
Initializing VizqlServerPerformance plugin..
Execution of VizqlServerPerformance plugin started at 11:11 AM..
Exception processing end-visual-model-producer events on session 0A2F01C8906243BFAF839A8761C71695-0:0: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.ArgumentException: Logline key not of type end-visual-interpreter!
at Logshark.Plugins.Vizql.Models.VizqlEvent.ValidateArguments(String expectedKeyType, BsonDocument document)
at Logshark.Plugins.Vizql.Models.Events.Render.VizqlEndVisualInterpreter..ctor(BsonDocument document)
--- End of inner exception stack trace ---

logshark_partial_vizqlserverperformance_fails.txt

Appending to existing DB

I'm updating my Logshark database regularly using the script
logshark <logs.zip_name> --dbname <company_name> -f --parseall
I just want to confirm: if I use a log set with overlapping dates, does the data get duplicated in the Postgres DB, or is there sufficient intelligence to account for this (e.g. primary key updates based on identifiers)?

Publishing to Tableau server 2018.1

Hello,

When I try to get Logshark v2.1 to publish to Tableau Server 2018.1 I get the following error:

Publishing of workbook 'ClusterController.twb' failed: Failed to publish workbook 'ClusterController.twb': Failed to retrieve successful response for POST request to 'http://xx.xx.xx.xx/api/2.2/sites/0120d718-bef4-4057-b319-96b7b2334a28/workbooks?overwrite=True' after 3 attempts: The remote server returned an error: (403) Forbidden. [3 attempts made]

I've tried this with Explorer (can publish), site administrator explorer and server administrator permissions and I get the same error. I know the API is enabled as I ensured this was the case this morning.

Any ideas?

Thanks
Rob

fatal error while processing logset:

I have tried several iterations of running Logshark against a "tabadmin ziplogs -d MM/DD/YYYY" output. Even when I minimize the number of days to output logs for, I always end up receiving an error while processing the logset (near its final stages of processing), with the error in Logshark.log showing:

[1] DEBUG Logshark.CLI.LogsharkCLI - Logshark.Exceptions.InvalidLogsetException: Target does not appear to be a valid Tableau logset!

Feature Request: MongoDB bind to localhost

It would be nice to have the mongod included with the application bind to only localhost instead of all IPs for this type of use.
Possible implementations:

  • Give a cli option (--bind_ip) to pass to mongod.exe
  • Give a cli option to use a config file and include the current defaults in the config file
  • Upgrade mongo to v3.6 and leave the default - which appears to be bind to localhost only (https://jira.mongodb.org/browse/SERVER-28229)

Edited to add an obvious one that I hadn't included: the client caller will also have to change to accommodate any of the above three server changes.
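
For reference, binding a stock mongod to localhost only is a single command-line flag (a generic illustration of the first option above, not the exact command Logshark uses to launch its bundled MongoDB; the --dbpath value is hypothetical):

mongod --bind_ip 127.0.0.1 --port 27017 --dbpath C:\mongodata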

Not Running

Environment: Tableau 2018.3 3-node cluster

Installed the latest version of Logshark on a Windows Server 2012 R2 machine with admin privileges. Took a snapshot of the Tableau logs.

Executed the command below; it printed "Logshark user configuration.." on the console and immediately finished. I filled in the correct configurations too.
logshark log.zip -s -p

Checked the log file and it doesn't have any error messages apart from these. What could be happening? Please advise. Thanks

2019-01-23 22:45:13,388 (null) (null) [1] DEBUG Logshark.CLI.Program - Logshark execution arguments: C:\Temp\logs.zip -s -p
2019-01-23 22:45:13,481 (null) (null) [1] INFO Logshark.RequestModel.Config.LogsharkConfigReader - Loading Logshark user configuration..
2019-01-23 22:45:13,513 (null) (null) [1] DEBUG Logshark.RequestModel.Config.LogsharkConfigReader - Loaded user configuration!
