
fio-plot's Introduction

fio-plot

FIO is a tool for benchmarking storage devices. It helps to assess storage performance in terms of IOPS and latency.

Fio-plot generates charts from FIO storage benchmark data. It can process FIO output in JSON format, as well as FIO log file output (in CSV format). It also includes bench-fio, a benchmark tool that automates benchmarking with FIO. Check out the many examples below.

barchart

To make these charts yourself, you need to follow this process (a worked example follows the list):

  1. Run your tests, maybe use the included benchmark script bench-fio
  2. Determine which information you would like to show
  3. Run fio-plot to generate the images with the appropriate command line options
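
A minimal end-to-end sketch of these steps (the device name, output folder and 4k subfolder below are placeholders, not taken from the examples; the exact output folder layout depends on your bench-fio settings):

bench-fio --target /dev/sdX --type device --mode randread --output BENCHDATA --destructive
fio-plot -i BENCHDATA/sdX/4k -T "Example run" -l -r randread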

Quick installation guide:

Ubuntu 18.04+ LTS: please run this command first:

apt install zlib1g-dev libjpeg-dev python3-pip

All operating systems:

pip3 install fio-plot 

If you want to use the benchmark script bench-fio, make sure to install Fio too.
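
On Debian/Ubuntu, for example, fio is available as a distribution package:

apt install fio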

If you don't want to install fio-plot system-wide, you can make a virtual environment like this:

cd /desired/path
python3 -m venv fio-plot
source fio-plot/bin/activate
pip3 install fio-plot

When you source the virtual environment, fio-plot and bench-fio will be in your executable path.

If you want to install from source, you can clone the repository and run

python3 setup.py install

Configuration: command-line vs. INI

Fio-plot supports configuration through command-line parameters or using an INI format configuration file. The examples provided in the following sections use command-line parameters.

This is how you use an INI configuration file (instead):

fio-plot /path/to/fio-plot.ini 

An example INI is included as the fio_plot/templates/fio-plot.ini file. It looks like this:

[graphtype]
graphtype = bargraph2d_qd

[settings]
input_directory = /path/to/benchmarkdata
output_filename = test.png
title = Title
subtitle = 
source = https://louwrentius.com
rw = randread
type = 
...
  • The fio-plot --help command explains the usage of the parameters available in the INI.
  • You can't use both the INI file and command-line options; you have to pick one.

2D chart (iodepth)

This kind of chart shows both IOPs and Latency for different queue depths. barchart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610 --source "https://louwrentius.com" -T "INTEL D3-S4610 SSD on IBM M1015" -l -r randread

2D chart (numjobs)

This kind of chart shows both IOPs and Latency for different numbers of simultaneous jobs. barchart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610 --source "https://louwrentius.com" -T "INTEL D3-S4610 SSD on IBM M1015" -N -r randread

2D chart to compare benchmark results

The compare chart shows the results from multiple different benchmarks in one graph. The graph data is always for specific queue depth and numjobs values (the examples use qd=1, nj=1, the defaults).

barchart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610 SAMSUNG_860_PRO KINGSTON_DC500M SAMSUNG_PM883 --source "https://louwrentius.com" -T "Comparing the performance of various Solid State Drives" -C -r randread --xlabel-parent 0

It is also possible to group the bars for IOPs and Latency like this:

barchart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610 SAMSUNG_860_PRO KINGSTON_DC500M SAMSUNG_PM883 --source "https://louwrentius.com" -T "Comparing the performance of various Solid State Drives" -C -r randread --xlabel-parent 0 --group-bars

3D chart

A 3D bar chart that plots both queue depth and numjobs against either latency or IOPs. This example shows IOPs.

3dbarchart

This is the command-line used to generate this graph:

fio-plot -i RAID10 --source "https://louwrentius.com"  -T "RAID10 performance of 8 x WD Velociraptor 10K RPM" -L -t iops -r randread

It is also possible to chart the latency:

3dbarchart

This is the command-line used to generate this graph:

fio-plot -i RAID10 --source "https://louwrentius.com"  -T "RAID10 performance of 8 x WD Velociraptor 10K RPM" -L -t lat -r randread

Line chart based on FIO log data

Fio records a 'performance trace' of various metrics, such as IOPs and latency over time, in plain-text .log files. If you use the benchmark tool included with fio-plot, this data is logged every second.
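
If you run fio directly instead of using bench-fio, such per-second logs can be enabled with fio's logging options. A minimal job-file sketch (the log file prefix is illustrative; it is chosen to match the file name requirements described later in this README):

[global]
write_bw_log=randread-iodepth-8-numjobs-1
write_iops_log=randread-iodepth-8-numjobs-1
write_lat_log=randread-iodepth-8-numjobs-1
log_avg_msec=1000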

This data can be parsed and graphed over time. In this example, we plot the data for four different solid state drives in one chart.

linechart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610/ KINGSTON_DC500M/ SAMSUNG_PM883/ SAMSUNG_860_PRO/ --source "https://louwrentius.com"  -T "Comparing IOPs performance of multiple SSDs" -g -t iops -r randread --xlabel-parent 0

It is also possible to chart the latency instead of IOPs.

linechart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610/ KINGSTON_DC500M/ SAMSUNG_PM883/ SAMSUNG_860_PRO/ --source "https://louwrentius.com"  -T "Comparing latency performance of multiple SSDs" -g -t lat -r randread --xlabel-parent 0

You can also include all information in one graph:

linechart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610/ KINGSTON_DC500M/ --source "https://louwrentius.com"  -T "Comparing performance of multiple SSDs" -g -t iops lat -r randread --xlabel-parent 0    

And this is an example with a single benchmark run, comparing the performance of multiple queue depths.

linechart

This is the command-line used to generate this graph:

fio-plot -i INTEL_D3-S4610 --source "https://louwrentius.com"  -T "Comparing multiple queue depths" -g -t iops lat -r randread -d 1 8 16  --xlabel-parent 0    

It is also possible to chart the total of the read+write data (iops/bw/lat) with the --draw-total option. This only works for -g style graphs and requires a 'randrw' benchmark that is not 100% read; it must contain write data.

linechart

This is the command-line used to generate this graph:

fio-plot -i . -T "TEST" -r randrw -g -t iops --draw-total

Latency histogram

The FIO JSON output also contains latency histogram data. It is available in ns, us, and ms scales.

histogram

This is the command-line used to generate this graph:

fio-plot -i SAMSUNG_860_PRO/ --source "https://louwrentius.com"  -T "Histogram of SSD" -H -r randread -d 16 -n 16

Fio client/server mechanism

Fio supports a client-server model where one fio client can run a benchmark on multiple machines (servers) in parallel. The bench-fio tool supports this type of benchmark; see its README for more details. The fio-plot tool automatically renders the data per hostname.

csdemo

csdemo

The --include-hosts and --exclude-hosts parameters allow filtering to only display the desired hosts.
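
For example, a hedged sketch of limiting a chart to two hosts (the hostnames and the exact argument form are illustrative, not taken from the examples):

fio-plot -i <benchmark_data_folder> -T "Title" -l -r randread --include-hosts host01 host02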

Benchmark script

A benchmark script is provided alongside fio-plot, that automates the process of running multiple benchmarks with different parameters. For example, it allows you to gather data for different queue depths and/or number of simultaneous jobs. The benchmark script shows progress in real-time.

████████████████████████████████████████████████████
		+++ Fio Benchmark Script +++

Job template:                  fio-job-template.fio
I/O Engine:                    libaio
Number of benchmarks:          98
Estimated duration:            1:38:00
Devices to be tested:          /dev/md0
Test mode (read/write):        randrw
IOdepth to be tested:          1 2 4 8 16 32 64
NumJobs to be tested:          1 2 4 8 16 32 64
Blocksize(s) to be tested:     4k
Mixed workload (% Read):       75 90

████████████████████████████████████████████████████
4% |█                        | - [0:04:02, 1:35:00]-]

This particular example benchmark was run with these parameters:

bench-fio --target /dev/md0 --type device --template fio-job-template.fio  --mode randrw --output RAID_ARRAY --readmix 75 90  --destructive

In this example, we run a mixed random read/write benchmark. We have two runs, one with a 75% / 25% read/write mix and one with a 90% / 10% mix.

You can run the benchmark against an entire device or a file/folder. Alongside the benchmark script, a Fio job template file is supplied (fio-job-template.fio). This file can be customised as desired.
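
For example, a hedged sketch of a run against a directory instead of a raw device (the path, output folder and --type value are assumptions, not taken from the example above):

bench-fio --target /mnt/testdata --type directory --template fio-job-template.fio --mode randread --output TESTDATA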

For more examples, please consult the separate README.md

Dependencies

Fio-plot requires 'matplotlib' and 'numpy' to be installed.

Please note that Fio-plot requires at least matplotlib version 3.3.0.

Fio-plot also writes metadata to the PNG files using Pillow.

Fio-plot additional usage examples

2D Bar Charts

Creating a 2D Bar Chart based on randread data and numjobs = 1 (default).

fio-plot -i <benchmark_data_folder> -T "Title" -s https://louwrentius.com -l -r randread

regularbars

Creating a 2D Bar Chart based on randread data and numjobs = 8.

fio-plot -i <benchmark_data_folder> -T "Title" -s https://louwrentius.com -l -n 8 -r randread

Creating a 2D Bar Chart grouping iops and latency data together:

fio-plot -i <benchmark_data_folder> -T "Title" -s https://louwrentius.com -l -r randread --group-bars

groupedbars

3D Bar Chart

Creating a 3D graph showing IOPS.

fio-plot -i <benchmark_data_folder> -T "Title" -s https://louwrentius.com -L -r randread -t iops

Creating a 3D graph with a subselection of data

fio-plot -i <benchmark_data_folder> -T "Title" -s https://louwrentius.com -L -r randread -t iops -J 16 -M 16

2D Bar Histogram

Creating a latency histogram with a queue depth of 1 and numjobs is 1.

fio-plot -i <benchmark_data_folder> -T "Title" -s https://louwrentius.com -H -r randread -d 1 -n 1

2D line charts

Creating a line chart from different benchmark runs in a single folder

fio-plot -i <benchmark_data_folder>  -T "Test" -g -r randread -t iops lat -d 1 8 16 -n 1

The same result but if you want markers to help distinguish between lines:

fio-plot -i <benchmark_data_folder>  -T "Test" -g -r randread -t iops lat -d 1 8 16 -n 1 --enable-markers

markers

It is also possible to change the line colors with the --colors parameter.

fio-plot -i <benchmark_data_folder> -T "Test" -g -r randread -t iops -d 1 2 4 8 --colors xkcd:red xkcd:blue xkcd:green tab:purple

Please note that you need to specify a color for each line drawn. In this example, four lines are drawn.

You can find a list of color names here. There is also a list of xkcd colors here (xkcd:'color name').

Comparing two or more benchmarks based on JSON data (2D Bar Chart):

A simple example where we compare the iops and latency of a particular iodepth and numjobs value:

fio-plot -i <folder_a> <folder_b> <folder_c> -T "Test" -C -r randwrite -d 8 

compare01

The bars can also be grouped:

compare03

There is also an option (--show-cpu) that includes a table with CPU usage:

comparecpu

It is now also possible to show steady state statistics (--show-ss) if you ran a Fio benchmark with steady state options.

steadystatechart
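
A hedged example of combining these options with the compare chart (the folder names are placeholders; --show-ss only produces output if the underlying fio run collected steady state data):

fio-plot -i <folder_a> <folder_b> -T "Test" -C -r randwrite -d 8 --show-cpu --show-ss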

Comparing two or more benchmarks in a single line chart

Create a line chart based on data from two different folders (but the same benchmark parameters)

fio-plot -i <benchmark_data_folder A> <benchmark_data_folder B>  -T "Test" -g -r randread -t iops lat -d 8 -n 1

This assumes that the benchmark data was created with the (included) bench-fio tool.

For example, you can run a benchmark on a RAID10 setup and store data in folder A. Store the benchmark data for a RAID5 setup in folder B and you can compare the results of both RAID setups in a single Line graph.

Please note that the folder names are used in the graph to distinguish the datasets.

multipledataset

Command used:

fio-plot -i ./IBM1015/RAID10/4k/ ./IBM1015/RAID5/4k/ -T "Comparing RAID 10 vs. RAID 5 on 10,000 RPM Drives" -s https://louwrentius.com -g -r randread -t iops lat -d 8 -n 1

If you use the bench-fio tool to generate benchmark data, you may notice that you end up with folders like:

IBM1015/RAID10/4k
IBM1015/RAID5/4k

Those parent folders are used to distinguish and identify the lines from each other. The labels are based on the parent folder names as you can see in the graph. By default, we use only one level deep, so in this example only RAID10/4k or RAID5/4k are used. If we want to include the folder above that (IBM1015) we use the --xlabel-parent parameter like so:

fio-plot -i ./IBM1015/RAID10/4k/ ./IBM1015/RAID5/4k/ -T "Comparing RAID 10 vs. RAID 5 on 10,000 RPM Drives" -s https://louwrentius.com -g -r randread -t iops lat -d 8 -n 1 -w 1 --xlabel-parent 2

This would look like:

labellength

Some additional examples to explain how you can trim the labels to contain exactly the directories you want:

The default:

RAID10/4k

Is equivalent to --xlabel-parent 1 --xlabel-depth 0. So by default, the parent folder is included. If you strip off the 4k folder with --xlabel-depth 1, you'll notice that the label becomes:

IBM1015/RAID10 

This is because the default --xlabel-parent is 1 and the index now starts at 'RAID10'.

If you want to strip off the 4k folder but not include the IBM1015 folder, you need to be explicit about that:

--xlabel-parent 0 --xlabel-depth 1 

Results in:

RAID10

Example:

shortlabel

JSON / LOG file name requirements

Fio-plot parses the filename of the generated .log files. The format is:

[rwmode]-iodepth-[iodepth]-numjobs-[numjobs]_[fio generated type].[numjobs job id].log

An example:

randwrite-iodepth-8-numjobs-8_lat.1.log
randwrite-iodepth-8-numjobs-8_lat.2.log
randwrite-iodepth-8-numjobs-8_lat.3.log
randwrite-iodepth-8-numjobs-8_lat.4.log
randwrite-iodepth-8-numjobs-8_lat.5.log
randwrite-iodepth-8-numjobs-8_lat.6.log
randwrite-iodepth-8-numjobs-8_lat.7.log
randwrite-iodepth-8-numjobs-8_lat.8.log 

In this example, there are 8 files because numjobs was set to 8. Fio automatically generates a file for each job. If you don't use the included benchmark script, it's important to make sure the files are generated with the appropriate file name structure.
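
If you invoke fio directly, this is a rough sketch of an invocation that produces log files matching this naming scheme (the target, runtime and JSON output name are illustrative; the JSON name here simply mimics the mode-iodepth-numjobs pattern seen in bench-fio output):

fio --name=randwrite-iodepth-8-numjobs-8 --rw=randwrite --iodepth=8 --numjobs=8 \
    --write_bw_log=randwrite-iodepth-8-numjobs-8 \
    --write_iops_log=randwrite-iodepth-8-numjobs-8 \
    --write_lat_log=randwrite-iodepth-8-numjobs-8 \
    --log_avg_msec=1000 --output-format=json --output=randwrite-8-8.json \
    --filename=/dev/sdX --runtime=60 --time_based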

PNG metadata

All settings used to generate the PNG file are incorporated into the PNG file as metadata (tEXt). This should help you keep track of the exact parameters and data used to generate the graphs. This metadata can be viewed with ImageMagick like this:

identify -verbose filename.png

This is a fragment of the output:

Properties:
    compare_graph: True
    date:create: 2020-09-28T16:27:08+00:00
    date:modify: 2020-09-28T16:27:07+00:00
    disable_grid: False
    dpi: 200
    enable_markers: False
    filter: ('read', 'write')
    histogram: False
    input_directory: /Users/MyUserName/data/WDRAID5 /Users/MyUserName/data/WDRAID10
    iodepth: 16
    bargraph3d: False
    latency_iops_2d: False
    line_width: 1
    loggraph: False
    maxdepth: 64
    maxjobs: 64


fio-plot's Issues

Error handling of missing parameters and target paths is missing

nan03@Mini:~/projects/fio-plot/fio_plot$ ./fio_plot -i ../benchmark_script/benchmarks/sda1/4k/ -T "Title" -s https://louwrentius.com -l -n 1 -r randread
Traceback (most recent call last):
  File "./fio_plot", line 141, in <module>
    main()
  File "./fio_plot", line 125, in main
    list_of_json_files = jsonimport.list_json_files(settings)
  File "/Users/nan03/projects/fio-plot/fio_plot/fiolib/jsonimport.py", line 11, in list_json_files
    files = os.listdir(absolute_dir)
FileNotFoundError: [Errno 2] No such file or directory: '/Users/nan03/projects/fio-plot/benchmark_script/benchmarks/sda1/4k'

[Feature req] configuration file

There are a ton of options.
Could be useful to pass a configuration file/ini/yaml instead of a long line.
This could be useful to integrate iterations for multiple jobs:

fio:
  - name: pippo
    target: /data
    target_type: directory
    #output
    #template
    block_size: 4k
    iodepth: "1 2 4 8 16 32 64"
    numjobs: "1 2 4 8 16 32 64"
    duration: 60
    modes:
      - randread
      - randwrite
  - name: topolino
    target: /data2
    target_type: directory
    #output
    #template
    block_size: 4k
    iodepth: "1 2 4 8 16 32 64"
    numjobs: "1 2 4 8 16 32 64"
    duration: 60
    modes:
      - randread
      - randwrite

Unable to recreate graph

I am trying to recreate the line graph comparing two setups, like the RAID10 vs. RAID5 case in your example, but I seem to be unable to obtain the same output.

Following your example

./fio_plot -i ../../RAID10 ../../RAID5 -T "Comparing RAID 10 vs. RAID 5 on 10,000 RPM Drives" -s https://louwrentius.com -g -r randread -t iops lat -d 8 -n 1 

The command I'm running is:

./fio_plot -i results/A/4k/ results/B/4k/  -T "RAIDZ2 6+6 vs RAIDZ2 6+6+LOG+CACHE" -g -r randread -t iops lat -d 8 -n 1

but the output only shows data for one of them, and there is no legend at the bottom.

image

Am I doing something wrong?

TypeError: object of type 'NoneType' has no len()

When creating a histogram exactly according to the example command line in README.md (i.e., fio_plot -i something/4k/ -T "Title" -H -r randread -d 1 -n 1), I get the following error:

Traceback (most recent call last):
  File "//home/rubin/Syncthing/Source/Other/fio-plot/fio_plot/fio_plot", line 262, in <module>
    main()
  File "//home/rubin/Syncthing/Source/Other/fio-plot/fio_plot/fio_plot", line 250, in main
    uses_json_files[item]['function'](settings, parsed_data)
  File "/mnt/s/Source/Other/fio-plot/fio_plot/fiolib/barhistogram.py", line 120, in chart_latency_histogram
    sourcelength = len(settings['source'])
TypeError: object of type 'NoneType' has no len()

Graphs skewed when comparing more than 4 attributes

Hello, I see you have a 2D picture where you are able to plot 7 iodepths in a single chart. I am trying to plot a graph that shows IOPS/Latency for 6 different VM sizes, and the graph looks very skewed and abnormally long.

Could you let us know how to adjust the graph sizing?

Here is the command I am using:

./fio_plot -i /dir1 /dir2 /dir3 /dir4 /dir5 /dir6 -T "Read-iodepth" -C -r read -n 2 -d 8

Support docker image

I think it's a good idea to run benchmarks with fio and save the results to a file, then use fio-plot to visualize the results.

In my situation, I got a broken-dependencies error when I tried to install libjpeg-devel on CentOS 7:

Error: Package: libjpeg-turbo-devel-1.2.90-8.el7.x86_64 (base)
           Requires: libjpeg-turbo(x86-64) = 1.2.90-8.el7
           Installed: libjpeg-turbo-1.2.90-5.el7.x86_64 (@base)
               libjpeg-turbo(x86-64) = 1.2.90-5.el7
           Available: libjpeg-turbo-2.0.2-1.x86_64 (sankuai)
               libjpeg-turbo(x86-64) = 2.0.2-1
 You could try using --skip-broken to work around the problem

So I think generating the plot in a Docker container could fix this issue.

fio-plot Could not find any (matching) JSON files in the specified directory + Solution (K8s)

I generated JSON-formatted data by running an fio benchmark (not using the script provided here). Here is a snippet from the JSON file:

{
"fio version" : "fio-3.28",
"timestamp" : 1636479730,
"timestamp_ms" : 1636479730127,
"time" : "Tue Nov 9 17:42:10 2021",
"global options" : {
"directory" : "/scratch/fio-759f77f784-rvkxg",
"ioengine" : "libaio",
"direct" : "1",
"size" : "10G",
"iodepth" : "64",
"numjobs" : "100",
"bs" : "512K",
"rw" : "randrw",
"runtime" : "1800",
"name" : "rwlatency-test-job"
},
"jobs" : [
{
"jobname" : "testjob",
.
. < removed lines till the end of file >
.
}

I have fio-plot installed system-wide using pip3 install fio-plot on Ubuntu 20.04.3 LTS, and updated PATH to point to /home//.local/bin/. I have stored the fio results JSON file test2.json under /tmp/fioout/, and when running fio-plot on the directory that contains the JSON file, I get the following error message.

$ fio-plot -i ./fioout/ -T "rwlatency-test-job" -d 64 -n 100 -l -r randrw

Could not find any (matching) JSON files in the specified directory /tmp/fioout

Are the correct directories specified?

If so, please check the -d ([64]) -n ([100]) and -r (randrw) parameters.

Can someone please tell me what I might be doing wrong here?

Add explanations for warning and ideally also for numjobs and ioqueue, remove images directory

1:

See:

if mean > 1000:

"WARNING: the storage could not keep up with the configured I/O request size. Data is interpolated."

Can you explain what this means in detail? Does it mean your data is wrong? It would be good to have a paragraph covering this: what it means, how it occurs, and whether you can/should avoid it.

2:

The settings iodepth and numjobs can be specified; to my understanding, they mean:

  • iodepth: how many requests fio will initiate without waiting for acknowledgement;
  • numjobs: how many concurrent processes are writing and reading to (different parts of) the disk;

My personal big question mark here is: does iodepth apply per process (numjobs) or to fio "globally"? Intuitively I would think per process, but I'm not sure, and even after hours of googling I couldn't find a good explanation for it.

3:

Could the images directory not be removed or cleaned up? I ask this for two reasons:

  • It contains files with characters that are not valid for filenames on Windows (:), making it annoying to copy the fio-plot repository around in multi-operating-system environments
  • It seems to me to be data of a more personal, historical nature than the project itself; the project doesn't need it, and it makes the repo unnecessarily large

Thanks for the great work. I just did a big performance measurement project with the help of your tools; big fan.

Unable to produce plots

Hi!
I'm trying to compare multiple logs, but I encounter this problem:

Could not find any (matching) JSON files in the specified directory /out/cephfs/4k

What am I doing wrong?

/fio-plot # /fio-plot/fio_plot/fio_plot -i $(find /out -type d -mindepth 2 -maxdepth 2|xargs) -T "Test" -C -r randwrite -d 8
Could not find any (matching) JSON files in the specified directory /out/cephfs/4k
/fio-plot # find  /out/cephfs/4k
/out/cephfs/4k
/out/cephfs/4k/randread-1-2.json
/out/cephfs/4k/randread-iodepth-1-numjobs-2_lat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_slat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_clat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_lat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_slat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_clat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_bw.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_iops.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_bw.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_iops.2.log
/out/cephfs/4k/randread-1-4.json
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.4.log
/out/cephfs/4k/randread-1-8.json
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.8.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.8.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.8.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.8.log
/out/cephfs/4k/randread-2-2.json
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_lat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_lat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_slat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_clat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_slat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_clat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_bw.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_iops.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_bw.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_iops.2.log
/out/cephfs/4k/randread-2-4.json
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.4.log
/out/cephfs/4k/randread-2-8.json
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.8.log
/out/cephfs/4k/randread-8-2.json
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_lat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_lat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_slat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_clat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_slat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_clat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_bw.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_iops.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_bw.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_iops.2.log
/out/cephfs/4k/randread-8-4.json
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.4.log
/out/cephfs/4k/randread-8-8.json
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.8.log
/out/cephfs/4k/randwrite-1-2.json
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_lat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_lat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_slat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_slat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_clat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_clat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_bw.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_iops.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_bw.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_iops.2.log
/out/cephfs/4k/randwrite-1-4.json
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.4.log
/out/cephfs/4k/randwrite-1-8.json
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.8.log
/out/cephfs/4k/randwrite-2-2.json
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_lat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_slat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_clat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_lat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_slat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_clat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_bw.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_iops.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_bw.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_iops.2.log
/out/cephfs/4k/randwrite-2-4.json
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.4.log
/out/cephfs/4k/randwrite-2-8.json
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.8.log
/out/cephfs/4k/randwrite-8-2.json
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_lat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_lat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_slat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_clat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_slat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_clat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_bw.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_iops.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_bw.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_iops.2.log
/out/cephfs/4k/randwrite-8-4.json
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.4.log
/out/cephfs/4k/randwrite-8-8.json
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.8.log
/out/cephfs/4k/randwr-1-2.json
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.8.log
/fio-plot # ^C
/fio-plot # ^C
/fio-plot # find /out
/out
/out/vmware
/out/vmware/4k
/out/vmware/4k/randread-1-2.json
/out/vmware/4k/randread-iodepth-1-numjobs-2_lat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_lat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_slat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_clat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_slat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_clat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_bw.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_iops.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_bw.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-2_iops.2.log
/out/vmware/4k/randread-1-4.json
/out/vmware/4k/randread-iodepth-1-numjobs-4_lat.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_lat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_lat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_slat.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_slat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_slat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_clat.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_clat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_clat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_lat.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_slat.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_clat.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_bw.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_iops.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_bw.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_iops.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_bw.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_iops.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_bw.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-4_iops.4.log
/out/vmware/4k/randread-1-8.json
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.8.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.6.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.7.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.5.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.8.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.6.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.7.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.5.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.8.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.6.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.7.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.5.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_lat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_slat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_clat.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.1.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.2.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.3.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.4.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.5.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.5.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.6.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.6.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.7.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.7.log
/out/vmware/4k/randread-iodepth-1-numjobs-8_bw.8.log
/out/vmware/4k/randread-2-2.json
/out/vmware/4k/randread-iodepth-1-numjobs-8_iops.8.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_lat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_lat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_slat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_slat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_clat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_clat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_bw.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_iops.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_bw.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-2_iops.2.log
/out/vmware/4k/randread-2-4.json
/out/vmware/4k/randread-iodepth-2-numjobs-4_lat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_lat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_lat.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_lat.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_slat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_slat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_slat.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_clat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_slat.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_clat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_clat.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_clat.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_bw.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_iops.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_bw.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_iops.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_bw.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_iops.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_bw.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-4_iops.4.log
/out/vmware/4k/randread-2-8.json
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.6.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.8.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.7.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.5.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.6.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.8.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.7.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.5.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.6.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.8.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.7.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.5.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_lat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_slat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_clat.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.1.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.2.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.3.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.4.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.5.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.5.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.6.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.6.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.7.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.7.log
/out/vmware/4k/randread-iodepth-2-numjobs-8_bw.8.log
/out/vmware/4k/randread-8-2.json
/out/vmware/4k/randread-iodepth-2-numjobs-8_iops.8.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_lat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_lat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_slat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_clat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_slat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_clat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_bw.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_iops.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_bw.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-2_iops.2.log
/out/vmware/4k/randread-8-4.json
/out/vmware/4k/randread-iodepth-8-numjobs-4_lat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_lat.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_lat.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_lat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_slat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_slat.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_slat.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_clat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_clat.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_slat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_clat.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_clat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_bw.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_iops.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_bw.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_iops.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_bw.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_iops.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_bw.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-4_iops.4.log
/out/vmware/4k/randread-8-8.json
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.6.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.8.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.5.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_lat.7.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.6.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.8.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.5.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_slat.7.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.6.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.8.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.5.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_clat.7.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.1.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.2.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.3.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.4.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.5.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.5.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.6.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.6.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.7.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.7.log
/out/vmware/4k/randread-iodepth-8-numjobs-8_bw.8.log
/out/vmware/4k/randwrite-1-2.json
/out/vmware/4k/randread-iodepth-8-numjobs-8_iops.8.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_lat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_slat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_clat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_lat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_slat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_clat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_bw.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_iops.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_bw.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-2_iops.2.log
/out/vmware/4k/randwrite-1-4.json
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_lat.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_lat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_lat.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_slat.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_slat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_clat.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_slat.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_clat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_clat.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_lat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_slat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_clat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_bw.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_iops.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_bw.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_iops.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_bw.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_iops.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_bw.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-4_iops.4.log
/out/vmware/4k/randwrite-1-8.json
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.8.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.7.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.8.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.8.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.7.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.7.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.6.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.5.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.6.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.5.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.6.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.5.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_lat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_slat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_clat.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.1.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.2.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.3.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.4.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.5.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.5.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.6.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.6.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.7.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.7.log
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_bw.8.log
/out/vmware/4k/randwrite-2-2.json
/out/vmware/4k/randwrite-iodepth-1-numjobs-8_iops.8.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_lat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_lat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_slat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_slat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_clat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_clat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_bw.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_iops.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_bw.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-2_iops.2.log
/out/vmware/4k/randwrite-2-4.json
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_lat.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_slat.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_lat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_lat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_clat.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_slat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_clat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_slat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_clat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_lat.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_slat.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_clat.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_bw.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_iops.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_bw.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_iops.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_bw.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_iops.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_bw.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-4_iops.4.log
/out/vmware/4k/randwrite-2-8.json
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.6.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.6.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.6.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.8.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.5.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.8.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.8.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.5.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.7.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.5.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.7.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.7.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_lat.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_slat.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_clat.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.1.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.2.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.3.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.4.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.5.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.5.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.6.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.6.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.7.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.7.log
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_bw.8.log
/out/vmware/4k/randwrite-8-2.json
/out/vmware/4k/randwrite-iodepth-2-numjobs-8_iops.8.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_lat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_slat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_lat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_clat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_slat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_clat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_bw.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_iops.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_bw.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-2_iops.2.log
/out/vmware/4k/randwrite-8-4.json
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_lat.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_slat.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_clat.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_lat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_lat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_slat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_slat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_clat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_clat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_lat.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_slat.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_clat.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_bw.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_iops.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_bw.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_iops.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_bw.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_iops.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_bw.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-4_iops.4.log
/out/vmware/4k/randwrite-8-8.json
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.6.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.6.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.6.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.5.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.8.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.5.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.5.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.8.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_lat.7.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.8.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_slat.7.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_clat.7.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.1.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.2.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.3.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.4.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.5.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.5.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.6.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.6.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.7.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.7.log
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_bw.8.log
/out/vmware/4k/randwr-1-2.json
/out/vmware/4k/randwrite-iodepth-8-numjobs-8_iops.8.log
/out/ceph
/out/ceph/4k
/out/ceph/4k/randread-1-2.json
/out/ceph/4k/randread-iodepth-1-numjobs-2_lat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_slat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_clat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_lat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_slat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_clat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_bw.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_iops.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_bw.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-2_iops.2.log
/out/ceph/4k/randread-1-4.json
/out/ceph/4k/randread-iodepth-1-numjobs-4_lat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_slat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_clat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_lat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_lat.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_slat.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_clat.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_lat.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_slat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_clat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_slat.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_clat.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_bw.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_iops.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_bw.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_iops.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_bw.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_iops.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_bw.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-4_iops.4.log
/out/ceph/4k/randread-1-8.json
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.7.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.7.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.8.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.7.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.6.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.6.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.6.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.5.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.8.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.8.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.5.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.5.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_lat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_slat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_clat.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.1.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.2.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.3.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.4.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.5.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.5.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.6.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.6.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.7.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.7.log
/out/ceph/4k/randread-iodepth-1-numjobs-8_bw.8.log
/out/ceph/4k/randread-2-2.json
/out/ceph/4k/randread-iodepth-1-numjobs-8_iops.8.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_lat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_slat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_clat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_lat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_slat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_clat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_bw.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_iops.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_bw.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-2_iops.2.log
/out/ceph/4k/randread-2-4.json
/out/ceph/4k/randread-iodepth-2-numjobs-4_lat.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_slat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_clat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_lat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_lat.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_slat.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_lat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_clat.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_slat.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_clat.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_slat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_clat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_bw.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_iops.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_bw.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_iops.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_bw.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_iops.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_bw.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-4_iops.4.log
/out/ceph/4k/randread-2-8.json
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.7.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.8.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.6.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.6.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.6.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.5.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_lat.8.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.8.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.5.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.5.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_slat.7.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_clat.7.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.1.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.2.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.3.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.4.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.5.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.5.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.6.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.6.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.7.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.7.log
/out/ceph/4k/randread-iodepth-2-numjobs-8_bw.8.log
/out/ceph/4k/randread-8-2.json
/out/ceph/4k/randread-iodepth-2-numjobs-8_iops.8.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_lat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_slat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_lat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_clat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_slat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_clat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_bw.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_iops.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_bw.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-2_iops.2.log
/out/ceph/4k/randread-8-4.json
/out/ceph/4k/randread-iodepth-8-numjobs-4_lat.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_slat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_clat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_lat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_slat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_lat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_clat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_slat.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_clat.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_lat.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_slat.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_clat.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_bw.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_iops.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_bw.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_iops.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_bw.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_iops.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_bw.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-4_iops.4.log
/out/ceph/4k/randread-8-8.json
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.8.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.7.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.7.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.7.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.6.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.6.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.6.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_lat.5.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.5.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.5.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.8.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.8.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_slat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_clat.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.1.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.2.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.3.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.4.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.5.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.5.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.6.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.6.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.7.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.7.log
/out/ceph/4k/randread-iodepth-8-numjobs-8_bw.8.log
/out/ceph/4k/randwrite-1-2.json
/out/ceph/4k/randread-iodepth-8-numjobs-8_iops.8.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_lat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_slat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_clat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_lat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_slat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_clat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_bw.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_iops.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_bw.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-2_iops.2.log
/out/ceph/4k/randwrite-1-4.json
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_lat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_slat.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_slat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_clat.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_clat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_lat.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_slat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_clat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_lat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_lat.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_slat.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_clat.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_bw.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_iops.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_bw.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_iops.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_bw.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_iops.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_bw.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-4_iops.4.log
/out/ceph/4k/randwrite-1-8.json
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.6.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.5.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.6.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.5.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.6.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.5.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.8.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.8.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.8.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.7.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.7.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.7.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_lat.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_slat.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_clat.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.1.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.2.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.3.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.4.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.5.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.5.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.6.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.6.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.7.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.7.log
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_bw.8.log
/out/ceph/4k/randwrite-2-2.json
/out/ceph/4k/randwrite-iodepth-1-numjobs-8_iops.8.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_lat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_slat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_slat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_clat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_clat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_lat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_bw.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_iops.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_bw.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-2_iops.2.log
/out/ceph/4k/randwrite-2-4.json
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_lat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_slat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_lat.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_clat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_slat.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_clat.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_lat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_slat.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_slat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_clat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_clat.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_lat.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_bw.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_iops.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_bw.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_iops.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_bw.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_iops.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_bw.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-4_iops.4.log
/out/ceph/4k/randwrite-2-8.json
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.8.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.8.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.7.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.6.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.7.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.8.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.6.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.7.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.6.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_lat.5.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_slat.5.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_clat.5.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.1.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.2.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.3.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.4.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.5.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.5.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.6.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.6.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.7.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.7.log
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_bw.8.log
/out/ceph/4k/randwrite-8-2.json
/out/ceph/4k/randwrite-iodepth-2-numjobs-8_iops.8.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_lat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_slat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_clat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_lat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_slat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_clat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_bw.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_iops.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_bw.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-2_iops.2.log
/out/ceph/4k/randwrite-8-4.json
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_lat.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_slat.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_slat.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_clat.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_clat.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_lat.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_slat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_clat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_lat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_lat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_slat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_clat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_bw.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_iops.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_bw.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_iops.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_bw.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_iops.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_bw.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-4_iops.4.log
/out/ceph/4k/randwrite-8-8.json
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.7.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.5.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.5.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.5.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.8.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.8.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.8.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.6.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_slat.6.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.6.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.7.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_clat.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.7.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_lat.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.1.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.2.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.3.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.4.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.5.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.5.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.6.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.6.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.7.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.7.log
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_bw.8.log
/out/ceph/4k/randwr-1-2.json
/out/ceph/4k/randwrite-iodepth-8-numjobs-8_iops.8.log
/out/cephfs
/out/cephfs/4k
/out/cephfs/4k/randread-1-2.json
/out/cephfs/4k/randread-iodepth-1-numjobs-2_lat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_slat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_clat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_lat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_slat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_clat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_bw.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_iops.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_bw.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-2_iops.2.log
/out/cephfs/4k/randread-1-4.json
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_lat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_slat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_clat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_bw.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-4_iops.4.log
/out/cephfs/4k/randread-1-8.json
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.8.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.8.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.8.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_lat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_slat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_clat.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.1.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.2.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.3.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.4.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.5.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.6.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.7.log
/out/cephfs/4k/randread-iodepth-1-numjobs-8_bw.8.log
/out/cephfs/4k/randread-2-2.json
/out/cephfs/4k/randread-iodepth-1-numjobs-8_iops.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_lat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_lat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_slat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_clat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_slat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_clat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_bw.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_iops.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_bw.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-2_iops.2.log
/out/cephfs/4k/randread-2-4.json
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_lat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_slat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_clat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_bw.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-4_iops.4.log
/out/cephfs/4k/randread-2-8.json
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_lat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.8.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_slat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_clat.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.1.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.2.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.3.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.4.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.5.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.6.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.7.log
/out/cephfs/4k/randread-iodepth-2-numjobs-8_bw.8.log
/out/cephfs/4k/randread-8-2.json
/out/cephfs/4k/randread-iodepth-2-numjobs-8_iops.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_lat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_lat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_slat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_clat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_slat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_clat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_bw.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_iops.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_bw.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-2_iops.2.log
/out/cephfs/4k/randread-8-4.json
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_lat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_slat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_clat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_bw.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-4_iops.4.log
/out/cephfs/4k/randread-8-8.json
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_lat.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_slat.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_clat.8.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.1.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.2.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.3.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.4.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.5.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.6.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.7.log
/out/cephfs/4k/randread-iodepth-8-numjobs-8_bw.8.log
/out/cephfs/4k/randwrite-1-2.json
/out/cephfs/4k/randread-iodepth-8-numjobs-8_iops.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_lat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_lat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_slat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_slat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_clat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_clat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_bw.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_iops.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_bw.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-2_iops.2.log
/out/cephfs/4k/randwrite-1-4.json
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_lat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_slat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_clat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_bw.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-4_iops.4.log
/out/cephfs/4k/randwrite-1-8.json
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.8.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_lat.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_slat.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_clat.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.1.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.2.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.3.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.4.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.5.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.6.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.7.log
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_bw.8.log
/out/cephfs/4k/randwrite-2-2.json
/out/cephfs/4k/randwrite-iodepth-1-numjobs-8_iops.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_lat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_slat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_clat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_lat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_slat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_clat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_bw.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_iops.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_bw.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-2_iops.2.log
/out/cephfs/4k/randwrite-2-4.json
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_lat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_slat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_clat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_bw.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-4_iops.4.log
/out/cephfs/4k/randwrite-2-8.json
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.8.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_lat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_slat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_clat.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.1.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.2.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.3.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.4.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.5.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.6.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.7.log
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_bw.8.log
/out/cephfs/4k/randwrite-8-2.json
/out/cephfs/4k/randwrite-iodepth-2-numjobs-8_iops.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_lat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_lat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_slat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_clat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_slat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_clat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_bw.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_iops.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_bw.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-2_iops.2.log
/out/cephfs/4k/randwrite-8-4.json
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_lat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_slat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_clat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_bw.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-4_iops.4.log
/out/cephfs/4k/randwrite-8-8.json
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.8.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_lat.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_slat.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_clat.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.1.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.2.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.3.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.4.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.5.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.6.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.7.log
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_bw.8.log
/out/cephfs/4k/randwr-1-2.json
/out/cephfs/4k/randwrite-iodepth-8-numjobs-8_iops.8.log
/fio-plot #

2D chart ValueError: 'center right' is not a valid value for align; supported values are 'center', 'right', 'left'

./fio_plot -i ../benchmark_script/benchmarks/sda1/4k/ -T "Title" -s https://louwrentius.com -l -n 1 -r randread

Traceback (most recent call last):
  File "./fio_plot", line 141, in <module>
    main()
  File "./fio_plot", line 129, in main
    uses_json_files[item](settings, parsed_data)
  File "/scripts/fio-plot/fio_plot/fiolib/bar2d.py", line 62, in chart_2dbarchart_jsonlogdata
    shared.create_stddev_table(data, ax2)
  File "/scripts/fio-plot/fio_plot/fiolib/shared_chart.py", line 153, in create_stddev_table
    rasterized=False)
  File "/usr/local/lib64/python3.6/site-packages/matplotlib/table.py", line 802, in table
    loc=cellLoc)
  File "/usr/local/lib64/python3.6/site-packages/matplotlib/table.py", line 337, in add_cell
    cell = Cell(xy, visible_edges=self.edges, *args, **kwargs)
  File "/usr/local/lib64/python3.6/site-packages/matplotlib/table.py", line 99, in __init__
    horizontalalignment=loc, verticalalignment='center')
  File "/usr/local/lib64/python3.6/site-packages/matplotlib/text.py", line 157, in __init__
    self.set_horizontalalignment(horizontalalignment)
  File "/usr/local/lib64/python3.6/site-packages/matplotlib/text.py", line 950, in set_horizontalalignment
    cbook._check_in_list(['center', 'right', 'left'], align=align)
  File "/usr/local/lib64/python3.6/site-packages/matplotlib/cbook/__init__.py", line 2250, in _check_in_list
    .format(v, k, ', '.join(map(repr, values))))
ValueError: 'center right' is not a valid value for align; supported values are 'center', 'right', 'left'

Name: matplotlib
Version: 3.3.0
Summary: Python plotting package
Home-page: https://matplotlib.org
Author: John D. Hunter, Michael Droettboom
Author-email: [email protected]
License: PSF
Location: /usr/local/lib64/python3.6/site-packages
Requires: cycler, pillow, python-dateutil, pyparsing, numpy, kiwisolver
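
A possible workaround, on the assumption that this stricter alignment validation is new in matplotlib 3.3 and that earlier releases still accept the value used by the table code: pin matplotlib to an older release until the alignment value is adjusted, for example:

    pip3 install 'matplotlib<3.3'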

Decreasing --runtime (time duration per test) does not decrease total time of bench_fio run

I ran the following command, which took ~12 minutes, with the default --runtime (60):

~/fio-plot/benchmark_script/bench_fio --target /tmp/testdir --type directory --mode randread randwrite --output . --iodepth 1 2 4 8 16 32 64 128 256 512 1024 2048 --numjobs 1 2 3 4 5 6 7 8 9 10 -s 20M -j ~/fio-plot/benchmark_script/fio-job-template.fio
                                +++ Fio Benchmark Script +++

Estimated duration            : 4:00:00
Test target                   : /tmp/testdir
Job template                  : /root/fio-plot/benchmark_script/fio-job-template.fio
I/O Engine                    : libaio
Test mode (read/write)        : randread randwrite
IOdepth to be tested          : 1 2 4 8 16 32 64 128 256 512 1024 2048
NumJobs to be tested          : 1 2 3 4 5 6 7 8 9 10
Block size                    : 4k
Direct I/O                    : 1
Specified test data size      : 20M
Precondition template         : precondition.fio
Time duration per test (s)    : 60
Benchmark loops               : 1
Log interval of perf data (ms): 500
Invalidate buffer cache       : 1
Target type                   : directory
Output folder                 : .
Number of benchmarks          : 240

███████████████████████████████████████████████████████████████████████████████████████████████
 100% |█████████████████████████|   [0:11:32, 0:00:00]-]

Next, I ran the same command with --runtime 10 (10 seconds).

Shouldn't it take less time than the first test, which was run with the default 60 seconds per test?
Why does it show the same estimated time remaining with a 10-second --runtime?

Command in test2:
~/fio-plot/benchmark_script/bench_fio --target /tmp/testdir --type directory --mode randread randwrite --output . --iodepth 1 2 4 8 16 32 64 128 256 512 1024 2048 --numjobs 1 2 3 4 5 6 7 8 9 10 -s 20M -j ~/fio-plot/benchmark_script/fio-job-template.fio --runtime 10


Estimated duration            : 0:40:00
Test target                   : /tmp/testdir
Job template                  : /root/fio-plot/benchmark_script/fio-job-template.fio
I/O Engine                    : libaio
Test mode (read/write)        : randread randwrite
IOdepth to be tested          : 1 2 4 8 16 32 64 128 256 512 1024 2048
NumJobs to be tested          : 1 2 3 4 5 6 7 8 9 10
Block size                    : 4k
Direct I/O                    : 1
Specified test data size      : 20M
Precondition template         : precondition.fio
Time duration per test (s)    : 10
Benchmark loops               : 1
Log interval of perf data (ms): 500
Invalidate buffer cache       : 1
Target type                   : directory
Output folder                 : .
Number of benchmarks          : 240

███████████████████████████████████████████████████████████████████████████████████████████████
   9% |██▎                      | | [0:01:10, 0:11:37]-]

-t bw option produces an error on fio_plot

When I use -t bw to get bandwidth, it gives me an error; the lat and iops options work fine.

Traceback (most recent call last):
  File "./fio_plot", line 135, in <module>
    main()
  File "./fio_plot", line 123, in main
    uses_json_files[item](settings, parsed_data)
  File "/spice/data/users/itnw/fio-plot-master/fio_plot/fiolib/bar3d.py", line 28, in plot_3d
    rw, metric)
  File "/spice/data/users/itnw/fio-plot-master/fio_plot/fiolib/shared_chart.py", line 45, in get_record_set_3d
    row.append(record[metric])
KeyError: 'bw'

3D graphs are transposed

Proof: run with non-square data (numjobs != iodepth), e.g. using the following script:

./fio_plot/fio_plot -i benchmark_data/HPDL380G8/HBA/SAMSUNG_860PRO -T randread -r randread -L -t iops
rm "benchmark_data/HPDL380G8/HBA/SAMSUNG_860PRO/"randread-64*.json
./fio_plot/fio_plot -i benchmark_data/HPDL380G8/HBA/SAMSUNG_860PRO -T randread -r randread -L -t iops
# observe difference in left part of graph

Will submit PR shortly

Support sequential mixed read/write?

Neither the benchmark_script nor the plot module supports the readwrite mode. This is from the fio documentation:

readwrite=str, rw=str
Type of I/O pattern. Accepted values are:
  • read
    Sequential reads.

  • write
    Sequential writes.
  • trim
    Sequential trims (Linux block devices and SCSI character devices only).
  • randread
    Random reads.
  • randwrite
    Random writes.
  • randtrim
    Random trims (Linux block devices and SCSI character devices only).
  • rw,readwrite
    Sequential mixed reads and writes.
  • randrw
    Random mixed reads and writes.
  • trimwrite
    Sequential trim+write sequences. Blocks will be trimmed first, then the same blocks will be written to.

I don't know why only randrw is supported; what about supporting rw too?

[Feature Request] SSD Preconditioning

As far as I know, if we want to get reliable results, we should prepare the SSD before running a benchmark.
This is usually called 'SSD preconditioning'.

The benchmark_script within fio-plot is a fancy tool, but it doesn't contain a preconditioning option.
So I run my own preconditioning shell script before executing fio-plot's benchmark script.

I think it would be more convenient if SSD preconditioning were integrated with fio-plot's benchmark script.
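
For reference, a minimal example of the kind of sequential-fill pass that is typically used for preconditioning; the device path is only a placeholder and this command destroys all data on it:

    fio --name=precondition --filename=/dev/nvme0n1 --rw=write --bs=1M --iodepth=32 --ioengine=libaio --direct=1 --loops=2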

plot 3d graph from randrw benchmark

Hi,
I ran the following bench_fio command:

 
██████████████████████████████████████████████████████████████████████████████████████████████
                                +++ Fio Benchmark Script +++

Estimated duration            : 0:42:00
Test target                   : /nfstest
Job template                  : /home/azureuser/fio-plot/benchmark_script/fio-job-template.fio
I/O Engine                    : libaio
Test mode (read/write)        : randrw
IOdepth to be tested          : 1 2 4 8 16 32 64
NumJobs to be tested          : 1 2 4 8 16 32
Block size                    : 4k
Direct I/O                    : 1
Specified test data size      : 20M
Precondition template         : precondition.fio
Read/write mix in %% read     : 80
Time duration per test (s)    : 60
Benchmark loops               : 1
Log interval of perf data (ms): 500
Invalidate buffer cache       : 1
Target type                   : directory
Output folder                 : .
Number of benchmarks          : 42

██████████████████████████████████████████████████████████████████████████████████████████████
 100% |█████████████████████████|   [0:05:51, 0:00:00]-]

I can't figure out how to plot a 3D graph from the data; the command just fails. An example of what I've tried:

$ ~/fio-plot/fio_plot/fio_plot -i ./nfstest/randrw80/4k/ -T "rhel84 lvm linear" -d 1 2 4 8 16 32 64 -n 2 4 8 16 32 64 -L -t iops -r randrw
Since we are processing randrw data, you must specify a filter for either read or write data, not both.
$ ~/fio-plot/fio_plot/fio_plot -i ./nfstest/randrw80/4k/ -T "rhel84 lvm linear" -d 1 2 4 8 16 32 64 -n 2 4 8 16 32 64 -L -t iops -r write     
Could not find any (matching) JSON files in the specified directory /home/azureuser/nfstest/randrw80/4k

Are the correct directories specified?

If so, please check the -d ([1, 2, 4, 8, 16, 32, 64]) -n ([2, 4, 8, 16, 32, 64]) and -r (write) parameters.

Thank you for any hint.
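
Going only by the messages above, a command that keeps -r randrw, adds a read/write filter (as the first error message asks), and restricts -n to the numjobs values that were actually benchmarked (1 2 4 8 16 32) might look like this (untested suggestion):

    ~/fio-plot/fio_plot/fio_plot -i ./nfstest/randrw80/4k/ -T "rhel84 lvm linear" -d 1 2 4 8 16 32 64 -n 1 2 4 8 16 32 -L -t iops -r randrw --filter write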

unicode error on benchmark script

No matter what combination of options I try on the benchmark script, I get this error:

Traceback (most recent call last):
  File "./bench_fio", line 385, in <module>
    main()
  File "./bench_fio", line 378, in main
    display_header(settings, tests)
  File "./bench_fio", line 297, in display_header
    print(f"\u2588" * (fl + width))
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-77: ordinal not in range(128)

Any ideas? I'm running the script on a rhel7.7 server.
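
A workaround that may help when the terminal locale is ASCII-only, assuming the environment rather than the script is the problem: force Python to emit UTF-8 before running the script, for example:

    export PYTHONIOENCODING=utf-8

or set a UTF-8 locale (export LC_ALL=en_US.UTF-8) in the shell that runs bench_fio.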

fio_plot: ZeroDivisionError: division by zero

./bench_fio -d /data -t directory -s 1g --mode randread -o /data/benchmarks --iodepth 1 2 4 8 16 32 64 --numjobs 1 --block-size 16k

./fio_plot/fio_plot -i /data/benchmarks/data/16k -T "Apple SSD AP0256M BLOCK_SIZE 16K On Vagrant" -s https://louwrentius.com -l -n 1 -r randread

Traceback (most recent call last):
  File "/app/fio_plot/fio_plot", line 151, in <module>
    main()
  File "/app/fio_plot/fio_plot", line 139, in main
    uses_json_files[item](settings, parsed_data)
  File "/app/fio_plot/fiolib/bar2d.py", line 14, in chart_2dbarchart_jsonlogdata
    data = shared.get_record_set(settings, dataset, dataset_types,
  File "/app/fio_plot/fiolib/shared_chart.py", line 99, in get_record_set
    lat_stddev_percent = supporting.raw_stddev_to_percent(
  File "/app/fio_plot/fiolib/supporting.py", line 222, in raw_stddev_to_percent
    percent = round((int(y) / int(x)) * 100, 0)
ZeroDivisionError: division by zero
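
A minimal sketch of the kind of guard that would avoid this crash, assuming a zero mean should simply be reported as 0% (this is not the actual fio-plot code):

    def raw_stddev_to_percent(x, y):
        # x is the mean, y the standard deviation; a mean of zero would
        # otherwise trigger the ZeroDivisionError shown above.
        if int(x) == 0:
            return 0
        return round((int(y) / int(x)) * 100, 0)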

Stack trace when specifying higher iodepth values

I ran this bench_fio command:
~/fio-plot/benchmark_script/bench_fio --target /tmp/testdir --type directory --mode randread randwrite --output . --iodepth 1 2 4 8 16 32 64 128 256 512 1024 2048 --numjobs 1 -s 20M -j ~/fio-plot/benchmark_script/fio-job-template.fio

which produced folder with logs and json files:

[root@centos1 tmp]# find . -name "*.json" -ls
 17679641      8 -rw-r--r--   1  root     root         7369 Sep  5 09:51 ./testdir/4k/randread-1-1.json
 17679679      8 -rw-r--r--   1  root     root         7375 Sep  5 09:51 ./testdir/4k/randread-2-1.json
 17696196      8 -rw-r--r--   1  root     root         7354 Sep  5 09:51 ./testdir/4k/randread-4-1.json
 17696202      8 -rw-r--r--   1  root     root         7355 Sep  5 09:51 ./testdir/4k/randread-8-1.json
 17696213      8 -rw-r--r--   1  root     root         7361 Sep  5 09:51 ./testdir/4k/randread-16-1.json
 17696219      8 -rw-r--r--   1  root     root         7371 Sep  5 09:51 ./testdir/4k/randread-32-1.json
 17696225      8 -rw-r--r--   1  root     root         7378 Sep  5 09:51 ./testdir/4k/randread-64-1.json
 17696231      8 -rw-r--r--   1  root     root         7376 Sep  5 09:51 ./testdir/4k/randread-128-1.json
 17696237      8 -rw-r--r--   1  root     root         7399 Sep  5 09:51 ./testdir/4k/randread-256-1.json
 17696243      8 -rw-r--r--   1  root     root         7408 Sep  5 09:51 ./testdir/4k/randread-512-1.json
 17696249      8 -rw-r--r--   1  root     root         7414 Sep  5 09:51 ./testdir/4k/randread-1024-1.json
 17696255      8 -rw-r--r--   1  root     root         7417 Sep  5 09:51 ./testdir/4k/randread-2048-1.json
 17696325      8 -rw-r--r--   1  root     root         7331 Sep  5 09:51 ./testdir/4k/randwrite-1-1.json
 17696331      8 -rw-r--r--   1  root     root         7340 Sep  5 09:51 ./testdir/4k/randwrite-2-1.json
 17696337      8 -rw-r--r--   1  root     root         7350 Sep  5 09:51 ./testdir/4k/randwrite-4-1.json
 17696343      8 -rw-r--r--   1  root     root         7356 Sep  5 09:51 ./testdir/4k/randwrite-8-1.json
 17696349      8 -rw-r--r--   1  root     root         7368 Sep  5 09:51 ./testdir/4k/randwrite-16-1.json
 17696355      8 -rw-r--r--   1  root     root         7376 Sep  5 09:51 ./testdir/4k/randwrite-32-1.json
 17696361      8 -rw-r--r--   1  root     root         7390 Sep  5 09:52 ./testdir/4k/randwrite-64-1.json
 17696367      8 -rw-r--r--   1  root     root         7407 Sep  5 09:52 ./testdir/4k/randwrite-128-1.json
 17696373      8 -rw-r--r--   1  root     root         7402 Sep  5 09:52 ./testdir/4k/randwrite-256-1.json
 17768760      8 -rw-r--r--   1  root     root         7416 Sep  5 09:52 ./testdir/4k/randwrite-512-1.json
 17768766      8 -rw-r--r--   1  root     root         7420 Sep  5 09:52 ./testdir/4k/randwrite-1024-1.json
 17763012      8 -rw-r--r--   1  root     root         7421 Sep  5 09:52 ./testdir/4k/randwrite-2048-1.json

However when I run fio_plot command:

 ~/fio-plot/fio_plot/fio_plot -i ./testdir/4k/ -T "TESTRUN" -d 1 2 4 8 16 32 64 128 256 512 1024 2048 -L -t iops -r randread

or
~/fio-plot/fio_plot/fio_plot -i ./testdir/4k/ -T "TESTRUN" -L -t iops -r randread

it only produces a graph for iodepths 1-64.

I found that it fails for iodepths higher than 64:

[root@centos1 tmp]# ~/fio-plot/fio_plot/fio_plot -i ./testdir/4k/ -T "TESTRUN" -d 64 -L -t iops -r randread

 Saving to file TESTRUN_2021-09-05_100527_fT.png

[root@centos1 tmp]# ~/fio-plot/fio_plot/fio_plot -i ./testdir/4k/ -T "TESTRUN" -d 128 -L -t iops -r randread
Traceback (most recent call last):
  File "/root/fio-plot/fio_plot/fio_plot", line 36, in <module>
    main()
  File "/root/fio-plot/fio_plot/fio_plot", line 29, in main
    routing_dict[item]["function"](settings, data)
  File "/root/fio-plot/fio_plot/fiolib/bar3d.py", line 90, in plot_3d
    xpos, ypos = np.meshgrid(xpos - (size / lx), ypos - (size * (ly / lx)))
ZeroDivisionError: float division by zero
[root@centos1 tmp]#

Am I doing/using something wrong?

temp.zip

-t bw not working

I am trying to generate a read bandwidth comparison graph, but I always get an IOPS graph. Is this an error in the tool, or is there something wrong with my command? (Using git master.)

Thank You.

@rescue ~/fio-plot/benchmark_script # ../fio_plot/fio_plot -i plain/sd*/* -T "SegRead" -C -r read -t bw lat -n 4 -d 32

 Saving to file SegRead_2021-04-26_201149_Nz.png

SegRead_2021-04-26_201149_Nz

my data was generated with

~/fio-plot/benchmark_script # ./bench_fio --target /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2 --type device --mode read write -b 1024k --output plain --direct=1 --numjobs 4 --runtime 10 --iodepth 32

Getting a key error when trying to generate plots

Hi, I am running version 3.15 of fio and getting the following error when trying to generate plots:

$ ./fio-plot.py -i UpdatedFio/hammerTime/ --latency_iops_2d --numjobs 1
Traceback (most recent call last):
  File "./fio-plot.py", line 740, in <module>
    main()
  File "./fio-plot.py", line 727, in main
    b.chart_iops_latency('randread')
  File "./fio-plot.py", line 640, in chart_iops_latency
    self.getStats()
  File "./fio-plot.py", line 602, in getStats
    'iops': self.get_nested_value(record,m['iops']),
  File "./fio-plot.py", line 568, in get_nested_value
    dictionary = dictionary[item]
KeyError: ''

Is this a known issue?

TypeError: 'NoneType' object is not subscriptable

After running the following bench_fio invocation:

./bench_fio --target /root/test --type directory --size 500MiB --output /root/out --iodepth 1 2 4 8 16 32 --numjobs 1 2 4 8 16 32 --rwmixread 50 --direct 1 --invalidate 1 --engine posixaio

Followed by this fio_plot command (using -L to request the 3D graph):

./fio_plot  -i ../../edge/  -L -r randread -T Test

I'm getting the following error from fio_plot:

Traceback (most recent call last):
  File "./fio_plot", line 180, in <module>
    main()
  File "./fio_plot", line 147, in main
    run_preflight_checks(settings)
  File "./fio_plot", line 123, in run_preflight_checks
    if settings['type'][0] not in ['iops', 'lat']:
TypeError: 'NoneType' object is not subscriptable
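
Judging only from the preflight check in the traceback (settings['type'][0] must be 'iops' or 'lat'), the 3D graph seems to require a -t argument, so an invocation along these lines might get past this error (untested suggestion):

    ./fio_plot -i ../../edge/ -L -r randread -t iops -T Test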

Specified target folder for benchmarking is not actually used by FIO

I ran it with the options below, and it wrote the 100g iotest.0.0 file into the local directory instead of into /var/tmp.

ls -l
total 104857868
-rw-r--r-- 1 root root 5993 Apr 22 15:42 README.md
drwxr-xr-x 2 root root 4096 Apr 22 15:02 __pycache__
-rwxr-xr-x 1 root root 1164 Apr 22 15:42 bench-fio.sh
-rwxr-xr-x 1 root root 14619 Apr 22 15:42 bench_fio
lrwxrwxrwx 1 root root 11 Apr 22 16:15 bench_fio.py -> ./bench_fio
-rwxr-xr-x 1 root root 1538 Apr 22 15:42 bench_fio_test.py
-rw-r--r-- 1 root root 420 Apr 22 15:42 fio-job-template.fio
-rw-r--r-- 1 root root 107374182400 Apr 22 16:40 iotest.0.0
-rw-r--r-- 1 root root 6 Apr 22 15:42 requirements.txt
drwxr-xr-x 4 root root 4096 Apr 22 16:36 spice_test

./bench_fio --target /var/tmp --type folder --mode randread randwrite -b 4k -s 100g --output spice_test --iodepth 1 --numjobs 1
██████████████████████████████████████████████████████████████████████████████
+++ Fio Benchmark Script +++

Job template: ./fio-job-template.fio
I/O Engine: libaio
Number of benchmarks: 2
Estimated duration: 0:02:00
Devices to be tested: /var/tmp
Test mode (read/write): randread randwrite
IOdepth to be tested: 1
NumJobs to be tested: 1
Block size(s) to be tested: 4k
Time per test (s): 60
File size: 100g

██████████████████████████████████████████████████████████████████████████████
100% |█████████████████████████| [0:03:58, 0:00:00]-]

Running fio_plot - no files found matching parameter "randrw"

I have both FIO and fio-plot installed
FIO: fio-3.19-48-g3966
fio_plot - latest version cloned a few days ago
Centos: CentOS Linux release 7.7.1908 (Core)

I run the following:
./bench_fio --target /mnt/afs-nfsv4 --type directory --size 5G --template fio-job-template.fio --iodepth 1 8 16 --numjobs 16 --mode randrw --output NFS_TEST --readmix 75 90

./fio_plot -i ../benchmark_script/NFS_TEST/afs-nfsv4/randrw90/4k -T "Test" -g -r randrw -t iops lat -d 1 8 16 -n 1

The directory is populated with .json and .log files:
root@nfsv4test01 (afs-nfsv4)# tree -L 3
├── randrw75

│   └── 4k
│   ├── randrw-1-16.json
│   ├── randrw-16-16.json
│   ├── randrw-8-16.json
│   ├── randrw-iodepth-16-numjobs-16_bw.10.log
│   ├── randrw-iodepth-16-numjobs-16_bw.11.log
│   ├── randrw-iodepth-16-numjobs-16_bw.12.log
│   ├── randrw-iodepth-16-numjobs-16_bw.13.log
│   ├── randrw-iodepth-16-numjobs-16_bw.14.log
│   ├── randrw-iodepth-16-numjobs-16_bw.15.log
│   ├── randrw-iodepth-16-numjobs-16_bw.16.log
│   ├── randrw-iodepth-16-numjobs-16_bw.1.log

But I consistently get this -
No log files found that matches the specified parameter randrw

I checked the json, randrw is in there.

Wondering what I might be missing.
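
One possible cause, going only by the listings above: the benchmark was run with --numjobs 16, so every log file is named ..._numjobs-16_..., while the plot command passes -n 1. A command that matches the files actually generated might look like this (untested suggestion):

    ./fio_plot -i ../benchmark_script/NFS_TEST/afs-nfsv4/randrw90/4k -T "Test" -g -r randrw -t iops lat -d 1 8 16 -n 16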

KeyError: 'job options'

Hi,

I just wanted to run the plotting script, but it seems not to work. Any ideas about what I did wrong?

D:\FIO\fio-plot-master>fio-plot.py -i ../Plots -s performance_hw0.json -H
Traceback (most recent call last):
  File "D:\FIO\fio-plot-master\fio-plot.py", line 320, in <module>
    main()
  File "D:\FIO\fio-plot-master\fio-plot.py", line 310, in main
    b.chart_latency_histogram('read')
  File "D:\FIO\fio-plot-master\fio-plot.py", line 262, in chart_latency_histogram
    stats = self.getStats(mode)
  File "D:\FIO\fio-plot-master\fio-plot.py", line 244, in getStats
    depth = record['jobs'][0]['job options']['iodepth'].lstrip("0")
KeyError: 'job options'

logs generated with 0 size with any mode where write is used (read is working OK)

[root@fio-test benchmark_script]# ./bench_fio --target /dev/vda1 --type device --mode randwrite --output CEPH --iodepth 1 8 16 --numjobs 8

██████████████████████████████████████████████████████████████████████████████
+++ Fio Benchmark Script +++

Estimated duration : 0:03:00
Test target : /dev/vda1
Job template : ./fio-job-template.fio
I/O Engine : libaio
Test mode (read/write) : randwrite
IOdepth to be tested : 1 8 16
NumJobs to be tested : 8
Block size : 4k
Direct I/O : 1
Read/write mix in %% read : 75
Time duration per test (s) : 60
Log interval of perf data : 500
Invalidate buffer cache : 1
Target type : device
Output folder : CEPH

██████████████████████████████████████████████████████████████████████████████
100% |█████████████████████████| [0:00:01, 0:00:00]-]
[root@fio-test benchmark_script]# ls -al CEPH/vda1/4k
total 28
drwxr-xr-x. 2 root root 4096 May 6 17:52 .
drwxr-xr-x. 3 root root 16 May 6 17:52 ..
-rw-r--r--. 1 root root 7292 May 6 18:31 randwrite-16-8.json
-rw-r--r--. 1 root root 7288 May 6 18:31 randwrite-1-8.json
-rw-r--r--. 1 root root 7288 May 6 18:31 randwrite-8-8.json
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.1.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.2.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.3.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.4.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.5.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.6.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.7.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_bw.8.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.1.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.2.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.3.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.4.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.5.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.6.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.7.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-16-numjobs-8_iops.8.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.1.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.2.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.3.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.4.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.5.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.6.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.7.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_bw.8.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.1.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.2.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.3.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.4.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.5.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.6.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.7.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-1-numjobs-8_iops.8.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.1.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.2.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.3.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.4.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.5.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.6.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.7.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_bw.8.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.1.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.2.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.3.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.4.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.5.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.6.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.7.log
-rw-r--r--. 1 root root 0 May 6 18:31 randwrite-iodepth-8-numjobs-8_iops.8.log
[root@fio-test benchmark_script]#
[root@fio-test benchmark_script]# python3 -m pip install matplotlib
Requirement already satisfied: matplotlib in /usr/local/lib64/python3.6/site-packages (3.2.1)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib64/python3.6/site-packages (from matplotlib) (1.2.0)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/site-packages (from matplotlib) (0.10.0)
Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/site-packages (from matplotlib) (2.8.1)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/site-packages (from matplotlib) (2.4.7)
Requirement already satisfied: numpy>=1.11 in /usr/local/lib64/python3.6/site-packages (from matplotlib) (1.18.4)
Requirement already satisfied: six in /usr/local/lib/python3.6/site-packages (from cycler>=0.10->matplotlib) (1.14.0)
[root@fio-test benchmark_script]# python3 -m pip install numpy
Requirement already satisfied: numpy in /usr/local/lib64/python3.6/site-packages (1.18.4)
[root@fio-test benchmark_script]#
[root@fio-test benchmark_script]# uname -a
Linux fio-test 3.10.0-1062.7.1.el7.x86_64 #1 SMP Mon Dec 2 17:33:29 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
[root@fio-test benchmark_script]#

bench_fio uses os.path.isfile to check for device targets

From https://github.com/louwrentius/fio-plot/blob/master/benchmark_script/bench_fio#L103 it seems that bench_fio uses Python's os.path.isfile() to test if a certain target device is valid.

However, at least on Python 3.8.2 and Ubuntu 20.04 LTS (focal) this function returns False for any non-regular file, like block devices.

A workaround to get the script going is to use os.path.exists but a more durable solution would be to use pathlib, for example:

pathlib.Path('/dev/sda').is_block_device()
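
A rough sketch of what such a check could look like (a hypothetical helper, not bench_fio's actual code):

    from pathlib import Path

    def target_is_valid(target, target_type):
        # Accept block devices as well as regular files for device targets,
        # and directories for directory targets.
        path = Path(target)
        if target_type == "device":
            return path.is_block_device() or path.is_file()
        return path.is_dir()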

specify output filename

It's me again :)

I've noticed that you "hardcode" the PNG output here

It would be most excellent if this was altered to be just the default behavior, allowing a command line argument to specify the exact name to use.

I'm trying to graph a bunch of tests in bulk and it's a bit cumbersome to map the resulting graphs back without inspecting each file. I.e., I'm doing something like this:

#!/bin/bash

function do_graph {
    ../fio-plot/fio_plot/fio_plot "$@"
}

for input in xfs/{single,raid0}/gp{2,3} zfs/{single,raid0}/{,compressed/}gp{2,3}
do
    echo "Processing: $input"
    do_graph \
        --title "Latency Performance: $input" \
        --subtitle "Random Reads" \
        --input-directory "$input" \
        --iodepth-numjobs-3d \
        -t lat -r randread \
        --filter read
        
    do_graph \
        --title "Latency Performance: $input" \
        --subtitle "Random Writes" \
        --input-directory "$input" \
        --iodepth-numjobs-3d \
        -t lat -r randwrite \
        --filter write
done

... and the result is a pile of PNGs named like:

  • Latency-Performance_2021-04-07_150613_vd.png
  • Latency-Performance_2021-04-07_150616_Rt.png
  • Latency-Performance_2021-04-07_150619_fu.png

Ideally I would want to be able to embed what I want into the filename instead of relying upon what you've got hardcoded.
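
A rough sketch of the kind of option that would cover this, purely as an illustration (the argument name and default are assumptions, not fio-plot's actual interface):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--output-filename",
        default=None,
        help="write the graph to this exact PNG path instead of the auto-generated name (hypothetical option)",
    )
    args = parser.parse_args()
    # When args.output_filename is None, fall back to the current timestamp-based name.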

benchmark multiple device simultaneously

How should we benchmark multiple devices simultaneously? If multiple devices are given to --target, they are benchmarked one by one, so what should be done to benchmark them simultaneously?

requires python3, tkinter; AND cannot run headless

The server I was attempting to benchmark didn't have these items, which appear to be required.
After installing the additional dependencies, it turns out tkinter requires a working X display.

:( I was so close!

My thoughts:
After the .json files are generated, send them to another machine with the additional libraries / dependencies for image generation.


Traceback (most recent call last):
  File "./fio-plot.py", line 320, in <module>
    main()
  File "./fio-plot.py", line 306, in main
    b.chart_iops_latency('read')
  File "./fio-plot.py", line 259, in chart_iops_latency
    c.plot_io_and_latency(mode)
  File "./fio-plot.py", line 26, in plot_io_and_latency
    nrows=2, gridspec_kw={'height_ratios': [7, 1]})
  File "/usr/lib64/python3.6/site-packages/matplotlib/pyplot.py", line 1202, in subplots
    fig = figure(**fig_kw)
  File "/usr/lib64/python3.6/site-packages/matplotlib/pyplot.py", line 535, in figure
    **kwargs)
  File "/usr/lib64/python3.6/site-packages/matplotlib/backends/backend_tkagg.py", line 81, in new_figure_manager
    return new_figure_manager_given_figure(num, figure)
  File "/usr/lib64/python3.6/site-packages/matplotlib/backends/backend_tkagg.py", line 89, in new_figure_manager_given_figure
    window = Tk.Tk()
  File "/usr/lib64/python3.6/tkinter/__init__.py", line 2017, in __init__
    self.tk = _tkinter.create(screenName, baseName, className, interactive, wantobjects, useTk, sync, use)
_tkinter.TclError: no display name and no $DISPLAY environment variable

Finally: Thank you for creating, publishing, documenting, and blogging about your work. It has been a resource of mine for the past few years.
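
A possible workaround for the headless case, assuming the plotting code can be pointed at a non-interactive backend: select matplotlib's Agg backend before pyplot is imported, either via the environment (MPLBACKEND=Agg ./fio-plot.py ...) or in code:

    import matplotlib
    matplotlib.use("Agg")  # non-interactive backend: no X display or tkinter required
    import matplotlib.pyplot as plt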

Not able to create comparative graphs

I am using the latest fio-plot repo

I have 2 Folder:

/home/PMEM0_RW_15M/pmem0/1k 
/home/PMEM0_RW_15M/pmem0/4k

I want to create a graph comparing these two folders
/root/anaconda3/bin/python /root/fio-plot/fio_plot/fio_plot -i /home/PMEM0_RW_15M/pmem0/1k /home/PMEM0_RW_15M/pmem0/4k -T "1K_4K_15_MIN_RANDOM_WRITE" -g -r randwrite -t iops lat -d 1 -n 1
1K_4K_15_MIN_RANDOM_WRITE-2020-08-08_114507

Observation: in the plot above we do not see the green and orange bars; they appear to be missing.
We only see the blue and red bars.

Issue: it does not provide the desired result.

I also tried with renaming the folders as /tmp/A and /tmp/B, but the issue remains the same.

/root/anaconda3/bin/python /root/fio-plot/fio_plot/fio_plot -i /tmp/A /tmp/B -T "1K_4K_15_MIN_RANDOM_WRITE" -g -r randwrite -t iops lat -d 1 -n 1

1K_4K_15_MIN_RANDOM_WRITE-2020-08-08_121319

If I plot each folder individually, then it plots as expected (for individual folders)

/root/anaconda3/bin/python /root/fio-plot/fio_plot/fio_plot -i /home/PMEM0_RW_15M/pmem0/4k -T "4K_15_MIN_RANDOM_WRITE" -g -r randwrite -t iops lat -d 1 -n 1
4K_15_MIN_RANDOM_WRITE-2020-08-08_114323

/root/anaconda3/bin/python /root/fio-plot/fio_plot/fio_plot -i /home/PMEM0_RW_15M/pmem0/1k -T "1K_15_MIN_RANDOM_WRITE" -g -r randwrite -t iops lat -d 1 -n 1
1K_15_MIN_RANDOM_WRITE-2020-08-08_114312

Can you please help me in understanding this issue?

Using devices

I tested this amazing tool and it works great.

Please consider putting a warning in your README file that if one chooses a device for any "write" operation, they may render the device useless, as it will destroy the contents of the raw device. That is not the case when defining a file as a target, which can reside on a mounted NVMe/SSD file system.

Consider adding choice of direct/buffered IO as parameter

Consider adding

    ag.add_argument(
        "--direct", help=f"Override the default IO mode for benchmark \
            (default: {settings['direct']})", type=int, default=settings['direct'])

at line 251 of benchmark_script/bench_fio.

For example ZFS does not support direct IO mode (see openzfs/zfs#224), so in order to run tests on ZFS one needs to be able to disable it.
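
With such a flag in place, disabling direct I/O for a ZFS dataset could then look like this (a hypothetical invocation, shown only to illustrate the proposed option; the target path is made up):

    ./bench_fio --target /tank/fiotest --type directory --size 1G --mode randread --output ZFS --direct 0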

can't set --duration

[root@latpoc17 benchmark_script]# ./bench_fio --target /mnt/peter --type directory --size=2G -e=posixaio --template fio-job-template.fio --iodepth 8 --numjobs 8 --mode read --duration 360 --output WEKA

Traceback (most recent call last):
  File "./bench_fio", line 461, in <module>
    main()
  File "./bench_fio", line 454, in main
    display_header(settings, tests)
  File "./bench_fio", line 362, in display_header
    duration = calculate_duration(settings, tests)
  File "./bench_fio", line 323, in calculate_duration
    duration = str(datetime.timedelta(seconds=duration_in_seconds))

Enhancement for comparing different filesystems?

fio-plot can compare bw, iops and lat at different iodepth, numjobs, read/write mode, etc.

I think maybe we can use it to compare different filesystems. This is my plan:

  1. add a new option named --filesystem
  2. if the --type=device and --filesystem=ext4 btrfs xfs, for every filesystem
    1. unmount the device
    2. mkfs
    3. mount device
    4. run fio
  3. add a new table to show the data.

If the idea is acceptable, I will work on a pull request for it.
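
A rough sketch of the proposed flow, assuming it shells out to the usual mount/mkfs tools (no error handling, and a real implementation would need a strong guard against wiping the wrong device):

    import subprocess

    def benchmark_filesystems(device, filesystems, mountpoint):
        for fs in filesystems:
            subprocess.run(["umount", mountpoint], check=False)   # ignore if not mounted
            subprocess.run(["mkfs." + fs, device], check=True)    # some mkfs variants need a force flag
            subprocess.run(["mount", device, mountpoint], check=True)
            run_fio(mountpoint, label=fs)                         # hypothetical: reuse the existing benchmark loop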

Bad graph layout when comparing multiple queue depth and jobs

fio_plot -i "raaf/think/4k" --title "raaf/think" --subtitle "4k block (random read iops)" --dpi 300 -l -t iops -r randread -d 1 2 4 8 16 32 -n 1 2 4 8 16 32

Results in:
raaf-think_2020-10-27_183017

Ergo, the graph is strangely aligned to the right, and other parts of the tables overlap.

junk files in git

Hello,

While trying out your project, I noticed you have some stuff in your repo that should ideally be put into .gitignore and never added.

Notable examples are the OSX .DS_Store files and __pycache__ directories.

Consider removing these files (they're small enough you don't have to go back rewriting history to wipe them!) and utilizing a good .gitignore. This might be a good one to base from!
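
For example, a .gitignore covering just the files mentioned above could contain:

    .DS_Store
    __pycache__/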

lib error

When I try to run some tests on my Mac, I get this error with either matplotlib version 2.2.4 or 3.0.3:

Traceback (most recent call last):
  File "fio-plot.py", line 16, in <module>
    from mpl_toolkits.mplot3d import axes3d
ImportError: No module named mpl_toolkits.mplot3d

Reference to #29 - Graph Time line

I ran the benchmark for 15 minutes, but when I plot, the x-axis (time in seconds) only covers 14 seconds.
Is there any flag to enable plotting for the full duration the benchmark was run?

1K_4K_15_MIN_RANDOM_WRITE-2020-08-08_114507

Support of rbd ioengine fio test

Hi,

We are using your tool and it is very helpful for our tests, but we wanted to know if we could have support for the rbd ioengine.

In the fio template we can set up the following:

[iotest]
ioengine=rbd
pool=pool_ssd
rbdname=test
rw=${MODE}
blocksize=${BLOCK_SIZE}
iodepth=${IODEPTH}
numjobs=${NUMJOBS}
direct=${DIRECT}
group_reporting=1
invalidate=${INVALIDATE}
loops=${LOOPS}
write_bw_log=${OUTPUT}/${MODE}-iodepth-${IODEPTH}-numjobs-${NUMJOBS}
write_lat_log=${OUTPUT}/${MODE}-iodepth-${IODEPTH}-numjobs-${NUMJOBS}
write_iops_log=${OUTPUT}/${MODE}-iodepth-${IODEPTH}-numjobs-${NUMJOBS}
log_avg_msec=${LOGINTERVAL}

When running the bench_fio command, --target and --type are mandatory, but with the rbd settings for fio they are not actually needed.

Could this be implemented?

Maybe I have a misunderstanding of the --device/--type options in the case of the rbd ioengine.

Could you please help me with this?

Thank You !

Best Regards, Edouard Fazenda.

RFE: correlate commandline arguments with output graphs

fio-plot is a fantastic tool, thanks for making it available.

As I run a variety of commandlines, output graphs accumulate, and it's easy to lose track of which graph corresponds to which commandline. This is important for repeatability, as well as to ensure appropriate comparison among block sizes, drive models, etc.

Suggest a commandline flag to aid in this, one option might be to embed the commandline arguments into the output filename. Another option would be to render the args as text at the bottom of the graph; a third would be to embed the args as a iTXT or tEXt string within the PNG file.

Any of these strategies would help unambiguously correlate generation args with the output.
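
As an illustration of the third option: recent matplotlib releases accept a metadata dictionary when saving PNGs, which ends up as tEXt/iTXt entries in the file. A minimal sketch (not fio-plot's current behaviour):

    import sys
    import matplotlib
    matplotlib.use("Agg")
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    # ... plot the benchmark data ...
    fig.savefig("graph.png", metadata={"Description": " ".join(sys.argv)})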

filterLogFiles() searchstring bug: read and RANDread data merged

There is a logic bug at L66 in the filterLogFiles() function, and in the searchstring construction at L50.
A start-of-string delimiter is not used in if searchstring['searchString'] in item: . As a result:

  • randread data is additively merged to read data.
  • randwrite data is additively merged to write data.

This corrupts the resulting graph by summing data vertically in a nondeterministic way.
It can be visualized by adding pprint.pprint(logfiles) at L116 in fio_plot, after logfiles is declared.
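
A sketch of one possible fix, anchoring the match on the file-name prefix that bench_fio generates (mode-iodepth-N-numjobs-M_*) instead of doing a bare substring test; this is an illustration, not the actual patch:

    import os

    def filter_log_files(logfiles, rw):
        # Only keep files whose basename starts with the exact rw mode followed
        # by "-iodepth-", so "read" no longer also matches "randread-iodepth-...".
        prefix = rw + "-iodepth-"
        return [f for f in logfiles if os.path.basename(f).startswith(prefix)]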
