
arc-furnace's Issues

Add logging for failed expectations when processing rows

In many cases ArcFurnace nodes expect incoming rows to contain certain fields; this is especially true for Hash and Equijoin nodes. These nodes should be resilient to missing data and log when expectations are not met, instead of failing with a stack trace (as they often do today).
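A minimal sketch of the kind of guard such a node could use. This is not the library's actual API: `RowExpectations`, `required_keys`, and the logger wiring are all hypothetical, shown only to illustrate logging-and-skipping instead of raising.

```ruby
require 'logger'

# Hypothetical guard: rather than raising when a row lacks an expected
# field, log the missing keys and tell the caller to skip the row.
class RowExpectations
  def initialize(required_keys, logger: Logger.new($stderr))
    @required_keys = required_keys
    @logger = logger
  end

  # Returns true when all expected fields are present; otherwise logs
  # the missing keys and returns false so the node can drop the row.
  def satisfied?(row)
    missing = @required_keys.reject { |key| row.key?(key) }
    return true if missing.empty?

    @logger.warn("Row missing expected fields #{missing.inspect}: #{row.inspect}")
    false
  end
end
```

A Hash or Equijoin node could call `satisfied?` before indexing or joining a row, so a malformed row costs one warning line rather than a crashed pipeline run.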

Much larger output file sizes than input file sizes

One of our projects has a ProductPipeline that is a relatively simple implementation of the library. Here's a snippet:

require_relative 'constants'
require 'arc-furnace/pipeline'
require 'arc-furnace/excel_source'
require 'arc-furnace/all_fields_csv_sink'

class ProductsPipeline < ArcFurnace::Pipeline

  include Constants

  # create products source

  source :products_source,
         type: ArcFurnace::ExcelSource,
         params: {
             filename: :product_filename,
             encoding: 'ISO-8859-1'
         }

  transform :products_transform, params: { source: :products_source } do |hash|
    result = hash.deep_dup
    result[SALSIFY_ID] = result.delete(BLAH_ID)
    result
  end

  filter :filtered_products, params: { source: :products_transform, observed_products: :observed_products } do |row, params|
    params.fetch(:observed_products).add(row[BLAH_ID])
  end

  sink type: ArcFurnace::AllFieldsCSVSink,
       source: :filtered_products,
       params: { filename: "#{Dir.pwd}/products_import.csv" }

end

The source file is a 14 MB XLSX, but the output is a 71 MB CSV, roughly five times larger, which is a bigger gap than I would expect for the same data in the two formats. I tried removing the filter and the output size was unchanged, and spot-checking showed the files contain identical data. I suspect something is going wrong in AllFieldsCSVSink.
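Part of the gap may just be the container format: an .xlsx file is a zip archive of XML, while the sink writes uncompressed plain text, so raw file sizes are not directly comparable. A standalone sanity check (not using the library; the sample data here is made up) is to compress the CSV output and compare that size to the XLSX instead:

```ruby
require 'zlib'

# Repetitive product data, standing in for the pipeline's CSV output.
csv_data = "sku,name,description\n" + ("12345,Widget,A very fine widget\n" * 1_000)

# Deflate is the same compression family zip (and hence .xlsx) uses, so
# the compressed size is a fairer proxy for information content.
compressed = Zlib::Deflate.deflate(csv_data)
ratio = compressed.bytesize.to_f / csv_data.bytesize

puts format('%d bytes raw, %d compressed (ratio %.2f)',
            csv_data.bytesize, compressed.bytesize, ratio)
```

If the gzipped CSV comes out near the XLSX size, the sink is probably fine and the difference is compression; if the CSV stays much larger even compressed, that would point at the sink emitting extra columns or duplicated data.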

Cc @dspangen
