
puppet-hadoop's Introduction

Hadoop

This module was created to assist with the installation and configuration of Hadoop. Simply edit the params.pp file and MapReduce yourself away!

Configuration

  • A tar.gz file needs to be placed into ~/modules/hadoop/files. You can download Hadoop from here: http://hadoop.apache.org/common/releases.html
  • Once downloaded, update params.pp with the version you downloaded.
  • params.pp also requires the java module I have already published; alternatively, update the $java_home variable appropriately (see the sketch after this list).
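
For reference, a hypothetical excerpt of what the edited part of manifests/params.pp might look like. The selector style follows the snippets quoted in the issues below, but the exact contents of the file may differ:

    $version = $::hostname ? {
      default => "1.1.1",                  # must match the tarball placed in files/
    }
    $java_home = $::hostname ? {
      default => "/usr/java/jdk1.7.0_09",  # set this if not using the java module
    }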

SSH Keys

The SSH keys for the hduser are in ~/files/ssh/. Make sure you edit these files and put in your own public and private keys. If you are using this module for multiple Hadoop servers, the id_rsa.pub and id_rsa keys will be the same for each hduser. Also, the authorized_keys file is defined in Puppet as the id_rsa.pub file. If you wish to add support for other users, you need to change init.pp so that authorized_keys points to a different file in ~/files/ssh.
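
A minimal sketch of how that wiring might look in init.pp, assuming a file resource that serves authorized_keys from the module's files/ssh/id_rsa.pub; the actual resource names and paths in the module may differ:

    file { "/home/hduser/.ssh/authorized_keys":
      source => "puppet:///modules/hadoop/ssh/id_rsa.pub",  # authorized_keys is the id_rsa.pub file
      owner  => "hduser",
      mode   => "0600",
    }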

Cluster Mode

Currently the configuration is set up for a cluster with at least 3 nodes. Each node needs to be named in params.pp: the first node should be defined as $master and the other two nodes should be defined as $slaves.

If adding more than 3 nodes, raise the $replication value to the total number of nodes in your cluster, and add each new node to the $slaves variable (see the example below).
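
For example, a hypothetical three-node layout in params.pp; the hostnames are placeholders and the selector style follows the snippets quoted in the issues below:

    $master = $::hostname ? {
      default => "node1.example.com",
    }
    $slaves = $::hostname ? {
      default => ["node1.example.com", "node2.example.com", "node3.example.com"],
    }
    $replication = $::hostname ? {
      default => "3",   # raise to the total node count when adding nodes
    }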

Author

puppet-hadoop's People

Contributors: bcarpio, blazindrop, fsteeg


puppet-hadoop's Issues

Error: 'tar -zxf hadoop-1.1.1.tar.gz' is not qualified and no path was specified. Please qualify the command or specify a path.

Also getting this, which, from searching Google, seems to be because I am using "puppet apply":

Debug: Exec[untar jdk1.7.0_10.tar.gz]: Adding default for path
...
Error: 'tar -zxf hadoop-1.1.1.tar.gz' is not qualified and no path was specified. Please qualify the command or specify a path.

jblaine@ip-10-191-115-140:~/test/modules/hadoop$ ls files/hadoop-1.1.1.tar.gz
files/hadoop-1.1.1.tar.gz
jblaine@ip-10-191-115-140:~/test/modules/hadoop$ grep 1.1.1 manifests/*
manifests/params.pp:            default                 => "1.1.1",
jblaine@ip-10-191-115-140:~/test/modules/hadoop$

http://serverfault.com/questions/345201/how-to-set-path-when-applying-single-puppet-module
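
The fix shown in the linked question (and applied in the next issue below) is to give the exec an explicit search path so 'tar' can be resolved under 'puppet apply'; a minimal sketch, where the cwd value is an assumed placeholder:

    exec { "untar hadoop-1.1.1.tar.gz":
      path    => "/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin",
      command => "tar -zxf hadoop-1.1.1.tar.gz",
      cwd     => "/opt/hadoop",  # hypothetical; use the module's hadoop_base directory
    }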

Some changes to make it work

Hi,

Great work! I just made the following changes to get it working.

params.pp

Commented out

    include java::params

and just set

    $java_home = $::hostname ? {
      default => "/usr/java/jdk1.7.0_09",
    }

Also, for hostnames with a domain name, I had to use quotes ("")

    $master = $::hostname ? {
      default => "devagent1.alfa.local",
    }
    $slaves = $::hostname ? {
      default => ["devagent1.alfa.local", "devagent2.alfa.local", "devagent3.alfa.local"],
    }

init.pp

Commented out

    include hadoop::master
    include hadoop::slave

For the exec I set the exec path

    $execPath = '/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin'

    exec { "untar hadoop-${hadoop::params::version}.tar.gz":
      path    => $execPath,
      command => "tar -zxf hadoop-${hadoop::params::version}.tar.gz",
    }

and removed the File["java-app-dir"] require in

    exec { "${hadoop::params::hadoop_base}/hadoop-${hadoop::params::version}/bin/hadoop namenode -format":

Result

Then it works perfectly.

Thanks, Edwin

old version default

The default version in params.pp seems a little old (0.23....).

Has this been updated/tested in 1.X / 2.X?

Error: Could not find class hadoop::master ...

When I run this with Puppet 2.7 I get:

Error: Could not find class hadoop::master ...

This is, I believe, from the "include hadoop::master" in manifests/init.pp, and to me the error seems legit. Did you mean "include hadoop::cluster::master" and "include hadoop::cluster::slave"? It seems so from here -- changing those stops the error.
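
In other words, a sketch of the suggested change to manifests/init.pp:

    # replace the unqualified includes with the namespaced cluster classes
    include hadoop::cluster::master
    include hadoop::cluster::slave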
