
江哥's Projects

pointsift

A module for 3D semantic segmentation in point clouds.

project1

Context: This dataset contains tree observations from four areas of the Roosevelt National Forest in Colorado. All observations are cartographic variables (no remote sensing) from 30 meter x 30 meter sections of forest. There are over half a million measurements in total!

Content: The dataset includes information on tree type, shadow coverage, distance to nearby landmarks (roads etc.), soil type, and local topography.

Inspiration: Can you build a model that predicts what types of trees grow in an area based on the surrounding characteristics? A past Kaggle competition on this topic can be found here. What kinds of trees are most common in the Roosevelt National Forest? Which tree types can grow in more diverse environments? Are there certain tree types that are sensitive to an environmental factor, such as elevation or soil type?

Data Set Information: The task is predicting forest cover type from cartographic variables only (no remotely sensed data). The actual forest cover type for a given observation (30 x 30 meter cell) was determined from US Forest Service (USFS) Region 2 Resource Information System (RIS) data. Independent variables were derived from data originally obtained from US Geological Survey (USGS) and USFS sources. Data is in raw form (not scaled) and contains binary (0 or 1) columns for the qualitative independent variables (wilderness areas and soil types).

The study area includes four wilderness areas located in the Roosevelt National Forest of northern Colorado. These areas represent forests with minimal human-caused disturbance, so that existing forest cover types are more a result of ecological processes than of forest management practices. Some background on these four wilderness areas: Neota (area 2) probably has the highest mean elevation of the four, Rawah (area 1) and Comanche Peak (area 3) have a lower mean elevation, and Cache la Poudre (area 4) has the lowest. As for primary tree species, Neota would have spruce/fir (type 1), while Rawah and Comanche Peak would probably have lodgepole pine (type 2) as their primary species, followed by spruce/fir and aspen (type 5). Cache la Poudre would tend to have ponderosa pine (type 3), Douglas-fir (type 6), and cottonwood/willow (type 4). The Rawah and Comanche Peak areas tend to be more typical of the overall dataset than either Neota or Cache la Poudre, due to their assortment of tree species and range of predictive variable values (elevation, etc.). Cache la Poudre is probably the most distinctive, due to its relatively low elevation range and species composition.

Data Dictionary:
Elevation = Elevation in meters.
Aspect = Aspect in degrees azimuth.
Slope = Slope in degrees.
Horizontal_Distance_To_Hydrology = Horizontal distance to nearest surface water features.
Vertical_Distance_To_Hydrology = Vertical distance to nearest surface water features.
Horizontal_Distance_To_Roadways = Horizontal distance to nearest roadway.
Hillshade_9am = Hillshade index at 9am, summer solstice. Value out of 255.
Hillshade_Noon = Hillshade index at noon, summer solstice. Value out of 255.
Hillshade_3pm = Hillshade index at 3pm, summer solstice. Value out of 255.
Horizontal_Distance_To_Fire_Points = Horizontal distance to nearest wildfire ignition points.
Wilderness_Area1 = Rawah Wilderness Area.
Wilderness_Area2 = Neota Wilderness Area.
Wilderness_Area3 = Comanche Peak Wilderness Area.
Wilderness_Area4 = Cache la Poudre Wilderness Area.
Soil_Type1 to Soil_Type40 = Soil type designations (40 binary columns in total).
Cover_Type = Forest cover type designation. Integer value between 1 and 7, with the following key: 1 = Spruce/Fir, 2 = Lodgepole Pine, 3 = Ponderosa Pine, 4 = Cottonwood/Willow, 5 = Aspen, 6 = Douglas-fir, 7 = Krummholz.
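
As a minimal sketch of the modeling question posed above, the snippet below loads the table and fits a baseline classifier. It assumes the dataset has been exported to a local covtype.csv with the columns listed in the data dictionary; the file name, train/test split, and choice of random forest are illustrative, not part of this repository.

    # Baseline for the forest cover type prediction task (illustrative only).
    # Assumes a local "covtype.csv" with the columns from the data dictionary above.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    df = pd.read_csv("covtype.csv")      # hypothetical local copy of the dataset
    X = df.drop(columns=["Cover_Type"])  # cartographic features, incl. binary soil/wilderness columns
    y = df["Cover_Type"]                 # integer labels 1..7

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0
    )

    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

Because the wilderness-area and soil-type columns are already binary and the remaining features are numeric, no extra encoding or scaling is strictly required for a tree-based baseline like this one.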

pseudo_lidar_v2

(ICLR) Pseudo-LiDAR++: Accurate Depth for 3D Object Detection in Autonomous Driving

pycrown

PyCrown - Fast raster-based individual tree segmentation for LiDAR data

rangenet_lib

Inference module for RangeNet++ (milioto2019iros, chen2019iros)

semantic3dnet

Point cloud semantic segmentation via Deep 3D Convolutional Neural Network

slam

These are some SLAM algorithms I've been studying. I don't just push the code; I also share my notes. Enjoy SLAM!

source

Source code of the TLSeparation Python package.

treeseg

Extract individual trees from lidar point clouds
