kdtree's Introduction

kdtree

A k-d tree implementation in Go with:

  • n-dimensional points
  • k-nearest neighbor search
  • range search
  • remove without rebuilding the whole subtree
  • data attached to the points
  • use your own structs by implementing a simple two-function interface

Usage

go get github.com/kyroy/kdtree
import "github.com/kyroy/kdtree"

Implement the kdtree.Point interface

// Point specifies one element of the k-d tree.
type Point interface {
	// Dimensions returns the total number of dimensions
	Dimensions() int
	// Dimension returns the value of the i-th dimension
	Dimension(i int) float64
}
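
Any type that implements these two methods can be stored in the tree (this is the "simple two-function interface" from the feature list above). A minimal sketch, using an illustrative City type that is not part of the library:

// City is a 2-dimensional point with an attached name.
type City struct {
	Name     string
	Lat, Lon float64
}

// Dimensions reports that a City has two coordinates.
func (c *City) Dimensions() int {
	return 2
}

// Dimension returns latitude for i == 0 and longitude otherwise.
func (c *City) Dimension(i int) float64 {
	if i == 0 {
		return c.Lat
	}
	return c.Lon
}

A *City can then be passed anywhere a kdtree.Point is expected, e.g. tree.Insert(&City{Name: "Berlin", Lat: 52.52, Lon: 13.41}).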

2-dimensional Points (points.Point2D)

package main

import (
	"fmt"

	"github.com/kyroy/kdtree"
	"github.com/kyroy/kdtree/kdrange"
	"github.com/kyroy/kdtree/points"
)

func main() {
	tree := kdtree.New([]kdtree.Point{
		&points.Point2D{X: 3, Y: 1},
		&points.Point2D{X: 5, Y: 0},
		&points.Point2D{X: 8, Y: 3},
	})

	// Insert
	tree.Insert(&points.Point2D{X: 1, Y: 8})
	tree.Insert(&points.Point2D{X: 7, Y: 5})

	// KNN (k-nearest neighbor): the 2 points closest to (1, 1)
	fmt.Println(tree.KNN(&points.Point2D{X: 1, Y: 1}, 2))
	// [{3.00 1.00} {5.00 0.00}]
	
	// RangeSearch: all points with x in [1, 8] and y in [0, 2]
	// (kdrange.New takes a min/max pair per dimension)
	fmt.Println(tree.RangeSearch(kdrange.New(1, 8, 0, 2)))
	// [{5.00 0.00} {3.00 1.00}]
    
	// Points
	fmt.Println(tree.Points())
	// [{3.00 1.00} {1.00 8.00} {5.00 0.00} {8.00 3.00} {7.00 5.00}]

	// Remove
	fmt.Println(tree.Remove(&points.Point2D{X: 5, Y: 0}))
	// {5.00 0.00}

	// String
	fmt.Println(tree)
	// [[{1.00 8.00} {3.00 1.00} [<nil> {8.00 3.00} {7.00 5.00}]]]

	// Balance
	tree.Balance()
	fmt.Println(tree)
	// [[[{3.00 1.00} {1.00 8.00} <nil>] {7.00 5.00} {8.00 3.00}]]
}

n-dimensional Points (points.Point)

package main

import (
	"fmt"

	"github.com/kyroy/kdtree"
	"github.com/kyroy/kdtree/kdrange"
	"github.com/kyroy/kdtree/points"
)

type Data struct {
	value string
}

func main() {
    tree := kdtree.New([]kdtree.Point{
        points.NewPoint([]float64{7, 2, 3}, Data{value: "first"}),
        points.NewPoint([]float64{3, 7, 10}, Data{value: "second"}),
        points.NewPoint([]float64{4, 6, 1}, Data{value: "third"}),
    })
    
    // Insert
    tree.Insert(points.NewPoint([]float64{12, 4, 6}, Data{value: "fourth"}))
    tree.Insert(points.NewPoint([]float64{8, 1, 0}, Data{value: "fifth"}))
    
    // KNN (k-nearest neighbor): the 2 points closest to (1, 1, 1)
    fmt.Println(tree.KNN(&points.Point{Coordinates: []float64{1, 1, 1}}, 2))
    // [{[4 6 1] {third}} {[7 2 3] {first}}]

    // RangeSearch: all points with x in [1, 15], y in [1, 5], z in [0, 5]
    fmt.Println(tree.RangeSearch(kdrange.New(1, 15, 1, 5, 0, 5)))
    // [{[7 2 3] {first}} {[8 1 0] {fifth}}]
    
    // Points
    fmt.Println(tree.Points())
    // [{[3 7 10] {second}} {[4 6 1] {third}} {[8 1 0] {fifth}} {[7 2 3] {first}} {[12 4 6] {fourth}}]

    // Remove
    fmt.Println(tree.Remove(points.NewPoint([]float64{3, 7, 10}, nil)))
    // {[3 7 10] {second}}

    // String
    fmt.Println(tree)
    // [[<nil> {[4 6 1] {third}} [{[8 1 0] {fifth}} {[7 2 3] {first}} {[12 4 6] {fourth}}]]]

    // Balance
    tree.Balance()
    fmt.Println(tree)
    // [[[{[7 2 3] {first}} {[4 6 1] {third}} <nil>] {[8 1 0] {fifth}} {[12 4 6] {fourth}}]]
}

kdtree's People

Contributors

kyroy, usedbytes


kdtree's Issues

Refactoring suggestion

Hey, I submitted a PR with code that should make it easier to use this module in other projects. Can you give it a thought?

FindLargest, FindSmallest are O(n)

FindLargest and FindSmallest are used when removing a node from the k-d tree. These functions do not take the splitting axis of the currently expanded node into account, so they visit every node of the subtree.

kdtree/kdtree.go

Lines 357 to 368 in 70830f8

func (n *node) FindLargest(axis int, largest *node) *node {
	if largest == nil || n.Dimension(axis) > largest.Dimension(axis) {
		largest = n
	}
	if n.Left != nil {
		largest = n.Left.FindLargest(axis, largest)
	}
	if n.Right != nil {
		largest = n.Right.FindLargest(axis, largest)
	}
	return largest
}

Given that node.Remove calls this recursively, the actual runtime complexity appears to be O(n log n), whereas a general BST remove should take O(n).

As per Wikipedia, if we are okay with O(n log n) complexity, we can alternatively just rebuild the subtree below the removed node.

Alternatively, we can tombstone the node instead, which speeds up Remove at the cost of a slightly larger lookup time.
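
A third option is to make FindLargest itself axis-aware. The sketch below is a standalone illustration, not the library's code; it assumes the conventional layout where a node at depth d splits on axis d % k. When the current level splits on the query axis, a larger value can only sit in the right subtree, so the left subtree is pruned; on a balanced tree this visits roughly O(n^(1-1/k)) nodes (O(sqrt(n)) in 2-d) instead of the whole subtree.

// node is a minimal stand-in for the library's internal node type.
type node struct {
	point       []float64
	left, right *node
}

// findLargest returns the node with the largest coordinate along axis,
// skipping the left subtree whenever the current level splits on that axis.
func (n *node) findLargest(axis, depth, k int, largest *node) *node {
	if n == nil {
		return largest
	}
	if largest == nil || n.point[axis] > largest.point[axis] {
		largest = n
	}
	if depth%k == axis {
		// This level splits on the query axis: larger values are only on the right.
		return n.right.findLargest(axis, depth+1, k, largest)
	}
	// This level splits on another axis: both subtrees must be searched.
	largest = n.left.findLargest(axis, depth+1, k, largest)
	return n.right.findLargest(axis, depth+1, k, largest)
}

FindSmallest would be symmetric, pruning the right subtree instead.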

KNN returns incorrect values unless tree is Balance()d

This might be a case of "using it wrong", but KNN does not always return the nearest neighbour(s) unless the tree is balanced before the query.

My understanding is that a KNN search on an unbalanced tree might be slow or suboptimal, but it should still return correct results. If that isn't the case, then simply noting it in the documentation for KNN is probably enough.

This issue is demonstrated by the attached test program, which iteratively mutates a random tree and makes two KNN queries against it without modifying the tree in between. The first query has 'k' set to a low number (few), the second has 'k' set to a higher number (many).

I would expect the first 'k' entries of the many query to match the entries of the few query, but the few query often omits the closest neighbours. In the output below, t.KNN(p, 1) claims that a point at a distance of 14.32 is the closest, whereas t.KNN(p, 10) gives the correct result of 11.66. After balancing, the results are correct. The value of few isn't critical: in my tests, any value of k < (number of entries - 1) shows the problem, though less frequently than smaller values of 'k' do.

Getting 1 NNs out of 10 entries
Failure on iteration 618
KNN((186.00, 218.00), 1)
   NN 0: distance((186.00, 218.00)->(172.00, 215.00)) = 14.32
KNN((186.00, 218.00), 10)
   NN 0: distance((186.00, 218.00)->(176.00, 212.00)) = 11.66
   NN 1: distance((186.00, 218.00)->(172.00, 215.00)) = 14.32
   NN 2: distance((186.00, 218.00)->(206.00, 196.00)) = 29.73
   NN 3: distance((186.00, 218.00)->(214.00, 188.00)) = 41.04
   NN 4: distance((186.00, 218.00)->(218.00, 253.00)) = 47.42
   NN 5: distance((186.00, 218.00)->(269.00, 176.00)) = 93.02
   NN 6: distance((186.00, 218.00)->(276.00, 181.00)) = 97.31
   NN 7: distance((186.00, 218.00)->(279.00, 264.00)) = 103.75
   NN 8: distance((186.00, 218.00)->(287.00, 244.00)) = 104.29
   NN 9: distance((186.00, 218.00)->(299.00, 205.00)) = 113.75
After balance:
KNN((186.00, 218.00), 1)
   NN 0: distance((186.00, 218.00)->(176.00, 212.00)) = 11.66
KNN((186.00, 218.00), 10)
   NN 0: distance((186.00, 218.00)->(176.00, 212.00)) = 11.66
   NN 1: distance((186.00, 218.00)->(172.00, 215.00)) = 14.32
   NN 2: distance((186.00, 218.00)->(206.00, 196.00)) = 29.73
   NN 3: distance((186.00, 218.00)->(214.00, 188.00)) = 41.04
   NN 4: distance((186.00, 218.00)->(218.00, 253.00)) = 47.42
   NN 5: distance((186.00, 218.00)->(269.00, 176.00)) = 93.02
   NN 6: distance((186.00, 218.00)->(276.00, 181.00)) = 97.31
   NN 7: distance((186.00, 218.00)->(279.00, 264.00)) = 103.75
   NN 8: distance((186.00, 218.00)->(287.00, 244.00)) = 104.29
   NN 9: distance((186.00, 218.00)->(299.00, 205.00)) = 113.75

Code follows:

package main

import (
	"fmt"
	"math"
	"math/rand"
	"time"

	"github.com/kyroy/kdtree"
	"github.com/kyroy/kdtree/points"
)

func distance(a, b *points.Point2D) float64 {
	return math.Sqrt(math.Pow(a.X - b.X, 2) + math.Pow(a.Y - b.Y, 2))
}

func dumpNNs(p kdtree.Point, nns []kdtree.Point) {
	p2d1 := p.(*points.Point2D)
	fmt.Printf("KNN((%3.2f, %3.2f), %d)\n", p2d1.X, p2d1.Y, len(nns))

	for i, n := range nns {
		p2d2 := n.(*points.Point2D)
		fmt.Printf("   NN %d: distance((%3.2f, %3.2f)->(%3.2f, %3.2f)) = %3.2f\n",
			i, p2d1.X, p2d1.Y, p2d2.X, p2d2.Y, distance(p2d1, p2d2))
	}
}

func main() {
	w, h := 300, 300
	maxSize := 10
	maxTime := time.Minute

	t := kdtree.New([]kdtree.Point{})
	arr := make([]kdtree.Point, 0, maxSize)

	many := maxSize
	few := maxSize - 1

	for ; few > 0; few-- {
		fmt.Printf("Getting %d NNs out of %d entries\n", few, maxSize)
		start := time.Now()
		for i := 0; ; i++ {
			p := &points.Point2D{
				X: float64(rand.Intn(w / 2) + w / 2),
				Y: float64(rand.Intn(h / 2) + h / 2),
			}

			// Two KNN queries
			fewNN := t.KNN(p, few)
			manyNN := t.KNN(p, many)

			// Check if the nearest is the same
			if len(fewNN) > 0 {
				if distance(p, fewNN[0].(*points.Point2D)) >
					distance(p, manyNN[0].(*points.Point2D)) {
					fmt.Println("Failure on iteration", i)
					dumpNNs(p, fewNN)
					dumpNNs(p, manyNN)

					// Balance the tree and try again
					t.Balance()
					fmt.Println("After balance:")
					fewNN = t.KNN(p, few)
					manyNN = t.KNN(p, many)
					dumpNNs(p, fewNN)
					dumpNNs(p, manyNN)

					break
				}
			}

			// Add in the new point
			arr = append(arr, p)
			//fmt.Printf("Insert (%3.2f, %3.2f)\n", p.X, p.Y)
			t.Insert(p)

			// Limit the max number of elements - which will also
			// introduce some churn in the tree
			if len(arr) > maxSize {
				idx := rand.Intn(len(arr))
				//fmt.Printf("Remove (%3.2f, %3.2f)\n", p.X, p.Y)
				t.Remove(arr[idx])
				p = arr[idx].(*points.Point2D)
				arr[idx] = arr[len(arr)-1]
				arr = arr[:len(arr)-1]
			}

			if since := time.Since(start); since > maxTime {
				fmt.Printf("No failure after %d iterations (%v)\n", i, since)
				break
			}
		}
		fmt.Println("================================================")
		t.Balance()
	}
}

Can't reach the Data of points returned by a KNN search

tree := new(kdtree.KDTree)
for i := 0; i < length; i++ {
	R, G, B, _ := img.At(0, i).RGBA()
	tree.Insert(points.NewPoint([]float64 {
		float64(R),
		float64(G),
		float64(B)}, Data{Value: i}))
}

I built my KDTree with the code above.
When I perform a KNN search, the returned points don't expose the attached Data.
Using fmt.Println() shows the related data in the console, but point.Data is impossible to use.
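
For what it's worth, the attached data is reachable once the result is type-asserted back to the concrete point type. A minimal sketch, under the assumption that points.NewPoint returns a *points.Point whose payload is exposed via its Data field (an interface{}):

package main

import (
	"fmt"

	"github.com/kyroy/kdtree"
	"github.com/kyroy/kdtree/points"
)

// Data is the caller-defined payload, as in the snippet above.
type Data struct {
	Value int
}

func main() {
	tree := kdtree.New([]kdtree.Point{
		points.NewPoint([]float64{255, 0, 0}, Data{Value: 0}),
		points.NewPoint([]float64{0, 255, 0}, Data{Value: 1}),
	})

	// KNN returns []kdtree.Point; assert back to *points.Point to reach Data,
	// then assert Data (an interface{}) back to the concrete payload type.
	for _, n := range tree.KNN(points.NewPoint([]float64{250, 10, 10}, nil), 1) {
		if p, ok := n.(*points.Point); ok {
			fmt.Println(p.Data.(Data).Value) // expected output: 0
		}
	}
}

The same pattern applies to the results of RangeSearch and Points().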
