ghostjat / np
A Lite & Memory Efficient PHP Library for Scientific Computing
Home Page: https://ghostjat.github.io/Np/
License: MIT License
In MATLAB/Octave, I can define an m × n matrix (3 × 5 in this case) and multiply it by an n × 1 column vector (5 × 1 in this case), yielding a 3 × 1 column vector:
X = [1 1 1 1 1; 2 2 2 2 2; 3 3 3 3 3]
X =
1 1 1 1 1
2 2 2 2 2
3 3 3 3 3
octave:427> w = [1;2;3;4;5]
w =
1
2
3
4
5
octave:428> X * w
ans =
15
30
45
However, Np cannot multiply a 3 × 5 matrix by a vector of size 5:
require __DIR__ . '/np/vendor/autoload.php';

use Np\matrix;
use Np\vector;

$x = matrix::ar([
    [1, 1, 1, 1, 1],
    [2, 2, 2, 2, 2],
    [3, 3, 3, 3, 3],
]);
$w = vector::ar([1, 2, 3, 4, 5]);

// Both of these calls throw: Mismatch Dimensions of given Objects!
// Obj-A col & Obj-B row amount need to be the same!
$p = $x->dot($w);
$p = $x->multiply($w);
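For reference, here is the result the dot() call above should produce, sketched in plain PHP with no Np dependency (matVec is just an illustrative helper, not a library function):

```php
<?php
// Plain-PHP matrix-vector product, to show the expected result of X * w.
function matVec(array $m, array $v): array {
    $out = [];
    foreach ($m as $row) {
        $sum = 0.0;
        foreach ($row as $j => $x) {
            $sum += $x * $v[$j];
        }
        $out[] = $sum;
    }
    return $out;
}

$x = [[1, 1, 1, 1, 1], [2, 2, 2, 2, 2], [3, 3, 3, 3, 3]];
$w = [1, 2, 3, 4, 5];
print_r(matVec($x, $w)); // 15, 30, 45 -- matching the Octave output above
```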
It appears that the vector::sum function incorrectly calls blas::asum, which returns the sum of absolute values rather than the plain sum of the vector's elements.
You can see the problem demonstrated with this short script:
<?php
require_once 'vendor/autoload.php';

Np\core\blas::$ffi_blas = FFI::load(__DIR__ . '/vendor/ghostjat/np/src/core/blas.h');

$v = Np\vector::ar([1, 2, 3]);
// this correctly returns 6
var_dump($v->sum());

$v = Np\vector::ar([-1, -2, -3]);
// this INCORRECTLY returns 6; the sum should be -6
var_dump($v->sum());
I don't know if there is a BLAS or LAPACK function optimized to return the correct value, but I suggest we modify the vector.php source code to change this:
/**
 * The sum of the vector.
 * @return float
 */
public function sum(): float {
    return blas::asum($this);
}
to this:
/**
 * The sum of the elements of the vector.
 * @return float
 */
public function sum(): float {
    $sum = 0;
    for ($i = 0; $i < $this->ndim; $i++) {
        $sum += $this->data[$i];
    }
    return $sum;
}

/**
 * The sum of the absolute values of the elements of the vector.
 * @return float
 */
public function sumAbs(): float {
    return blas::asum($this);
}
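The distinction can be sanity-checked without BLAS at all, using plain PHP arrays (cblas_?asum is documented to return the sum of absolute values):

```php
<?php
// Plain-PHP check of sum vs. absolute-value sum (what BLAS asum computes).
$v = [-1, -2, -3];

$sum    = array_sum($v);                   // plain sum: -6
$sumAbs = array_sum(array_map('abs', $v)); // asum-style sum: 6

var_dump($sum, $sumAbs);
```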
I have attempted to run the sample PHP script in your README file and it encounters a fatal error:
PHP Fatal error: Uncaught FFI\Exception: Failed loading scope 'blas' in /Users/sneakyimp/np/vendor/ghostjat/np/src/core/blas.php:28
Stack trace:
#0 /Users/sneakyimp/np/vendor/ghostjat/np/src/core/blas.php(28): FFI::scope('blas')
#1 /Users/sneakyimp/np/vendor/ghostjat/np/src/core/blas.php(73): Np\core\blas::init()
#2 /Users/sneakyimp/np/vendor/ghostjat/np/src/linAlgb/linAlg.php(45): Np\core\blas::gemm(Object(Np\matrix), Object(Np\matrix), Object(Np\matrix))
#3 /Users/sneakyimp/np/vendor/ghostjat/np/src/linAlgb/linAlg.php(30): Np\matrix->dotMatrix(Object(Np\matrix))
#4 /Users/sneakyimp/np/foo.php(8): Np\matrix->dot(Object(Np\matrix))
#5 {main}
thrown in /Users/sneakyimp/np/vendor/ghostjat/np/src/core/blas.php on line 28
I have the FFI extension loaded in PHP. The FFI documentation is sorely incomplete, so I'm not at all sure what this scope() call is supposed to do. The docs mention a #define statement, which you appear to have in blas.h:
#define FFI_SCOPE "blas"
EDIT: I'm running this script with PHP 8.1 on macOS. I have the FFI extension loaded, and brew shows openblas and lapack are installed.
I have also confirmed the error with PHP 8.1 on Ubuntu 20.04 LTS.
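For what it's worth, I believe the "Failed loading scope 'blas'" error is a preloading issue rather than a missing-library issue: per the PHP manual, FFI::scope() only resolves scopes whose headers were parsed during opcache preloading (opcache.preload in php.ini), whereas FFI::load() parses the header at runtime. A workaround sketch, using the same static property as the script in the sum report above (I'm assuming blas::$ffi_blas is checked before init() runs):

```php
// Workaround sketch: bypass FFI::scope() by loading the header at runtime.
// FFI::scope('blas') only works if blas.h was registered via opcache.preload;
// FFI::load() parses the header on the spot instead.
require __DIR__ . '/vendor/autoload.php';
Np\core\blas::$ffi_blas = FFI::load(__DIR__ . '/vendor/ghostjat/np/src/core/blas.h');
```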
The colAsVector method clearly has a problem. Some simple code to illustrate:
require __DIR__ . '/vendor/autoload.php';

use Np\matrix;

$v = matrix::ar([
    [1, 2, 3, 4, 5, 6],
    [7, 8, 9, 10, 11, 12],
]);
echo $v, "\n";

$shape = $v->getShape();
for ($i = 0; $i < $shape->n; $i++) {
    $vect = $v->colAsVector($i);
    echo $vect, "\n";
}
The output is clearly wrong, and shows the second item in each column drifting off from the correct value.
Np\matrix
1.000000 2.000000 3.000000 4.000000 5.000000 6.000000
7.000000 8.000000 9.000000 10.000000 11.000000 12.000000
Np\vector
1.000000 3.000000
Np\vector
2.000000 4.000000
Np\vector
3.000000 5.000000
Np\vector
4.000000 6.000000
Np\vector
5.000000 7.000000
Np\vector
6.000000 8.000000
I believe this modified version of the function may remedy the problem:
/**
 * Return a column of the matrix as a vector.
 * @param int $index
 * @return \Np\vector
 */
public function colAsVector(int $index): vector {
    $vr = vector::factory($this->row);
    for ($i = 0; $i < $this->row; $i++) {
        $vr->data[$i] = $this->data[($i * $this->col) + $index];
    }
    return $vr;
}
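To explain why that stride works: in a row-major layout, element (i, j) of an r × c matrix lives at flat offset i*c + j, and the drifting-by-one symptom above is consistent with the wrong stride being used. A self-contained plain-PHP sketch of the same indexing (colAsVector here is a standalone helper, not the library method):

```php
<?php
// Row-major column extraction: element (i, j) of an r x c matrix is stored
// at flat offset i*c + j, so column $index is read with stride $cols.
function colAsVector(array $flat, int $rows, int $cols, int $index): array {
    $out = [];
    for ($i = 0; $i < $rows; $i++) {
        $out[] = $flat[$i * $cols + $index];
    }
    return $out;
}

$flat = [1, 2, 3, 4, 5, 6,  7, 8, 9, 10, 11, 12]; // the 2x6 matrix above
print_r(colAsVector($flat, 2, 6, 1)); // column 1 -> 2, 8
```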
The function linAlg::dot needs its return type widened to include float (and possibly int or other scalar types). Multiplying two vectors yields a scalar result. For example, this code in MATLAB/Octave returns a scalar, and BLAS's dot returns a float:
octave:25> m1 = [1 2 3]
m1 =
1 2 3
octave:26> m2 = [1;2;3]
m2 =
1
2
3
octave:27> m1 * m2
ans = 14
This code should return 14:
$m1 = Np\vector::ar([1, 2, 3]);
echo "$m1\n";
$m2 = Np\vector::ar([1, 2, 3]);
echo "$m2\n";
$v = $m1->dot($m2);
echo "$v\n";
but due to the narrow return type restriction on linAlg::dot, it throws this error:
PHP Fatal error: Uncaught TypeError: Np\vector::dot(): Return value must be of type Np\matrix|Np\vector, float returned in /Users/sneakyimp/Desktop/biz/machine-learning/np/vendor/ghostjat/np/src/linAlgb/linAlg.php:34
Simply modifying the linAlg::dot function as follows should fix this particular error:
/**
 * Get the dot product of m.m | m.v | v.v
 *
 * @param \Np\matrix|\Np\vector $d
 * @return matrix|vector|float
 */
public function dot(matrix|vector $d): matrix|vector|float {
    if ($this instanceof matrix) {
        if ($d instanceof matrix) {
            return $this->dotMatrix($d);
        }
        return $this->dotVector($d);
    }
    return blas::dot($this, $d);
}
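To confirm the vector·vector case really is a scalar, here is a plain-PHP equivalent of blas::dot (dotVec is an illustrative helper, not a library function):

```php
<?php
// Plain-PHP inner product: sum of elementwise products of two equal-length vectors.
function dotVec(array $a, array $b): float {
    $sum = 0.0;
    foreach ($a as $i => $x) {
        $sum += $x * $b[$i];
    }
    return $sum;
}

echo dotVec([1, 2, 3], [1, 2, 3]), "\n"; // 14, matching the Octave result
```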
The composer command you offer in the README file:
composer require ghostjat/np
Results in a complaint:
Could not find a version of package ghostjat/np matching your minimum-stability (stable). Require it with an explicit version constraint allowing its desired stability.
It may help to suggest a slightly different composer require command that pins a branch explicitly:
composer require ghostjat/np:dev-main
Alternatively, one of the tagged pre-releases (e.g., v0.0-alpha or np-0.0.1-alpha) could be referenced in the constraint.
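Another option, rather than changing the require command, is for users to loosen stability in their own composer.json; this is standard Composer behavior, not specific to this package:

```json
{
    "minimum-stability": "dev",
    "prefer-stable": true
}
```

With that in place, the plain `composer require ghostjat/np` should resolve the dev/alpha releases.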