hipparchus-math / hipparchus

An efficient, general-purpose mathematics components library in the Java programming language

License: Apache License 2.0


hipparchus's Introduction

Hipparchus

License

The Hipparchus project is a library of lightweight, self-contained mathematics and statistics components addressing the most common problems not directly available in the Java programming language.

Documentation

More information can be found on the homepage. The JavaDoc can be browsed. Questions related to the usage of Hipparchus should be posted to the users mailing list.

Where can I get the latest release?

You can download source and binaries from our download page.

Alternatively you can pull it from the central Maven repositories using pom.xml settings:

<project>
  <properties>
    <!-- change the Hipparchus version number to the one suiting your needs -->
    <myprojectname.hipparchus.version>3.1</myprojectname.hipparchus.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-core</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-clustering</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-fft</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-fitting</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-geometry</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-ode</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-optim</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
    <dependency>
      <groupId>org.hipparchus</groupId>
      <artifactId>hipparchus-stat</artifactId>
      <version>${myprojectname.hipparchus.version}</version>
    </dependency>
  </dependencies>
</project>

If your project previously depended on Apache Commons Math and you want to switch to Hipparchus, you can also add the temporary migration jar:

<dependency>
  <groupId>org.hipparchus</groupId>
  <artifactId>hipparchus-migration</artifactId>
  <version>${myprojectname.hipparchus.version}</version>
</dependency>

Contributing

There are some guidelines which will make applying contributions easier for us. Please read through our contributing guidelines.

To contact us, use the shared forum where several categories are dedicated to Hipparchus.

License

Code is under the Apache License v2.

hipparchus's People

Contributors

andrewsgoetz, antolinoandrea, axkr, bryancazabonne, dependabot[bot], ebourg, gaetanpierre0, garydgregory, gbonnefille, hankg, itstechupnorth, jvalet, lucasgirodet, maisonobe, maximejo, mdiggory, mebigfatguy, mgrutten, netomi, oertl, olamy, oleersoy, psteitz, rahu1, romgerale, sdinot, sebbasf, serrof, vincentcucchietti, wardev


hipparchus's Issues

[Feature Request] Kalman Filter - (new module hipparchus-filtering)

We have been discussing the need for a Kalman filter in Hipparchus (Luc, Maxime, and I).

One alternative, the proposal of this issue, is to create a new submodule called "hipparchus-filtering" that will contain the code from the last version of Apache Commons Math (http://commons.apache.org/proper/commons-math/javadocs/api-3.6/org/apache/commons/math3/filter/package-summary.html) ported to Hipparchus.

A small enhancement would be to change the Kalman filter implementation inherited from Apache, because it uses a very strict DEFAULT_RELATIVE_SYMMETRY_THRESHOLD in the "correct" method (namely, in the CholeskyDecomposition). It would therefore be useful if KalmanFilter accepted, through an additional constructor, the relative symmetry threshold to pass to the CholeskyDecomposition.

The goal of this issue is to allow a centralized discussion between the contributors.
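To make the enhancement concrete, here is a minimal, self-contained sketch of a relative symmetry test with a configurable threshold (class and method names are hypothetical, not the Hipparchus API); a KalmanFilter constructor taking such a threshold would simply forward it to the CholeskyDecomposition used in the correct step.

```java
// Hypothetical sketch (not the Hipparchus API): the kind of relative symmetry
// test a Cholesky decomposition applies before factoring, with the threshold
// exposed as a parameter instead of a hard-coded default.
class SymmetryCheck {

    /** True iff |a[i][j] - a[j][i]| <= threshold * max(|a[i][j]|, |a[j][i]|)
     *  for every off-diagonal pair. */
    static boolean isSymmetric(double[][] a, double relativeThreshold) {
        for (int i = 0; i < a.length; i++) {
            for (int j = i + 1; j < a.length; j++) {
                double delta = Math.abs(a[i][j] - a[j][i]);
                double scale = Math.max(Math.abs(a[i][j]), Math.abs(a[j][i]));
                if (delta > relativeThreshold * scale) {
                    return false;
                }
            }
        }
        return true;
    }
}
```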

SphericalPolygonsSet(double, S2Point...) fails for high resolution boundaries

Use case:

  1. Use external program to design boundary
  2. Export high resolution boundary (closely spaced points)
  3. Import in Hipparchus
  4. Use hyperplaneThickness to control performance/fidelity tradeoff

Below is a failing test case adapted from the ZigZag test case. The ZigZag region is enlarged because this issue does not occur for very small regions. The resulting region is ~30 degrees across. The ZigZag boundary is then sub-sampled to a tenth of the tolerance; the sub-sampled data is what would be read in from an external file in the above use case. The sub-sampled data is used to create a new SphericalPolygonsSet, which is tested. On my machine an NPE is generated by the call to getEnclosingCap().

I've also included a plot of the region. Blue dots are the sub-sampled points; black lines and dots are the computed boundary. It matches closely in some locations, while others are very far off.

@maisonobe any help or pointers would be appreciated. I'm still trying to get my head around the BSPTree and partitioning code.

    @Test
    public void testZigZagBoundary() {
        final double tol = 1.0e-4;
        // sample region, non-convex, not too big, not too small
        final S2Point[] vertices = {
                new S2Point(-0.12630940610562444e1, (0.8998192093789258 - 0.89) * 100),
                new S2Point(-0.12731320182988207e1, (0.8963735568774486 - 0.89) * 100),
                new S2Point(-0.1351107624622557e1, (0.8978258663483273 - 0.89) * 100),
                new S2Point(-0.13545331405131725e1, (0.8966781238246179 - 0.89) * 100),
                new S2Point(-0.14324883017454967e1, (0.8981309629283796 - 0.89) * 100),
                new S2Point(-0.14359875625524995e1, (0.896983965573036 - 0.89) * 100),
                new S2Point(-0.14749650541159384e1, (0.8977109994666864 - 0.89) * 100),
                new S2Point(-0.14785037758231825e1, (0.8965644005442432 - 0.89) * 100),
                new S2Point(-0.15369807257448784e1, (0.8976550608135502 - 0.89) * 100),
                new S2Point(-0.1526225554339386e1, (0.9010934265410458 - 0.89) * 100),
                new S2Point(-0.14679028466684121e1, (0.9000043396997698 - 0.89) * 100),
                new S2Point(-0.14643807494172612e1, (0.9011511073761742 - 0.89) * 100),
                new S2Point(-0.1386609051963748e1, (0.8996991539048602 - 0.89) * 100),
                new S2Point(-0.13831601655974668e1, (0.9008466623902937 - 0.89) * 100),
                new S2Point(-0.1305365419828323e1, (0.8993961857946309 - 0.89) * 100),
                new S2Point(-0.1301989630405964e1, (0.9005444294061787 - 0.89) * 100)};
        SphericalPolygonsSet zone = new SphericalPolygonsSet(tol, vertices);
        // sample high resolution boundary
        List<S2Point> points = new ArrayList<>();
        final Vertex start = zone.getBoundaryLoops().get(0);
        Vertex v = start;
        double step = tol / 10;
        do {
            Edge outgoing = v.getOutgoing();
            final double length = outgoing.getLength();
            int n = (int) (length / step);
            for (int i = 0; i < n; i++) {
                points.add(new S2Point(outgoing.getPointAt(i*step)));
            }
            v = outgoing.getEnd();
        } while (v != start);
        // create zone from high resolution boundary
        zone = new SphericalPolygonsSet(tol, points.toArray(new S2Point[0]));
        //print(zone);
        EnclosingBall<Sphere2D, S2Point> cap = zone.getEnclosingCap();
        // check cap size is reasonable. The region is ~0.5 across, could be < 0.25
        Assert.assertTrue(cap.getRadius() < 0.5);
        for (S2Point vertex : vertices) {
            // check original points are on the boundary
            Assert.assertEquals("" + vertex, Location.BOUNDARY, zone.checkPoint(vertex));
            // check original points are within the cap
            Assert.assertTrue("" + vertex, cap.contains(vertex));
        }
    }

[image: plot of the region]

WilcoxonSignedRankTest sometimes returns incorrect results

The implementation in WilcoxonSignedRankTest does not handle tied pairs appropriately and the continuity correction applied when computing the normal approximation is incorrect.

Handling of ties should ideally be configurable (see e.g. scipy.stats.wilcoxon). Minimally, the implementation should document and correctly implement a strategy for handling tied pairs.

This issue was originally reported as MATH-1233.
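For illustration, one common tie-handling strategy (discard zero differences, assign midranks to tied absolute differences) can be sketched as below; this is an independent sketch, not the Hipparchus implementation:

```java
import java.util.Arrays;

// Sketch of the W+ signed-rank statistic with zero differences discarded and
// midranks for tied absolute differences; not the Hipparchus implementation.
class SignedRank {

    /** W+ statistic: discard zero differences, midrank tied |differences|,
     *  sum the ranks of the positive differences. */
    static double wPlus(double[] x, double[] y) {
        // paired differences, zeros discarded
        double[] d = new double[x.length];
        int m = 0;
        for (int i = 0; i < x.length; i++) {
            double di = x[i] - y[i];
            if (di != 0.0) {
                d[m++] = di;
            }
        }
        d = Arrays.copyOf(d, m);
        double[] abs = new double[m];
        for (int i = 0; i < m; i++) {
            abs[i] = Math.abs(d[i]);
        }
        double wPlus = 0;
        for (int i = 0; i < m; i++) {
            // midrank of abs[i]: 1 + (# strictly smaller) + (# ties - 1) / 2
            int smaller = 0;
            int ties = 0;
            for (int j = 0; j < m; j++) {
                if (abs[j] < abs[i]) {
                    smaller++;
                } else if (abs[j] == abs[i]) {
                    ties++;
                }
            }
            double rank = 1 + smaller + (ties - 1) / 2.0;
            if (d[i] > 0) {
                wPlus += rank;
            }
        }
        return wPlus;
    }
}
```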

Strange behaviour in EigenDecomposition

Hello all,
I am using the EigenDecomposition class these days and sometimes the diagonalized matrix I get back seems wrong.
A typical case is when I set this matrix :
{{23473.684554963584, 4273.093076392109},
{4273.093076392048, 4462.13956661408}}

I know the matrix is not perfectly symmetric (though close to double precision), but I got it from an A x transpose(A) computation, so I cannot get a more symmetric matrix.
The algorithm then computes complex Eigen values matrix :
{{13967.9120607888, 10422.0456317615},
{-10422.0456317615, 13967.9120607888}}

I have checked with Matlab and with a home-made 2x2 matrix diagonalizer, and both give the same real eigenvalues:
24389.95769255035 and 3545.86642902732
which bear no relation to the complex ones (contradicting the theoretical uniqueness of the eigenvalues).

issue hypparchus.txt

I don't know if the symmetry test is just too strict, or if there is something else wrong or that I didn't understand, but I find it suspicious ;)

Thank you in advance for your clues,
All the best,

Quentin
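A common workaround for matrices that are symmetric only up to rounding (an editorial sketch, not part of the report) is to symmetrize explicitly, A <- (A + A^T)/2, before decomposing; for a symmetric 2x2 matrix the eigenvalues then have the closed form (a+d)/2 +/- sqrt(((a-d)/2)^2 + b^2):

```java
// Sketch: symmetrize a near-symmetric 2x2 matrix and compute its eigenvalues
// in closed form; not part of the Hipparchus API.
class Sym2x2 {

    /** Eigenvalues of the symmetrized matrix (A + A^T) / 2, descending order. */
    static double[] eigenvalues(double[][] a) {
        double b    = 0.5 * (a[0][1] + a[1][0]);  // symmetrized off-diagonal
        double mean = 0.5 * (a[0][0] + a[1][1]);  // (a + d) / 2
        double disc = Math.hypot(0.5 * (a[0][0] - a[1][1]), b);
        return new double[] { mean + disc, mean - disc };
    }
}
```

Applied to the matrix above, this yields the two real eigenvalues reported by Matlab.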

There are no field versions of sinCos

The FastMath.sinCos method has been added in version 1.3 to speed up computation
where both sine and cosine are required for the same angle. This is particularly true
for derivatives.

A Field equivalent would be welcome.
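A field version would presumably follow the same pair-returning pattern. As a self-contained illustration of that pattern (the SinCos class below is local to this sketch, not the Hipparchus type):

```java
// Self-contained illustration of the sin/cos pair pattern; SinCos here is a
// local class, not the Hipparchus one.
class SinCosDemo {

    /** Immutable (sin, cos) pair for one angle. */
    static final class SinCos {
        final double sin;
        final double cos;
        SinCos(double sin, double cos) {
            this.sin = sin;
            this.cos = cos;
        }
    }

    /** Compute sine and cosine of the same angle together. */
    static SinCos sinCos(double angle) {
        return new SinCos(Math.sin(angle), Math.cos(angle));
    }
}
```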

OLS/GLSLinearRegression sufficient data check is overly aggressive when model has no intercept

The validateSampleData method in AbstractLinearRegression requires that the number of rows in the design matrix is at least one greater than the number of regressors. If the model does not include an intercept term, this check is too stringent: nobs == number of regressors should be allowed in this case.
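The relaxed check could look like this sketch (names are hypothetical, not the actual AbstractLinearRegression code):

```java
// Hypothetical sketch of the relaxed sufficient-data check: with an intercept
// the model estimates one extra parameter, so one extra observation is needed.
class SampleDataCheck {

    /** Minimum observations: regressors + 1 with an intercept term,
     *  otherwise just the number of regressors. */
    static boolean hasEnoughData(int nobs, int nRegressors, boolean hasIntercept) {
        int required = hasIntercept ? nRegressors + 1 : nRegressors;
        return nobs >= required;
    }
}
```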

This issue was surfaced by the StackOverflow question OLS Multiple Linear Regression with commons-math

MathIllegalStateException on small SphericalPolygonsSet instances

It is possible to generate a MathIllegalStateException when using very small SphericalPolygonsSet instances. See code below to reproduce.

        S2Point[] s2pA = new S2Point[]{
                new S2Point(new Vector3D(0.1504230736114679, -0.6603084987333554, 0.7357754993377947)),
                new S2Point(new Vector3D(0.15011191112224423, -0.6603400871954631, 0.7358106980616113)),
                new S2Point(new Vector3D(0.15008035620222715, -0.6605195692153062, 0.7356560238085725)),
                new S2Point(new Vector3D(0.1503914563063968, -0.6604879854490165, 0.7356208472763267))
        };
        final SphericalPolygonsSet spsA = new SphericalPolygonsSet(1E-100, s2pA);
        spsA.getSize();

Unit test fails when compiling with Java 9

New methods have been added to Math/StrictMath in Java 9. As some unit tests in Hipparchus check that FastMath is always a drop-in replacement for Math/StrictMath and use introspection for this purpose, these tests fail when the JVM is based on Java 9.

ExponentialDistribution CDF definition?

From the CDF definition, should line 100 be

ret = 1.0 - FastMath.exp(-x * mean);

and line 119 be

ret = -1/mean * FastMath.log(1.0 - p);

?
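For reference (an editorial note, not part of the original question): with the mean parameterization, where rate = 1/mean, the standard formulas are F(x) = 1 - exp(-x/mean) and F^-1(p) = -mean * ln(1 - p). A quick self-contained check that the two are mutual inverses:

```java
// Standard mean-parameterized exponential distribution formulas, written as a
// self-contained sketch independent of the Hipparchus implementation.
class ExpDist {

    /** CDF of the exponential distribution with the given mean. */
    static double cdf(double x, double mean) {
        return x <= 0 ? 0.0 : 1.0 - Math.exp(-x / mean);
    }

    /** Inverse CDF (quantile function) for probability p in [0, 1). */
    static double inverseCdf(double p, double mean) {
        return -mean * Math.log(1.0 - p);
    }
}
```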

Build fails on Java 11

When running mvn clean install on Java 11, the build fails when trying to run the tests because @{jacoco.agent.args} is not replaced by anything. Applying the following patch seems to fix the build, but I don't know how it will affect other versions of Java.

diff --git a/hipparchus-parent/pom.xml b/hipparchus-parent/pom.xml
index 0c6fec34e..cbcd23a5c 100644
--- a/hipparchus-parent/pom.xml
+++ b/hipparchus-parent/pom.xml
@@ -600,7 +600,7 @@
             <excludes>
               <exclude>**/*AbstractTest.java</exclude>
             </excludes>
-            <argLine>@{jacoco.agent.args} -Xmx1200m</argLine>
+            <argLine>-Xmx1200m</argLine>
             </configuration>
         </plugin>
         <plugin>

[Feature Request] Interface for method value on type T

I saw that Hipparchus has an interface, RealFieldUnivariateFunction, that allows implementing the method value(T x) for a field type T.
Would it be possible to also create an interface for implementing the method value on an array of T, as is already done for the double type with the UnivariateVectorFunction interface?
Thanks, Bryan
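The requested interface might look like the sketch below, mirroring UnivariateVectorFunction for a field type T (the name is illustrative, not an existing Hipparchus type):

```java
// Illustrative sketch of the requested interface; the name mirrors the
// existing UnivariateVectorFunction but is not an actual Hipparchus type.
interface FieldUnivariateVectorFunction<T> {

    /** Compute the vector value of the function at x. */
    T[] value(T x);
}
```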

KthSelector does not consider natural order of doubles

As a user I would expect that sorting the array and taking the k-th element yields the same result as using the KthSelector. Unfortunately, this is not true if the array contains NaN or -0.0/+0.0 values, because the implementation uses the < and > operators for comparison instead of Double.compare.
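The discrepancy is easy to demonstrate: the relational operators treat -0.0 and +0.0 as equal and always return false for NaN, while Double.compare orders -0.0 before +0.0 and places NaN last. A self-contained probe:

```java
// Probe showing where the relational operators and Double.compare disagree
// on the ordering of doubles (NaN and signed zeros).
class DoubleOrdering {

    /** True iff the relational operators and Double.compare disagree on (a, b). */
    static boolean operatorsDisagreeWithCompare(double a, double b) {
        int cmp = Double.compare(a, b);
        boolean opLess = a < b;
        boolean opGreater = a > b;
        return (cmp < 0) != opLess || (cmp > 0) != opGreater;
    }
}
```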

Invalid BSPTree created during union

It is possible to create an invalid Sphere2D BSPTree, containing a node with a null cut and null attribute, using the RegionFactory union method.

I have been able to trace this down to the SubCircle split method. The returned SplitSubHyperplane contains one side of the split as null but the BSPTree split method assumes that both sides will return non-null values.

The code below can be used to reproduce this issue. Note that lowering the provided tolerance value will "solve" the issue, but this is an edge case that should still be addressed.

        RegionFactory<Sphere2D> regionFactory = new RegionFactory<>();
        S2Point[] s2pA = new S2Point[]{
                new S2Point(new Vector3D(0.2122954606, -0.629606302, 0.7473463333)),
                new S2Point(new Vector3D(0.2120220248, -0.6296445493, 0.747391733)),
                new S2Point(new Vector3D(0.2119838016, -0.6298173178, 0.7472569934)),
                new S2Point(new Vector3D(0.2122571927, -0.6297790738, 0.7472116182))};

        S2Point[] s2pB = new S2Point[]{
                new S2Point(new Vector3D(0.2120291561, -0.629952069, 0.7471305292)),
                new S2Point(new Vector3D(0.2123026002, -0.6299138005, 0.7470851423)),
                new S2Point(new Vector3D(0.2123408927, -0.6297410403, 0.7472198923)),
                new S2Point(new Vector3D(0.2120674039, -0.6297793122, 0.7472653037))};

        final SphericalPolygonsSet spsA = new SphericalPolygonsSet(0.0001, s2pA);
        final SphericalPolygonsSet spsB = new SphericalPolygonsSet(0.0001, s2pB);
        SphericalPolygonsSet invalidSPS = (SphericalPolygonsSet) regionFactory.union(spsA, spsB);
        //Causes a NullPointerException
        System.out.println(invalidSPS.getSize());

Broken links: Interpolation

Allow secondary equations to update derivatives of primary equation in ODE

In some cases, additional equations can require changing the derivatives of the primary state.

One use case is optimal control, when the secondary equations handle co-state,
which changes control, and the control changes the primary state. In this
case, the primary and secondary equations are not really independent from each
other, so if possible it would be better to put state and co-state and their
equations all in the primary equations. However, this is not always possible, so
it would be better to explicitly allow secondary equations to have this side effect.

In fact, despite not being advertised, this was possible with Apache Commons Math
3.x and this feature was inadvertently removed in Hipparchus, as a side effect of
cleaning up the API.

Spurious restarts between Events - AbstractIntegrator.class

After detection of a discontinuous event, the RESET_DERIVATIVES action is triggered. However, the resetOccurred flag in AbstractIntegrator is not reset to false after the reset has been handled, so spurious restarts occur between events. Please see: https://forum.orekit.org/t/adamsbashforthintegrator-propagation-with-srp/400/2
A temporary fix suggested by Luc Maisonobe was to add resetOccurred = false; before boolean doneWithStep = false; in AbstractIntegrator.

DerivativeStructure for symbolic derivatives

Can you show an example/outline of how to implement a DerivativeStructure based on symbolic differentiation?

At the moment I use my own NewtonSolver in FindRoot, but it would be nice to have a general solution that I can use with Hipparchus.

Enumerated real and integer distributions do not sufficiently validate constructor arguments

The EnumeratedRealDistribution and EnumeratedIntegerDistribution constructors that take parallel arrays of values and masses do not verify that the masses sum to 1. It is possible to create a "distribution" that is not a probability distribution. The probability arrays should be normalized to sum to 1 and a check should be added to ensure that at least one entry is positive.
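A sketch of the proposed validation and normalization (a hypothetical helper, not the Hipparchus constructors):

```java
// Hypothetical helper showing the proposed checks: reject negative, NaN or
// infinite masses, require at least one positive entry, then scale to sum 1.
class MassNormalizer {

    static double[] normalize(double[] masses) {
        double sum = 0;
        for (double m : masses) {
            if (m < 0 || Double.isNaN(m) || Double.isInfinite(m)) {
                throw new IllegalArgumentException("invalid mass: " + m);
            }
            sum += m;
        }
        if (sum <= 0) {
            throw new IllegalArgumentException("masses must include a positive entry");
        }
        double[] out = new double[masses.length];
        for (int i = 0; i < masses.length; i++) {
            out[i] = masses[i] / sum;
        }
        return out;
    }
}
```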

Field integrators merge secondary states into primary state

When FieldODEIntegrator implementations create a FieldODEStateAndDerivative, they put all the components of the integrated state into the primary state and leave secondary state as null, instead of mapping components according to primary and secondary equations respective dimensions.

Error case in WelzlEncloser.pivotingBall

The following code generates an internal error:

        final WelzlEncloser<Euclidean3D, Vector3D> encloser =
                new WelzlEncloser<Euclidean3D, Vector3D>(1e-14, new SphereGenerator());
        List<Vector3D> points = new ArrayList<Vector3D>();
        points.add(new Vector3D(0.9999999731, 0.000200015, 0.0001174338));
        points.add(new Vector3D(0.9987716667, 0.0350821284, 0.0349914572));
        points.add(new Vector3D(0.9987856181, -0.0346743952, 0.0349996489));
        points.add(new Vector3D(0.9987938115, -0.0346825853, -0.0347568755));
        points.add(new Vector3D(0.9987798601, 0.0350739383, -0.0347650673));
        EnclosingBall<Euclidean3D, Vector3D> enclosing3D = encloser.enclose(points);

MannWhitneyTest reports the test statistic incorrectly and returns inaccurate p-values

The first problem (incorrect U statistic) was reported as MATH-1453. What is returned by MannWhitneyU is actually the Wilcoxon Signed Rank statistic (the maximum of the U+ and U-). What is used in the test is the correct statistic. The p-values returned by the test suffer from three accuracy-related problems:

  1. The normal approximation is always used, even for very small samples.
  2. No continuity correction is applied to the normal approximation.
  3. No bias-correction is applied to the variance estimate when ties are present in the data.
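For reference, the standard continuity-corrected normal approximation uses z = (U - n1*n2/2 - 0.5) / sigma, and when ties are present the variance is reduced to sigma^2 = (n1*n2/12) * ((n+1) - sum(t^3 - t) / (n*(n-1))), where t ranges over the sizes of the tie groups in the pooled sample. A sketch of the tie-corrected variance term:

```java
// Standard tie-corrected variance of the Mann-Whitney U statistic, written as
// an independent sketch (not the Hipparchus implementation).
class MannWhitneyVariance {

    /** tieGroupSizes lists the size of each group of tied values in the
     *  pooled sample (groups of size 1 may be omitted, since t^3 - t = 0). */
    static double varianceU(int n1, int n2, int[] tieGroupSizes) {
        double n = n1 + n2;
        double tieSum = 0;
        for (int t : tieGroupSizes) {
            tieSum += (double) t * t * t - t;
        }
        return n1 * n2 / 12.0 * ((n + 1) - tieSum / (n * (n - 1)));
    }
}
```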

Field ODE interpolation incorrect with restricted step

In RungeKuttaFieldStateInterpolator.previousStateLinearCombination(...) and currentStateLinearCombination(...) the wrong state is used when the interpolator is restricted: the soft state is used instead of the global state. It seems the tests that would have caught this error were not copied from ODEStateInterpolatorAbstractTest.

Fixing and copying tests.

3D OutlineExtractor doesn't work with polyhedrons with holes

This test case fails. The 3D shape is a cube with square holes drilled along each axis.
The outline extracted when looking along the Z axis is empty, whereas it should be a big square
with a square hole in the middle.

@Test
public void testHolesInFacet() {
    double tolerance = 1.0e-10;
    PolyhedronsSet cube       = new PolyhedronsSet(-1.0, 1.0, -1.0, 1.0, -1.0, 1.0, tolerance);
    PolyhedronsSet tubeAlongX = new PolyhedronsSet(-2.0, 2.0, -0.5, 0.5, -0.5, 0.5, tolerance);
    PolyhedronsSet tubeAlongY = new PolyhedronsSet(-0.5, 0.5, -2.0, 2.0, -0.5, 0.5, tolerance);
    PolyhedronsSet tubeAlongZ = new PolyhedronsSet(-0.5, 0.5, -0.5, 0.5, -2.0, 2.0, tolerance);
    RegionFactory<Euclidean3D> factory = new RegionFactory<>();
    PolyhedronsSet cubeWithHoles = (PolyhedronsSet) factory.difference(cube,
                                                                       factory.union(tubeAlongX,
                                                                                     factory.union(tubeAlongY, tubeAlongZ)));
    Assert.assertEquals(4.0, cubeWithHoles.getSize(), 1.0e-10);
    Vector2D[][] outline = new OutlineExtractor(Vector3D.PLUS_I, Vector3D.PLUS_J).getOutline(cubeWithHoles);
    Assert.assertEquals(2, outline.length);
    Assert.assertEquals(4, outline[0].length);
    Assert.assertEquals(4, outline[1].length);
}

[Feature request] ODE solver for complex field elements

Hi there,

I am trying to use FieldOrdinaryDifferentialEquation with the Complex type, but it seems this is not possible, as the Complex type doesn't implement RealFieldElement.

I do not understand why the FODE is limited to RealFieldElement rather than the more generic FieldElement type. Could you tell me why?

Infinite Boundary in PolygonsSet

The way SphericalPolygonsSet (and presumably PolygonsSet) implements the checkPoint(...) method can lead to points arbitrarily far away from the center-line of the boundary being considered part of the boundary. This means that points inside a region and far away from the center-line of the boundary may be considered to be part of the boundary. Similarly, points outside the region and far away from it may be considered to be part of the boundary. Here "far" means the tolerance multiplied by some large number.

The article in [1] provides a good description of the issue. Hipparchus currently uses a mitre join (at left), which leads to very long spikes. Using a round (middle) or bevel (right) join would fix the issue. I think a round join is the most intuitive meaning for the tolerance.

[image: mitre (left), round (middle), and bevel (right) joins]

I don't know if this is worth fixing or if this is merely a theoretical problem. Maps (one of the use cases for SphericalPolygonsSet) tend to have some very strange boundaries.

I've used the code below with a tolerance of 1e-3 to produce the "Hipparchus" points in the plot below. As you can see, a point can be several orders of magnitude further away from the center-line of the boundary and still be considered part of the boundary.

            double tol = 0.001;
            int n = 100;
            double step = FastMath.PI /  n;

            for (int i = 0; i < n; i++) {
                double angle = FastMath.PI - i * step;
                RegionFactory<Sphere2D> factory = new RegionFactory<>();
                SphericalPolygonsSet plusX = new SphericalPolygonsSet(Vector3D.PLUS_I, tol);
                SphericalPolygonsSet plusY = new SphericalPolygonsSet(Vector3D.PLUS_J, tol);
                SphericalPolygonsSet plusZ = new SphericalPolygonsSet(new Vector3D(0, -FastMath.cos(angle), FastMath.sin(angle)), tol);
                SphericalPolygonsSet octant =
                        (SphericalPolygonsSet) factory.intersection(factory.intersection(plusX, plusY), plusZ);
                Circle bisect = new Circle(new Vector3D(0, -FastMath.cos(angle / 2), FastMath.sin(angle / 2)), tol);
                final double phase0 = bisect.getPhase(Vector3D.PLUS_I);
                final double boundary = UnivariateSolverUtils.solve(
                        x -> octant.checkPoint(new S2Point(bisect.getPointAt(x))) == Location.OUTSIDE ? 1 : -1,
                        phase0 - FastMath.PI / 2,
                        phase0);
                final double offset = MathUtils.normalizeAngle(boundary, phase0) - phase0;
                out.write(String.format("%20f %20f\n", angle, offset)); // out is a Writer opened earlier
            }

[image: plot of boundary offset versus angle]

[Feature Request] Missing implementation

The class PolynomialFunction implements the interface UnivariateDifferentiableFunction; it would be interesting if this class also implemented the interface RealFieldUnivariateFunction.
The same remark applies to PolynomialFunctionNewtonForm and PolynomialSplineFunction.
Thanks

NullPointerException in unbounded polygons set

The following test triggers a NullPointerException:

@Test
public void testInfiniteQuadrant() {
    final double tolerance = 1.0e-10;
    BSPTree<Euclidean2D> bsp = new BSPTree<>();
    bsp.insertCut(new Line(Vector2D.ZERO, 0.0, tolerance));
    bsp.getPlus().setAttribute(Boolean.FALSE);
    bsp.getMinus().insertCut(new Line(Vector2D.ZERO, 0.5 * FastMath.PI, tolerance));
    bsp.getMinus().getPlus().setAttribute(Boolean.FALSE);
    bsp.getMinus().getMinus().setAttribute(Boolean.TRUE);
    PolygonsSet polygons = new PolygonsSet(bsp, tolerance);
    Assert.assertEquals(Double.POSITIVE_INFINITY, polygons.getSize(), 1.0e-10);
}

[Feature Request] Use Kotlin

Kotlin has a lot of slick features, like eliminating the need for builders through named arguments, that I think would be useful for Hipparchus. You can also compile to both JavaScript and various JVM versions. See:

https://stackoverflow.com/questions/46858270/does-there-exist-a-babel-like-compiler-for-java
https://stackoverflow.com/questions/46892929/are-number-operations-using-kotlin-as-fast-as-the-equivalent-with-java-primitive

General info:
https://medium.com/@magnus.chatt/why-you-should-totally-switch-to-kotlin-c7bbde9e10d5

The val keyword (non-modifiable properties) would be really helpful in thread-safe designs ...

Could compile to Javascript / Typescript and publish on NPM - which should bring more contributors into the fold.

Add implementation of RANDOM streaming percentile algorithm

The RANDOM algorithm makes a nice complement to PSquarePercentile for streaming percentiles. While it does not have a uniformly fixed bound on storage, storage is bounded for fixed quantile estimation error and grows very slowly with increases in precision. It also allows any quantile to be estimated based on the data it stores, so e.g. getResult(quantile) or even getResult(quantile[]) methods are possible. Finally, aggregation is straightforward.

spherical polygons set build fails when two edges are aligned but in reverse direction

SphericalPolygonsSet instances can be built from a list of vertices, which are points on the 2D unit sphere. When these vertices form a zigzag or star-shaped boundary and two distant edges happen to be on the same circle (according to the hyperplaneThickness parameter setting) with opposite orientations, the polygon built is completely wrong.

The test case below is an example of this behavior. If the hyperplane thickness (first constructor parameter) is set to 1.0e-10, then all edges are considered to belong to separate circles and the zone is properly built. If the hyperplane thickness is set to 1.0e-6, then the edge built from vertices at indices 6 and 7 (counting from 0) and the edge built from vertices at indices 10 and 11 are considered to belong to the same circle, and the polygon built is different.

@Test
public void testZigZagBoundary() {
    SphericalPolygonsSet zone = new SphericalPolygonsSet(1.0e-6,
                                                         new S2Point(-0.12630940610562444, 0.8998192093789258),
                                                         new S2Point(-0.12731320182988207, 0.8963735568774486),
                                                         new S2Point(-0.1351107624622557,  0.8978258663483273),
                                                         new S2Point(-0.13545331405131725, 0.8966781238246179),
                                                         new S2Point(-0.14324883017454967, 0.8981309629283796),
                                                         new S2Point(-0.14359875625524995, 0.896983965573036),
                                                         new S2Point(-0.14749650541159384, 0.8977109994666864),
                                                         new S2Point(-0.14785037758231825, 0.8965644005442432),
                                                         new S2Point(-0.15369807257448784, 0.8976550608135502),
                                                         new S2Point(-0.1526225554339386,  0.9010934265410458),
                                                         new S2Point(-0.14679028466684121, 0.9000043396997698),
                                                         new S2Point(-0.14643807494172612, 0.9011511073761742),
                                                         new S2Point(-0.1386609051963748,  0.8996991539048602),
                                                         new S2Point(-0.13831601655974668, 0.9008466623902937),
                                                         new S2Point(-0.1305365419828323,  0.8993961857946309),
                                                         new S2Point(-0.1301989630405964,  0.9005444294061787));
    Assert.assertEquals(Region.Location.INSIDE, zone.checkPoint(new S2Point(-0.145, 0.898)));
    Assert.assertEquals(6.463e-5, zone.getSize(),         1.0e-7);
    Assert.assertEquals(5.487e-2, zone.getBoundarySize(), 1.0e-4);
}

Performance improvement for Array2DRowRealMatrix.getSubMatrix()

Hi the Hipparchus Team :)

A performance improvement to Array2DRowRealMatrix.getSubMatrix() has been applied recently to Commons Math, you may be interested in applying it as well:

https://issues.apache.org/jira/browse/MATH-1389
apache/commons-math@72df12f

With this modification the performance is significantly better when the method hasn't been compiled by the JIT yet (or on non-HotSpot JVMs). Once the JIT has kicked in, it's roughly equivalent to the current implementation.
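The essence of the change is replacing the element-by-element inner loop with one System.arraycopy per row; a self-contained sketch (method name hypothetical, not the Array2DRowRealMatrix code):

```java
// Hypothetical sketch: copying a sub-matrix block with one System.arraycopy
// per row instead of an inner column loop.
class SubMatrixCopy {

    /** Rows r0..r1 and columns c0..c1 (inclusive) of data. */
    static double[][] subMatrix(double[][] data, int r0, int r1, int c0, int c1) {
        final double[][] out = new double[r1 - r0 + 1][c1 - c0 + 1];
        for (int i = r0; i <= r1; i++) {
            // one bulk copy per row
            System.arraycopy(data[i], c0, out[i - r0], 0, c1 - c0 + 1);
        }
        return out;
    }
}
```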

Add FieldBivariateFunction interface

Hipparchus has an interface for univariate real functions over any field type (FieldUnivariateFunction); it could be interesting to add an interface for bivariate real functions over any field type (FieldBivariateFunction).

Bryan
