juliamath / Polynomials.jl
Polynomial manipulations in Julia
Home Page: http://juliamath.github.io/Polynomials.jl/
License: Other
Shouldn't the Travis badge point to this fork? At this point it only shows the status of the original repository of vtjnash.
In the code section
function roots{T}(p::Poly{T})
R = promote_type(T, Float64)
length(p) == 0 && return zeros(R, 0)
p = truncate(p)
...
the polynomial
p=8.362779449448982e41 - 2.510840694154672e57∙x + 4.2817430781178795e44∙x^2 - 1.6225927682921337e31∙x^3 + 1.0∙x^4
gets truncated to p = truncate(p) = -2.510840694154672e57∙x,
so roots(p) returns just one root, 0.0.
This is wrong and meaningless. When the line p = truncate(p) is removed from the roots() function, roots(p) returns the correct roots:
1.62259e31
1.75922e13
8.79609e12
0.0
So, forcing truncate() is counterproductive. Just leave the polynomial as is and let the eigvals() routine do its job.
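The proposed fix can be sketched directly: build the companion matrix from the untruncated coefficient vector and hand it to eigvals(). This is an illustrative sketch operating on a raw coefficient vector (lowest degree first), not the package's actual roots() code; the name roots_noclip is made up here.

```julia
using LinearAlgebra

# Sketch of roots() without the truncate() step: form the companion matrix
# of the monic polynomial and let eigvals() do its job.
function roots_noclip(a::Vector{Float64})
    n = length(a) - 1          # degree
    c = a ./ a[end]            # monic normalization
    C = zeros(n, n)
    for i in 2:n
        C[i, i-1] = 1.0        # subdiagonal of ones
    end
    C[:, end] = -c[1:n]        # last column holds the negated lower coefficients
    eigvals(C)
end
```

For example, roots_noclip([-1.0, 0.0, 1.0]) (that is, x^2 - 1) returns the roots ±1.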
PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3). The results of this script are used to generate a package listing enhanced with testing results.
Tests pass.
means that PackageEvaluator found the tests for your package, executed them, and they all passed.
Tests fail, but package loads.
means that PackageEvaluator found the tests for your package, executed them, and they didn't pass. However, trying to load your package with using worked.
This issue was filed because your testing status became worse. No additional issues will be filed if your package remains in this state, and no issue will be filed if it improves. If you'd like to opt-out of these status-change messages, reply to this message saying you'd like to and @IainNZ will add an exception. If you'd like to discuss PackageEvaluator.jl please file an issue at the repository. For example, your package may be untestable on the test machine due to a dependency - an exception can be added.
Test log:
INFO: Cloning cache of Polynomials from git://github.com/Keno/Polynomials.jl.git
INFO: Installing Polynomials v0.0.2
INFO: REQUIRE updated.
ERROR: wrong number of arguments
in include at boot.jl:238
at /home/idunning/pkgtest/.julia/v0.2/Polynomials/test/tests.jl:70
INFO: REQUIRE updated.
Support for construction of the form x = Poly("x") and x = Poly(Complex128, "x") could be cool.
I am not endorsing such a style in general, but some people with a MATLAB background prefer using:
julia> using Polynomials
julia> s = Poly("s")
Poly(s)
julia> p1 = s^5 + s^2 + 3s + 1
Poly(1.0 + 3.0s + s^2 + s^5)
I have come across such an attitude in users of the control toolbox in MATLAB. In our attempt to port such functionality to Julia, we are relying on Polynomials. Hence, I thought I could open an issue here and discuss it with you.
Should you be interested in supporting such a feature, please check the PR.
P.S: I have also got rid of the clutter in the constructor. I hope you would like it.
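For reference, the requested feature could be sketched as a small helper on top of the existing coefficient constructor. The name variable is an illustration of the idea, not an existing method of the package:

```julia
using Polynomials  # assumes the Poly(coeffs, var) constructor

# Build the monomial "x" in a given coefficient type and variable name.
variable(::Type{T}, var=:x) where {T} = Poly(T[0, 1], var)
variable(var=:x) = variable(Float64, var)

# Usage: s = variable(:s); p1 = s^5 + s^2 + 3s + 1
```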
I just noticed this in the code; seems like an obvious improvement.
The master branch fails tests for me:
julia> Pkg.test("Polynomials")
INFO: Testing Polynomials
Test for the exponential function.
Test for the sine function.
Test for the cosine function.
Test for the summation of a factorially divergent series.
The approximate sum of the divergent series is: 0.596347366095526
The approximate sum of the convergent series is: 0.5963473623231946
Test for setindex!()
Test for element-wise operations
Test Failed
Expression: isa(psum,Vector{Poly{Float64}})
ERROR: LoadError: There was an error during testing
in record(::Base.Test.FallbackTestSet, ::Base.Test.Fail) at ./test.jl:397
in do_test(::Base.Test.Returned, ::Expr) at ./test.jl:281
in include_from_node1(::String) at ./loading.jl:426
in process_options(::Base.JLOptions) at ./client.jl:262
in _start() at ./client.jl:318
while loading /Users/ranjan/.julia/v0.5/Polynomials/test/runtests.jl, in expression starting on line 176
===========================================================[ ERROR: Polynomials ]============================================================
failed process: Process(`/Users/ranjan/julia/usr/bin/julia -Cnative -J/Users/ranjan/julia/usr/lib/julia/sys.dylib --compile=yes --depwarn=yes --check-bounds=yes --code-coverage=none --color=yes --compilecache=yes /Users/ranjan/.julia/v0.5/Polynomials/test/runtests.jl`, ProcessExited(1)) [1]
=============================================================================================================================================
ERROR: Polynomials had test errors
in #test#61(::Bool, ::Function, ::Array{AbstractString,1}) at ./pkg/entry.jl:740
in (::Base.Pkg.Entry.#kw##test)(::Array{Any,1}, ::Base.Pkg.Entry.#test, ::Array{AbstractString,1}) at ./<missing>:0
in (::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}})() at ./pkg/dir.jl:31
in cd(::Base.Pkg.Dir.##2#3{Array{Any,1},Base.Pkg.Entry.#test,Tuple{Array{AbstractString,1}}}, ::String) at ./file.jl:59
in #cd#1(::Array{Any,1}, ::Function, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at ./pkg/dir.jl:31
in (::Base.Pkg.Dir.#kw##cd)(::Array{Any,1}, ::Base.Pkg.Dir.#cd, ::Function, ::Array{AbstractString,1}, ::Vararg{Array{AbstractString,1},N}) at ./<missing>:0
in #test#3(::Bool, ::Function, ::String, ::Vararg{String,N}) at ./pkg/pkg.jl:258
in test(::String, ::Vararg{String,N}) at ./pkg/pkg.jl:258
I can't explain the poor accuracy in the following computation.
using Polynomials
r = [1.0e-6;1.0e6];
p = poly(r);
( sort(roots(p)) - r ) ./ r
The relative error for the smaller root is about 7.6e-6. Both MATLAB and the PolynomialRoots package seem to produce perfect answers. The roots are well conditioned.
My version info:
Julia Version 0.4.6
Commit 2e358ce (2016-06-19 17:16 UTC)
Platform Info:
System: Darwin (x86_64-apple-darwin13.4.0)
CPU: Intel(R) Core(TM) i5-2467M CPU @ 1.60GHz
WORD_SIZE: 64
uname: Darwin 14.5.0 Darwin Kernel Version 14.5.0: Mon Aug 29 21:14:16 PDT 2016; root:xnu-2782.50.6~1/RELEASE_X86_64 x86_64 i386
Memory: 4.0 GB (16.30078125 MB free)
Uptime: 426448.0 sec
Load Avg: 2.26318359375 1.94970703125 1.9189453125
Intel(R) Core(TM) i5-2467M CPU @ 1.60GHz:
speed user nice sys idle irq
#1 1600 MHz 11458 s 0 s 9041 s 125561 s 0 s
#2 1600 MHz 3482 s 0 s 1632 s 140905 s 0 s
#3 1600 MHz 10466 s 0 s 5557 s 129996 s 0 s
#4 1600 MHz 3557 s 0 s 1684 s 140777 s 0 s
BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Sandybridge)
LAPACK: libopenblas64_
LIBM: libopenlibm
LLVM: libLLVM-3.3
Environment:
ATOM_HOME = /Users/Toby/.atom
TERM = xterm-256color
PATH = /Applications/Julia-0.4.6.app/Contents/Resources/julia/bin:/Applications/Julia-0.4.6.app/Contents/Resources/julia/libexec/git-core:/Users/Toby/anaconda2/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/opt/X11/bin:/usr/local/git/bin:/Users/Toby/Dropbox/conf:/Users/Toby/Dropbox/research/heart/StpToXml_6.1
XPC_FLAGS = 0x0
HOME = /Users/Toby
TEXMFHOME = /Users/Toby/Dropbox/texmf
NODE_PATH = /Applications/Atom.app/Contents/Resources/app.asar/exports
FONTCONFIG_PATH = /Applications/Julia-0.4.6.app/Contents/Resources/julia/etc/fonts
GIT_EXEC_PATH = /Applications/Julia-0.4.6.app/Contents/Resources/julia/libexec/git-core
Package Directory: /Users/Toby/.julia/v0.4
6 required packages:
- Atom 0.4.5
- IJulia 1.3.2
- Plots 0.8.2
- PolynomialRoots 0.0.4
- Polynomials 0.1.0
- PyPlot 2.2.4
39 additional packages:
- BinDeps 0.4.5
- Blink 0.3.5
- CodeTools 0.3.1
- Codecs 0.2.0
- ColorTypes 0.2.7
- Colors 0.6.8
- Compat 0.9.2
- Conda 0.3.2
- FactCheck 0.4.3
- FixedPointNumbers 0.1.8
- FixedSizeArrays 0.2.3
- Hiccup 0.0.3
- Homebrew 0.4.0
- HttpCommon 0.2.6
- HttpParser 0.2.0
- HttpServer 0.1.6
- Iterators 0.1.10
- JSON 0.7.0
- JuliaParser 0.6.4
- LNR 0.0.2
- LaTeXStrings 0.2.0
- Lazy 0.11.2
- MacroTools 0.3.2
- MbedTLS 0.3.0
- Measures 0.0.3
- Media 0.2.2
- Mustache 0.1.2
- Mux 0.2.1
- Nettle 0.2.4
- PlotUtils 0.0.4
- PyCall 1.7.2
- RecipesBase 0.0.6
- Reexport 0.0.3
- Requires 0.2.2
- SHA 0.2.1
- Showoff 0.0.7
- URIParser 0.1.6
- WebSockets 0.2.1
- ZMQ 0.3.4
This seems like the most fundamental feature, yet it is missing. Are you planning to implement this at one point, or should I be looking for a hack to get the coefficients from a Poly object?
It seems this package would be a good fit for the JuliaMath organization, and @jverzani, the top contributor here, is also an active contributor in JuliaMath. Keno admittedly doesn't watch this package all that closely these days, and has expressed willingness to make the move should John deem it a good idea. In that case I can give Keno the necessary JuliaMath permissions to make the move.
Right now the only check in the package is T<:Number. Hence, we have NaNs and Infs resulting in weird behaviours. One example was in #78, where the polynomial evaluation was wrong.
We also have a strange behaviour when taking derivatives.
julia> using Polynomials
julia> p = Poly([NaN, 1, Inf])
Poly(NaN + 1.0⋅x + Inf⋅x^2)
julia> p'
Poly(1.0 + Inf⋅x)
julia> p''
Poly(Inf)
julia> p'''
Poly(0.0)
julia> polyder(polyint(p))
Poly(NaN + 1.0⋅x + Inf⋅x^2)
julia> polyint(polyder(p))
Poly(1.0⋅x + Inf⋅x^2)
If the package continues to support NaNs and Infs, I believe any operation on NaNs should produce NaN, including the derivative. Right now, the derivative of p(x) = NaN is 0.0. Similarly, when p(x) = Inf, the derivative evaluates to 0.0 again. However, when we apply the definition (at any point), we get (Inf-Inf)/h. The numerator there is also NaN. I assume the derivative of Inf should yield NaN instead.
So, my proposed solution is at least as follows:
julia> using Polynomials
julia> p = Poly([NaN, 1, Inf])
Poly(NaN + 1.0⋅x + Inf⋅x^2)
julia> p'
Poly(NaN + Inf⋅x)
julia> p''
Poly(NaN)
julia> p'''
Poly(NaN)
julia> polyder(polyint(p))
Poly(NaN + 1.0⋅x + Inf⋅x^2)
julia> polyint(polyder(p))
Poly(NaN⋅x + Inf⋅x^2)
What do you think? Am I missing something?
Note. A PR will follow.
Seems like this could be done now. conj is defined in Base as conj(x) = x, so
julia> using Polynomials; Poly([1, 2 + im])
returns
Poly(1 + (2 + 1im)x)
I guess the standard definition is that the coefficients are simply conjugated. See http://math.stackexchange.com/questions/25422/conjugate-of-complex-polynomial for a short discussion.
The main constructor contains these 3 lines:
# determine the last nonzero element and truncate a accordingly
a_last = max(1,findlast( p->(abs(p) > 2*eps(T)), a))
new(a[1:a_last], symbol(var))
That has a surprising consequence: all higher coefficients are truncated if their absolute value is below eps(). That generates underflow behaviour at high values. It is easy to find a p = Poly(..) with p / 1000 * 1000 != p. There should be no such limitation, which restricts usability, for example when working with interpolating polynomials of higher degrees.
I recommend replacing abs(p) > eps(T) with p != zero(T).
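The recommended change can be sketched on a raw coefficient vector (lowest degree first); trim_trailing_zeros is an illustrative helper, not the package's actual constructor:

```julia
# Drop only trailing coefficients that are exactly zero; keep one
# coefficient so the zero polynomial still has a representation.
function trim_trailing_zeros(a::Vector)
    isempty(a) && return a
    n = findlast(!iszero, a)
    n === nothing ? a[1:1] : a[1:n]
end
```

Unlike the abs(p) > 2*eps(T) test, this preserves tiny but nonzero coefficients such as 1e-300, so scaling a polynomial down and back up no longer changes it.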
I need to obtain the eltype of a Poly.
Current behaviour:
julia> using Polynomials
julia> A = Matrix{Poly{Number}}(2,1);
julia> eltype(eltype(A))
Any
julia> B = Matrix{Poly{Float64}}(2,1);
julia> eltype(eltype(B))
Any
Proposed behaviour:
julia> using Polynomials
julia> A = Matrix{Poly{Number}}(2,1);
julia> eltype(eltype(A))
Number
julia> B = Matrix{Poly{Float64}}(2,1);
julia> eltype(eltype(B))
Float64
Will follow with a PR.
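The fix amounts to a one-line eltype method on the parametric type itself. A self-contained illustration with a stand-in type (MyPoly here is a placeholder, not the package's Poly):

```julia
struct MyPoly{T}
    a::Vector{T}
end

# The one-line fix: expose the coefficient type as the eltype of the type.
Base.eltype(::Type{MyPoly{T}}) where {T} = T

# eltype(eltype(Matrix{MyPoly{Float64}})) now yields Float64 instead of Any.
```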
Fixing the bug in + and - between Numbers and Polys.
Current behaviour:
julia> p = Poly([0.5])
Poly(0.5)
julia> p + 1
Poly(1.0)
After the fix:
julia> p = Poly([0.5])
Poly(0.5)
julia> p + 1
Poly(1.5)
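On raw coefficient vectors (lowest degree first), the intended behaviour is simply to add the scalar to the constant term rather than overwrite it. A sketch of the logic, with add_constant as an illustrative name:

```julia
# Add a number to a polynomial given by its coefficient vector: only the
# constant term changes; coefficients are promoted so mixed types work.
function add_constant(a::Vector, c::Number)
    isempty(a) && return [c]
    T = promote_type(eltype(a), typeof(c))
    b = Vector{T}(a)
    b[1] += c
    b
end
```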
There seems to be a problem with raising z to the zeroth power.
julia> let z = Poly([0.0, 1]),
z1 = z^3 + z^0;
length(z1), z1[end]
end
(5,0.0)
So, z1 has length 5, which does not seem correct. I would have expected a length-4 underlying array, corresponding to a degree-3 polynomial. Here I am trying to access the coefficient of greatest degree of the polynomial. Is there a degree function? A related problem is that 0*z^5 is of length 6, so the length does not give the degree.
The definition of '/' as:
/(num::Poly, den::Poly) = divrem(num, den)[1]
is better named div, not /.
As is, it causes an issue with a heuristic used in the Roots package to test whether a function is a polynomial function, as rational functions also get identified.
If this meets no objection, I'll put in a pull request with the change and update the tests.
A = Poly([1,2])
B = convert(Poly{Int64}, A)
The last expression actually copies A to B. In contrast to, for example,
A = [1,2]
B = convert(Array{Int64}, A)
where B points to the same object as A.
This has implications for containers that contain Poly objects parametrized by a type T:
immutable MyContainer{T<:Number}
p::Poly{T}
end
The construction of a MyContainer will not pass the polynomial by reference but will make a copy, so that in the following code:
A = Poly([1,2])
m = MyContainer(A)
Here, m.p will not point to A but to a copy of A.
The solution is simply to mimic AbstractArray by adding the following convert to the same type:
convert{T}(::Type{Poly{T}}, p::Poly{T}) = p
Could you please add a tag for this recent change?
It would be nice if there were an easy way to construct some of the standard polynomials (Hermite, Jacobi, etc.).
There isn't actually that much code available at the moment. The currently existing packages that I was able to find:
The question is whether it should be a separate package, or would it make sense to have the code here? I don't think it warrants a separate package, and having to hunt through different packages to find the necessary function is a headache. It would be nice if you could just do something like
using Polynomials
p = hermite(3)
If it seems reasonable to have them here, then I would love to make some PRs at some point.
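As a taste of what such a PR could provide, here is a sketch of a hermite constructor using the three-term recurrence H_{n+1}(x) = 2x·H_n(x) - 2n·H_{n-1}(x) for the physicists' Hermite polynomials. The name hermite is the proposal above; the code assumes the Poly API and is illustrative only:

```julia
using Polynomials

# Physicists' Hermite polynomial H_n via the three-term recurrence.
function hermite(n::Integer)
    H0, H1 = Poly([1.0]), Poly([0.0, 2.0])   # H_0 = 1, H_1 = 2x
    n == 0 && return H0
    for k in 1:n-1
        H0, H1 = H1, Poly([0.0, 2.0]) * H1 - 2k * H0
    end
    H1
end

# hermite(3) gives 8x^3 - 12x
```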
Looking at the current show, I think it does an amazing job.
Do you think we could extend show to have
We are working on a control systems toolbox, and we are requiring Polynomials in that toolbox. Generally, people in the control community prefer seeing the polynomials as (s+r1)(s+r2) for real roots (with corresponding multiplicities), or (s^2+as+b) for complex conjugate roots, if the system is represented by zero-pole-gain models. As a side note, in transfer function models we are using a subset of the polynomials: polynomials with real coefficients. As a result, we can also have an efficient search strategy for Poly{T<:Real}s when finding the complex conjugate pairs.
We can have this behaviour (if that would be implemented in Polynomials) by a const Dict{Symbol,Any} with a setting symbol pointing to a value type such as Show{:Factored}, Show{:Descending} and/or Show{:Ascending} to dispatch the corresponding print method for show, for example.
We have the following:
julia> using Polynomials
julia> p = Poly([1, 1+1im])
Poly( + (1 + 1im)⋅x)
I'd like to see a polyfit function somewhere in the Julia ecosystem; is this the right place?
Something like
julia> function polyfit(x, y, n)
A = [ float(x[i])^p for i = 1:length(x), p = 0:n ]
Poly(A \ y)
end
polyfit (generic function with 1 method)
julia> x = [0:10];
julia> y = x.^2-15*x+50;
julia> p=polyfit(x,y,2)
Poly(49.99999999999998 - 14.999999999999991x + 0.9999999999999991x^2)
Try the following code:
using Polynomials
p1 = Poly([1,2,1])
p2 = Poly([1,2,2])
typeof(p1) == typeof(p2)
typeof(roots(p1)) == typeof(roots(p2))
which results in false, implying the roots function is not type-stable.
There is a bug in the setindex! functionality. The current implementation results in
julia> using Polynomials
julia> p = Polynomials.Poly([1, 2, 1])
Poly(1 + 2x + x^2)
julia> p[5] = 1
1
julia> p
Poly(1 + 2x + x^5)
which clears the coefficient of what was previously the highest-degree term in the polynomial (the x^2 coefficient is lost).
In discussion #117 the absence of eachindex came up. This should be added so that expressions such as [p[i] * variable(p)^i for i in eachindex(p)] work.
Can we tag a new version? The Roots package has a testing issue because of its dependency. Thx.
The roots function is not type stable, since it can return either Array{Complex{Float64},1} or Array{Float64,1}, depending only on the polynomial coefficients. It would seem like a good idea to always return Array{Complex{Float64},1}.
Some changes might be needed to prepare the package for v0.6:
- VERSION < v"0.6.0-dev" fails in https://juliabox.com with the v0.6-dev kernel.
- Array(T, n) results in deprecation warnings. Better to change it to Vector{T}(n).
- convert{T,S}(p::Poly{T}, s::S) should always convert s to Poly{T}, and not to Poly{promote_type(T,S)}. How could we, otherwise, have convert(Poly{Float64}, BigFloat(5)) do what we exactly want?
- .travis.yml file: allowing for failures in nightly builds will help develop the package for supported environments in a smoother way. The problem here is that #98 and #99 are failing in nightly in a different place from the failures introduced in the corresponding PRs.
julia> p4 = poly([1.0, 1im, -1.0, -1im])
Poly(--1.0 + 1.0x^4)
which appears to indicate a bug in printing the polynomial. The stored coefficient really is, correctly, -1.0.
In #112 it was noted that the docstrings are not in the standard format of base Julia. These should be updated.
It is very confusing to see two very similarly named constructors, Poly() and poly(), which are quite different. I propose renaming poly() so that it is evident what it does: for example byroots() or fromroots() or something similar.
It's sometimes very useful to find only the real roots of a polynomial, especially if it means faster computation times.
See this issue
Current version:
julia> using Polynomials
julia> p1 = Poly([1, 2])
Poly(1 + 2x)
julia> p2 = Poly([3, 1.])
Poly(3.0 + x)
julia> p = [p1, p2]
2-element Array{Polynomials.Poly{Float64},1}:
Poly(1.0 + 2.0x)
Poly(3.0 + x)
julia> q = [3,p1]
2-element Array{Any,1}:
3
Poly(1 + 2x)
julia> 3p
ERROR: MethodError: `.*` has no method matching .*(::Int64,
::Polynomials.Poly{Float64})
Closest candidates are:
.*(::Real, ::OrdinalRange{T,S})
.*(::Real, ::FloatRange{T<:AbstractFloat})
.*(::Real, ::LinSpace{T<:AbstractFloat})
...
in .* at arraymath.jl:120
in * at abstractarraymath.jl:54
Dot product is not defined between Numbers and Polys. It would be convenient to have such a feature. A promotion rule from Numbers to Polys is not defined either.
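A sketch of the missing definitions (assumed forms, treating a number as a constant polynomial; written in where-clause syntax, while the package of this era used curly-brace method syntax):

```julia
using Polynomials

# Promote a Number to a constant Poly when mixed with Polys.
Base.promote_rule(::Type{Poly{T}}, ::Type{S}) where {T, S<:Number} =
    Poly{promote_type(T, S)}
Base.convert(::Type{Poly{T}}, c::S) where {T, S<:Number} = Poly(T[c])
```

With such a rule, mixed operations like 3p can fall back on promotion instead of needing a dedicated method for every operator.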
We need to implement == between Polys and Numbers so that map works properly.
Current behaviour:
julia> map(Poly, speye(2,2))
2×2 sparse matrix with 4 Polynomials.Poly{Float64} nonzero entries:
[1, 1] = Poly(1.0)
[2, 1] = Poly(0.0)
[1, 2] = Poly(0.0)
[2, 2] = Poly(1.0)
Expected behaviour:
julia> map(Poly, speye(2,2))
2×2 sparse matrix with 2 Polynomials.Poly{Float64} nonzero entries:
[1, 1] = Poly(1.0)
[2, 2] = Poly(1.0)
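A sketch of the missing comparison: a Poly equals a number exactly when it is (at most) a constant polynomial with that value. This assumes p[0] indexes the constant coefficient and degree returns the highest exponent, as in the issues above:

```julia
using Polynomials

Base.:(==)(p::Poly, c::Number) = degree(p) <= 0 && p[0] == c
Base.:(==)(c::Number, p::Poly) = p == c
```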
The module defines three dot methods:
dot{T<:Number,S}(p::Poly{S}, c::T) = p * c
dot{T<:Number,S}(c::T, p::Poly{S}) = c * p
dot(p1::Poly, p2::Poly) = p1 * p2
which make no sense to me. dot should define an inner product, which means that it should return a number (or whatever the coefficient type of the polynomial is). For polynomials, you would normally define the dot product to be some kind of integral. But since there are many such definitions and no single canonical choice, it would be better to leave this undefined.
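To illustrate why no choice is canonical, here is one definition a user could adopt themselves: the L2 inner product on [0, 1], which on raw coefficient vectors (lowest degree first) reduces to a double sum because the integral of x^(i+j) over [0, 1] is 1/(i+j+1). This is an example, not a recommendation for the package:

```julia
# L2 inner product of two polynomials on [0, 1], from coefficient vectors.
function l2_inner(p::Vector, q::Vector)
    s = 0.0
    for (i, pi) in enumerate(p), (j, qj) in enumerate(q)
        s += pi * qj / (i + j - 1)   # enumerate is 1-based; exponents are i-1, j-1
    end
    s
end
```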
I've just started using Polynomials. Readme.md says that Poly(::Vector) constructs a polynomial lowest order first. Experimenting, I see highest order first. Am I missing something? Happy to edit the Readme for you in a PR if needed.
Sometimes multiplying a polynomial p by another one, p_, may not yield the expected result. Here is a printout of a loop where a polynomial Pj is updated in every iteration. The new Pj does not always reflect the update:
2 -4.047588495693385e-21
2 -154.8971429 + x
2 6.269598936178143e-19
3 6.269598936178143e-19
3 -211.0 + x
3 -1.3228853755335882e-16
4 -1.3228853755335882e-16
4 -230.1282051 + x
4 3.044332370245841e-14
5 3.044332370245841e-14
5 -242.552795 + x
5 -7.384113253121036e-12 + 3.044332370245841e-14x
6 -7.384113253121036e-12 + 3.044332370245841e-14x
6 -257.96875 + x
6 1.904870465766067e-9 - 1.5237539414489605e-11x + 3.044332370245841e-14x^2
Here is the source code for this (c_ = some scalar value, x is a vector of data):
Pj = Poly([c_])
for i in not_j
println(i, " ", Pj)
Pj *= Poly([-x[i], 1])
println(i, " ", Poly([-x[i], 1]))
println(i, " ", Pj)
end
Would it be OK if we removed the Base./(p1::Poly, p2::Poly) overload from the package?
Over the weekend I have tried implementing RationalFunctions.jl in the hope of extending Polynomials to support rational functions. In that package, I am redefining the division operator between Polys. However, Julia issues a warning when doing so, which is not fun. It has, anyway, been deprecated for a while now in Polynomials.
The package has some problems related to the conjugation of Poly objects. In my opinion, conj should not only return a new Poly with conjugated coefficients but also keep track of its input data. The solution proposed in #59 seems to be true only for real inputs.
What I would expect from conj is that conj(a*x) = conj(a)*conj(x). In fact, after obtaining the conj of a Poly, the variable x has also been changed to y = conj(x). What I mean by that is that conj(p(x)) + p(x) should not be allowed for a Poly object p(x) with the current implementation.
Read below for more...
julia> p3 = Poly([1+1im, 1])
Poly((1 + 1im) + x)
julia> p4 = conj(p3)
Poly((1 - 1im) + x)
julia> p3(1)
2 + 1im
julia> p4(1)
2 - 1im
julia> conj(p3(1)) == p4(1)
true
julia> p3(1 + 1im)
2 + 2im
julia> p4(1 + 1im)
2 + 0im
julia> conj(p3(1 + 1im)) == p4(1 + 1im)
false
To me, the variable name together with its conjugation property should fix the variable information of a Poly object. Actually, I have implemented RationalFunctions.jl in this way, and that seems to be working.
The way I implemented RationalFunction objects was to encode both the variable name (in this case, p.var) and its conjugation property (for example, a value type such as Conj{true}) into the type itself. Then, I could rely on the dispatching mechanism of Julia for all mathematical operations without even caring about any if checks to guarantee same-variable-polynomial operations. I am not sure if this is the best practice, but it seems to be working.
I believe this handles evaluation (for example,
(r::RationalFunction{T,Conj{true}}){T}(x::Real) = _funceval(r, x)
) via the dispatching mechanism, for any variable name (:x, :t, :s, :q, etc.).
Anyways, in the current implementation of Polynomials, p.var is hardly ever used except for checking for consistent operations and for display purposes. If the above functionality (i.e., tracking of conjugated variables) is to be implemented without breaking the coefficients' structure (in this case, Vector, or T<:AbstractVector if sparse coefficients are going to be supported), the other viable option is to have a Bool in the type structure. Then, this will bring another if switch into the picture.
I just wanted to share my opinions about the issue, as I am interested in having RationalFunctions with real coefficients and complex inputs in my applications. Right now I seem to have solved it properly in the above-mentioned package. However, if anyone is interested in some brainstorming and/or unifying these representations so that both packages could play well with each other, I am open to discussions.
Apart from this, I also have a couple of concerns regarding the convenient ctranspose for polyder, and the no-op transpose deprecation warning in Julia 0.5.
julia> using Polynomials
julia> p1 = Poly([1,1])
Poly(1 + x)
julia> p2 = Poly([2,2])
Poly(2 + 2⋅x)
julia> A = [p1 p2; p2 p1]
2×2 Array{Polynomials.Poly{Int64},2}:
Poly(1 + x) Poly(2 + 2⋅x)
Poly(2 + 2⋅x) Poly(1 + x)
julia> b = [p1, p2]
2-element Array{Polynomials.Poly{Int64},1}:
Poly(1 + x)
Poly(2 + 2⋅x)
julia> A*b # perfectly fine
2-element Array{Polynomials.Poly{Int64},1}:
Poly(5 + 10⋅x + 5⋅x^2)
Poly(4 + 8⋅x + 4⋅x^2)
julia> b.'*A*b # Warning on no-op `transpose`
1-element Array{Polynomials.Poly{Int64},1}:
Poly(13 + 39⋅x + 39⋅x^2 + 13⋅x^3)
julia> b'*A*b # Different result from above (of course!)
1-element Array{Polynomials.Poly{Int64},1}:
Poly(13 + 26⋅x + 13⋅x^2)
julia> b'*b # MethodError for `dot(p1::Poly, p2::Poly)`
ERROR: MethodError: no method matching dot(::Polynomials.Poly{Int64}, ::Polynomials.Poly{Int64})
julia> A
2×2 Array{Polynomials.Poly{Int64},2}:
Poly(1 + x) Poly(2 + 2⋅x)
Poly(2 + 2⋅x) Poly(1 + x)
julia> A.' # Warning on no-op `transpose`
2×2 Array{Polynomials.Poly{Int64},2}:
Poly(1 + x) Poly(2 + 2⋅x)
Poly(2 + 2⋅x) Poly(1 + x)
julia> A'
2×2 Array{Polynomials.Poly{Int64},2}:
Poly(1) Poly(2)
Poly(2) Poly(1)
Some things worth noting/discussing from the above excerpt:
- ctranspose propagates to matrix operations as polyder,
- transpose(p) = copy(p) would be needed to suppress the no-op transpose warning,
- transpose(p) = p is a faster (more efficient) solution, I know. However, to me, one should always return temporaries from mathematical operations, as the result could be changed easily (and unintentionally) in cases such as p = Poly([1,1]); q = p.'; p[2] = 5 # q is unintentionally changed,
- Polynomials has this problem when taking the derivative (of zeroth order), too. Yet, this is another discussion and maybe a matter of the library writer's taste,
- dot between Poly objects needs to be defined,
- b'*A*b above should be re-considered (b' will have conjugated variables, whereas A*b is unconjugated).
I hope what I have listed above makes sense to you, and I hope I have not consumed a lot of your time.
Looking forward to your opinions.
some ideas for simplified constructors:
Poly{T}(::Type{T}, var::Union(Symbol,Char,String)=:x) = Poly{T}(var)
type Poly
...
Poly(var=:x) = Poly(T[0,1], var)
end
Hello,
during some work I found the results of polyfit unnecessarily inaccurate.
I have found that the least-squares system matrix is constructed via x.^n (which supposedly goes through log/exp pow), and also that the A\y inversion did not work to expectations, compared to other numerical tools.
So I wrote a better version of polyfit; see the attached patch. I also attach one test data file; try
x=readdlm("polyfit_8th_order_test.dat"); p=polyfit(x[:,1], x[:,2], 8); plot(x[:,2]-p(x[:,1])
-- a huge difference between the current (meaningless) result and the patched version. (And yes, the 8th order is physical in this data set ;-))
Best regards,
Marek
@CristianRojas found the following bug:
p = Poly([0,1])
p + 1im
which throws an InexactError.
I will follow with a PR.
Hi!
First, thank you for the Polynomials package, it is amazing!
I am having an issue when I change the variable name. It is not being kept after operations; it always reverts to 'x'. For example:
julia> a = Poly([1, 1], 's')
Poly(1 + s)
julia> b = Poly([1, 2, 3, 4, 5, 6], 's')
Poly(1 + 2s + 3s^2 + 4s^3 + 5s^4 + 6s^5)
julia> c = a+b
Poly(2 + 3x + 3x^2 + 4x^3 + 5x^4 + 6x^5)
julia> d = a*b
Poly(1 + 3x + 5x^2 + 7x^3 + 9x^4 + 11x^5 + 6x^6)
Would it not be better to keep the variable name after operations?
Thanks,
Ronan
Remove the line that says:
#todo: sparse polynomials?
Seriously, @aytekinar and I were thinking of trying to implement support for sparse coefficients. In our forks we will make a branch and incrementally try to implement this.
Hi @Keno. This package could use a little love. Here are some things that seem reasonable:
- renaming / to div
I can add/merge these if they seem like good additions. Some are outside the current scope of the package, but all seem reasonable, to me.
Horner's scheme as it is implemented now gives wrong results.
Current behaviour:
julia> using Polynomials
julia> p1 = Poly([Inf, Inf])
Poly(Inf + Inf⋅x)
julia> p1(Inf)
NaN
julia> p1(-Inf)
NaN
julia> p1(0)
NaN
julia> p2 = Poly([0, Inf])
Poly(Inf⋅x)
julia> p2(-Inf)
NaN
Expected behaviour:
julia> using Polynomials
julia> p1 = Poly([Inf, Inf])
Poly(Inf + Inf⋅x)
julia> p1(Inf)
Inf
julia> p1(-Inf)
NaN
julia> p1(0)
NaN
julia> p2 = Poly([0, Inf])
Poly(Inf⋅x)
julia> p2(-Inf)
-Inf
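One plausible source of the spurious NaNs (an assumption about the implementation, not a verified diagnosis) is a Horner loop whose accumulator starts at zero, since 0 * ±Inf = NaN. Seeding the accumulator with the leading coefficient reproduces the expected behaviour above, sketched here on a raw coefficient vector (lowest degree first):

```julia
# Horner evaluation seeded with the leading coefficient instead of zero,
# so the first multiplication never computes 0 * Inf.
function horner(coeffs::Vector, x)
    isempty(coeffs) && return zero(x)
    b = coeffs[end]
    for i in length(coeffs)-1:-1:1
        b = b * x + coeffs[i]
    end
    b
end
```

With this, [Inf, Inf] evaluated at Inf gives Inf and [0, Inf] at -Inf gives -Inf, while p1(-Inf) and p1(0) still give NaN, matching the expected results above.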