
17 years, 334 days
Wolfram Research

MaplePrimes Activity

These are replies submitted by JonMcLoone

@Carl Love the code occupies the last 9 pages of the PDF document -- after the results charts.

I don't know the details of the arbitrary-precision code, but I am not aware of any special coverage of this case. A quick experiment shows timings going up roughly linearly with precision, suggesting that it is just generic high-precision code.
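The experiment can be sketched outside either system. Here is a rough analogue in Python's decimal module, standing in for generic arbitrary-precision arithmetic (this is purely an illustration of the scaling behaviour, not a claim about Mathematica's internals; the function name is my own):

```python
import time
from decimal import Decimal, getcontext

# Time the same arbitrary-precision operation at increasing precision and
# watch how the cost grows with digit count.
def time_division(digits, reps=200):
    getcontext().prec = digits
    a, b = Decimal(2).sqrt(), Decimal(3).sqrt()
    start = time.perf_counter()
    for _ in range(reps):
        a / b
    return time.perf_counter() - start

for digits in (1000, 2000, 4000):
    print(digits, f"{time_division(digits):.4f}s")
```

If the cost grows roughly in proportion to the digit count, as the timings above tend to, the code is probably doing generic bignum work rather than anything specialized.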

A quick glance at the Advanpix benchmark code shows that they are using Timing rather than AbsoluteTiming (time versus time[real] in Maple), which I consider unpredictable/poorly defined in a multi-core world. One test with Eigenvalues gave the same timing in both cases, but I would want to re-run all of those tests using AbsoluteTiming to make sure.
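The distinction can be illustrated in any language. Here is a sketch in Python (a stand-in, not Wolfram or Maple code), where process_time plays the role of Timing/time (CPU seconds) and perf_counter plays the role of AbsoluteTiming/time[real] (wall clock):

```python
import time

cpu0, wall0 = time.process_time(), time.perf_counter()

time.sleep(0.2)  # waiting consumes wall-clock time but almost no CPU time

cpu = time.process_time() - cpu0    # CPU seconds, summed over all threads
wall = time.perf_counter() - wall0  # elapsed real (wall-clock) time

print(f"CPU:  {cpu:.3f}s")
print(f"Wall: {wall:.3f}s")
# Here wall is ~0.2s while cpu is ~0.  On a multi-core machine the reverse
# also happens: a calculation spread over N threads can report up to N times
# more CPU time than wall time, which is why wall-clock timing is the more
# predictable quantity for benchmarking.
```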

@ecterrab  my point was only that I informed two different Maplesoft employees about a collection of false statements being put out by your company, and nothing has been done to correct them. I do not mean to imply that you wrote or support the statements, but you are in a better position than I to know who is responsible and to pass on the information.

@itsme @acer

Since there is an accusation of dishonesty here against people that I speak for, I feel the need to comment:

First, on the suggestion that a machine was cherry-picked for its inability to multi-thread: the machine in use was set up by our sysadmin group for the Software Testing group for the purpose of running lots of third-party software to verify results. A separate analysis group, who were not involved in the request, used the machine for benchmarking in good faith that it was representative (and because we have to pay for Maple licenses like anyone else, so one is enough!). At some point the machine was upgraded (not at the request of the analysis group; it is not clear to me if it was a new machine or just a memory upgrade) and it remained an assumption that a random Windows machine should be considered typical.

Secondly, I was naturally concerned that the machine might accidentally have had some configuration that Maple does not correctly support, as @acer suggested, so I re-ran the benchmark on my own personal laptop, a 2019 MacBook Pro (not as mainstream as Windows, but certainly more predictable hardware). Of the five functions that @acer said gave similar performance between Mathematica and Maple, two were not in the benchmark (but perhaps should be added), two became much closer, and one maintained its difference. The number of tests where Mathematica was faster fell from 572 to 566 (out of 586). However, overall Maple performed relatively worse on the Mac, with three tests now crashing and having to be scaled back, and the median result showing Mathematica to be 45 times faster, compared to 32 times on Windows. Generally, Maple was now closer on linear algebra with float[8] data and worse on function evaluation and integer linear algebra. The full results are here: Mathematca12.1Maple2020MacDraft.pdf. It should be noted that this was not run under test conditions: it was only a single run of the benchmark, whereas the protocol in the annual update is the average of three runs, and it has not been through the secondary checks that are required for Wolfram communications, so I make this post in my own name, not as an official Wolfram statement.

One cannot easily defend against a charge of incompetence, but the benchmark contains the entire source code for the tests. I would also be happy to share the report-generation code to make analysis of test results easier. I repeat the invitation that I made in that thread: if there are ways that the benchmark could be made fairer, I will ensure that they are considered. I have not received any feedback since then.

For symmetry, it is worth noting that in that 2015 thread, I shared with @ecterrab a number of ways in which Maplesoft's comparison document was factually incorrect, having previously sent them privately to @Samir Khan (for example, it does not take 14 lines of code to generate a simple slider -- the command since 2007 has been Slider[]), and to date no change has been made to that document. As I said at the time, I would have thought that it is in all our interests to be accurate.

Jon McLoone  (Wolfram Research)

@Mariusz Iwaniuk Sorry, I don't have a date, but it is a matter of days to weeks, not months.

@Mariusz Iwaniuk 

FYI, this ceiling has been removed for v 12.1

In[1]:= PrimePi[10^19]

Out[1]= 234057667276344607

@Markiyan Hirnyk 

That is not correct: you get this functionality in Mathematica without additional purchases. The first page you point to lists different purchase programs for Mathematica, not additional tools. The second page lists cheap mobile apps, not Mathematica add-ons. There is nothing that they can do that is not already available inside Mathematica. To access step-by-step solutions in Mathematica, type "==" before the limit request and then click "show steps" in the result. Here is what you see...


To clarify the Mathematica policies...

x.y.z releases are free for x.y.0 users, to deal with undiscovered regressions. The policy was changed to charge the standard one-version upgrade fee for any upgrade where the lead digit differs by no more than 1 (i.e. 9.0 to 10.4 or less), so that users who upgrade periodically do not end up with higher charges because they missed intermediate releases. No prices were changed.

The only customers who are worse off are those that want to stay always up to date but insist on buying upgrades rather than Service. Those people will now always be better off with Service and will be advised as such if they speak to anyone in sales.

I think the only valid criticism in your summary is that we did not pre-announce the policy, and probably should have. I think this was because of concerns about whether we could live up to the new intended schedule. However, we did do a very public dress rehearsal in the form of the 10.0.1, 10.0.2 and 10.0.3 releases, which contained some new features but were given only third-digit release numbers so that they shipped for free.

The motivation for this change was to get finished new technology out, rather than have it sit on the shelf for a year waiting for the next big release. That has become much more common practice in software generally.


I should have been clearer that the "we are a better company..." quote was pulled from the MapleSoft comparison document, rather than being a claim that I was making.

To me it seemed obvious that the basic example for ImageDeconvolve was undoing a convolution. The function is called ImageDeconvolve, not "Enhance" (in the Hollywood style), and the example uses a specific convolution kernel (point spread function). The Issues section spells out "If the PSF does not match the blur, deconvolution fails to enhance details and it may add artifacts:" and "Even a small amount of noise in a blurred image can reduce the quality of reconstruction:". I don't think the example is any more dramatic than the one in InverseRadon, which is clearly undoing a Radon transformation.

I got a "Thank you for your feedback" from my MapleSoft contact.

To clarify my comment about the Wolfram benchmark: the first version was put out and MapleSoft responded with the rebuttal document that is still on their website. It highlighted some issues that needed improvement, so a new version was created that addressed them. There is a "Revision notes" section that details those changes, and further changes that were made in the Mathematica 10/Maple 18 revision.





I think this rather depends on your point of view. I certainly don't feel the respect, and some of it feels rather personal - "we are a better company to work with...we are more flexible, more responsive, and more focused on creating great tools for our customers".

That's fine; MapleSoft is free to choose how to frame us in its marketing, and the 'Wolfram is all marketing and doesn't care about its customers' message may work -- I note the curious assumptions in this recent thread about "business trickery". (I suspect a more likely explanation is that the example for ImageDeconvolve is essentially the inverse of the lead example on the ImageConvolve page, because that function is essentially the inverse of the other.)

I must get back to being inflexible and unresponsive and avoiding making great tools now :-)


I knew I shouldn't have got involved in the more subjective parts! Welcome as I may be, I really don't feel that it is right that I come here to argue the case for Mathematica.

Of course any such document is biased: its purpose is to highlight the strengths of your product, just as it is our job to highlight the strengths of ours. I am fine with that; I came to put on record the factual errors, which it is certainly in our interests to correct and, I would have thought, also in yours (to avoid informed people reading the document as either ill-informed or deceptive).

If there are errors in our documents, please feel free to send them to me -- you will see that all the issues raised in your (still live) response document were addressed immediately in the first revision of the numerical performance benchmark.

@Markiyan Hirnyk 

Most of my writing is on the Mathematica blog...


I understand they are currently in the 'proof reading' queue, but they do not reveal substantial changes.


There were no directly incorrect assertions about Mathematica on Page 9.

However, since you are drawing me in to it, I will dispute just one of the four sections - the statement that "Maple is the uncontested leader for computing symbolic solutions to differential equations".

Firstly, the justification for this has always been results on the Kamke text. But, if I recall correctly, your original work here used Kamke as the motivation (Cheb-Terrab, Duarte and da Mota, 1997). As such it is not entirely a 'representative random sample'. I have heard (though not seen for myself) that tests against other test suites have put the ODE results much closer.

But more importantly, this paragraph is premised on a narrow definition of symbolic ODEs, while Mathematica's DSolve has been extended in other directions to include symbolic solutions to piecewise-defined/discontinuous ODEs, ODEs with events (e.g. a simple bouncing ball), differential-algebraic equations, and delay differential equations. And, of course, if you are interested in differential equations, then numerical solutions are extremely important, and Mathematica has a full FEA PDE solver with 3D mesh generation and refinement.

So while the paragraph does not contain obvious errors, it gives a rather skewed view of differential equation support.

(Such a user might also care about discrete cases where Mathematica's RSolve also goes somewhat further).

Below is a list of factual corrections to the comparison document at

Since my presence here is perhaps unwelcome, I shall not comment on the subjective arguments that it contains.

This list was sent to a senior Maplesoft employee back in July and some of it in March 2014 but so far the document has not been updated.


Jon McLoone - Wolfram Research


Page 3-5. Ignores the TraditionalForm mode in Mathematica. In this mode, all the typeset examples given for Maple are supported in Mathematica input and output (except the use of "=" vs "=="). Statements 1, 2 and 5 on page 4 are false, and single-letter variable names are italicized. TraditionalForm supports all kinds of typesetting that Maple doesn't, such as tan^-1(x) for ArcTan.

Page 6. Mathematica can combine text and results exactly as described (though the UI for doing so is not as simple as Maple).

Page 8. "Slider[]" can be used instead of the 14 lines of old GUIKit code that the document claims is "required". It is actually quite hard to find the deprecated language: Google "Mathematica slider" and you get the simple function in the first hit, or click on Control Objects on the User Interface tile on the help front page. The document states "the only choice is to use the command Manipulate" for making an interactive plot, but that also ignores linguistic input, e.g. type something like "= interactive plot of sin fx"

Page 10. Contrary to the assertions on this page, Mathematica does provide step-by-step solutions thanks to its Wolfram|Alpha integration (e.g. enter "==integrate 1/(1+x^2)" and click the "show steps" button). You can achieve the same "reals only" results shown on this page with one line of Mathematica code (instead of loading the various 'student' packages):
$Assumptions = Element[_, Reals]; $Pre = Simplify;

Page 12. Aside from the weakness of the claim that this is representative of a broad user base, if you go to
you will see an implementation that is structurally identical to the Maple example (but a bit simpler), as well as several other approaches.

Page 14.
Debugging: The Mathematica debugger does allow stepping through code and evaluation of locally scoped variables. The only correct statement in this paragraph is that Mathematica does not allow debugging of built-in routines; it is there to debug user code.

Parallel: This paragraph mixes up automatic multi-threading, user thread control and user process control.
Automatic multi-threading: Mathematica uses it for most numerical calculations. Contrary to this document, this feature is not limited by license.
User-level thread control: Not available in Mathematica, but also barely available in Maple. The small list of thread-safe Maple commands is available at
User process control: The claims of complicated message passing in Mathematica are false, along with the assertions of complexity. Message exchange and synchronization are entirely automated. Eight processes are included with each license increment (not four, as stated in the document).

Writing scripts: This appears to be a reference to a bug that was fixed several versions ago.

Page 15. Not a like-for-like comparison. The document compares fixed-precision arithmetic in Maple with variable precision in Mathematica. Mathematica supports both; Maple supports only fixed. The nearest equivalent to the Maple command "Digits:=20" is
Putting this before the Mathematica code gives a fixed lower bound on precision and returns the same result as Maple.

Also, the statement "There is nothing in the computation to warn the user" is clearly false. You can see Mathematica's standard pink error box in the output in the PDF; mousing over it gives details of the nature of the error.
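The fixed- versus tracked-precision distinction can be sketched outside either system with Python's decimal module (purely a stand-in; nothing below is Maple or Mathematica code): at a fixed working precision, cancellation error appears silently and nothing warns the user.

```python
from decimal import Decimal, getcontext

# Fixed-precision arithmetic, analogous to Maple's Digits := 20:
# every intermediate result is rounded to 20 significant digits.
getcontext().prec = 20
residue = Decimal(1) / Decimal(3) * 3 - 1
print(residue)  # -1E-20, not 0: the rounding error arrives silently

# Raising the precision shrinks the residue but still never warns;
# significance arithmetic, by contrast, tracks how many digits of a
# result remain correct and can flag when none do.
getcontext().prec = 40
print(Decimal(1) / Decimal(3) * 3 - 1)  # -1E-40
```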

Page 16-17. No source code, so can't comment.

Page 18-19. The document compares Maple code generation with the Mathematica syntax-format commands. These are not the same thing. It does reference the Mathematica C code generation, ExportString[Compile[...code..], "C"], but makes claims of limitations (incorrect capitalization, use of non-C commands) that it does not suffer from. The aspiration of the formatter is much more limited, really just changing delimiters and infix operators to simplify copy and paste. Mathematica does not attempt FORTRAN code generation, though there is an add-on, MathCodeF90, which does this and does not suffer from the claimed limitations.

Page 19. Third-party links between Mathematica and MATLAB are available in both directions.

Page 21. The person in this case study switched to Maple from Mathematica 2.x in the mid-1990s. This is not supporting evidence for a current technical comparison with Mathematica 10.2, as implied.


Nearly all data is delivered "on demand" to Mathematica from Wolfram servers, and so does not affect installation size, though some data will be cached locally, if used, to prevent unnecessary repeat downloads.


The installed size is a little under 4GB on Windows, approximately 50% of which is documentation, which can be deleted and read online if disk space is an issue.
