Maplesoft Blog

The Maplesoft blog contains posts coming from the heart of Maplesoft. Find out what is coming next in the world of Maple, and get the best tips and tricks from the Maple experts.

We've reached quite a rhythm with Maple Flow - we update frequently, we add lots of improvements and we move fast.

What does this mean for you? It means that the feedback loop between development, the user experience and course correction has a fast time constant.

Without you being loud and vociferous, the feedback loop breaks. So don't be shy - tell us what you want!

The new 2025.2 update builds on the theme of connectivity with two popular tools - Excel and Python. On top of that, we also have many other features and fixes that you've asked for.


With the 2025.2 update, you can now copy and paste data from Excel into a Flow worksheet.

To be blunt, this type of cross-application copy-paste behaviour is a no-brainer. It's such a natural workflow.

We've increasingly found that Python is now being used to script the interaction and data flow between different engineering tools. With Maple Flow 2025.2, you can now execute Maple Flow worksheets from a Python script.

From Python, you can change and export any parameters and results defined in the worksheet.

This gives me the dopamine hit of watching CPU utilization spike in the Task Manager (hey, I get my kicks where I can).

You can now do your parameter sweeps more quickly by executing the same worksheet in parallel, changing parameters for every run.

This is easy to set up - no special programming is needed.
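As a sketch of what a parallel sweep can look like when driven from Python: the snippet below fans runs out with the standard library's thread pool. Note that `run_flow_worksheet` is a hypothetical stand-in, not the actual Maple Flow API (the real call comes from the Maple Flow 2025.2 documentation); only the fan-out pattern is the point here.

```python
from concurrent.futures import ThreadPoolExecutor

def run_flow_worksheet(beam_length):
    """Hypothetical stand-in for executing a Maple Flow worksheet with
    one parameter changed. Replace the body with the actual worksheet
    call from the Maple Flow 2025.2 Python documentation."""
    deflection = 5.0 * beam_length**3 / 384.0   # placeholder result
    return beam_length, deflection

# Sweep the parameter in parallel: each run is independent, so the
# same worksheet can execute concurrently for every parameter value.
lengths = [2.0, 2.5, 3.0, 3.5, 4.0]
with ThreadPoolExecutor() as pool:
    results = list(pool.map(run_flow_worksheet, lengths))

for length, deflection in results:
    print(f"L = {length} m -> {deflection:.4f}")
```

Because each run only reads its own parameter, no locking or special programming is needed; the executor handles the scheduling.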

  • Print Extents can now be set globally for all sessions, or just for the current session.
  • Any user-installed fonts used in the worksheet are now respected in the PDF export
  • Worksheets execute faster
  • The update includes fixes to many user-reported issues

You can install the Flow 2025.2 update via Help > Check for Updates (or if you're not already in the race, then grab a trial here and take Flow for a spin).

We're not pulling back on this aggressive development velocity, but we need you to point us in the right direction. Let's keep the feedback time constant small!

Mathy

If one of our posts showed up in your social media feed recently, you may have found yourself staring at a giant maple leaf with feet and thinking, “Wait… who (or what) is that?” If so, you’re not alone.

Yes, that big, cheerful leaf you’ve been seeing is very real. 
And yes, they have a name. 

Meet Mathy. 

We officially introduced Mathy to the world a couple of weeks ago at JMM 2026 in Washington, DC, but their story actually started much earlier. 

Mathy was originally created by one of our developers, Marek Krzeminski, a few years ago as a fun internal character. Over time, they quietly became our in-office, local mathscot, popping up as mini 3D-printed Mathys around the office and even as a custom emoji someone created. 

Then, sometime last year, someone had what can only be described as a bold idea: 

What if we brought Mathy to life? 

And just like that, the giant maple leaf went from concept to costume. 

Mathy is fun, curious, and a little playful. That’s very intentional. That’s what math should feel like. 

We believe math matters. We also believe math should be approachable, joyful, and a place where curiosity is rewarded. Mathy reminds us, and hopefully others, that math doesn’t have to be intimidating. It can be fun, and it can inspire awe. 

I’ll be honest. When we decided to bring Mathy to JMM, I was a little nervous. Conferences are busy, serious places. Would people really want to interact with a seven-foot-tall maple leaf? 

As it turns out, yes. Very much yes. 

Researchers (from postdocs to seasoned academics), educators, and undergraduate and graduate students all stopped, smiled, laughed, and asked for photos. At one point, people were actually lining up to take pictures with Mathy.

Let’s just say: Mathy was a hit. 

How tall is Mathy? 
About 7 feet. They are hard to miss. 

What does Mathy love (besides math)? 
Dancing. Very much dancing. 
You can see for yourself here: Mathy's got moves!

Does Mathy talk? 
You bet they do. 

Now that Mathy has officially been introduced to the world, you’ll be seeing them more often on social media, at events, and in a few other fun places we’re cooking up. 

So if you spot a giant maple leaf dancing, waving, or talking math, now you know who they are. 

If you spot Mathy, don’t be shy, say hi. 

 

Many problems in mathematics are easy to define and conceptualize, but take a bit of deeper thinking to actually solve. Check out the Olympiad-style question (from this link) below:

 

Former Maplesoft co-op student Callum Laverance decided to make a document in Maple Learn to break down this innocent-looking problem, using the powerful tools within Maple Learn to show step-by-step how to think about it. As a first step, I recommend playing around with possible values of a and b for inspiration. See how I did this below:


Based on the snippet above, we might guess that a = 0.5 and b = 1.9. The next step is to think of some equations that may be useful to help us actually solve for these values. Since the square has a side length of 4, we know its area must be 4² = 16. Therefore, the Yellow, Green and Red areas must add exactly to 16. That is,


With a bit of calculus and Maple Learn's context panel, we can integrate the function f(x) = ax² from x = -2 to x = 2 and set it equal to 8/3, the Yellow area (one-sixth of the total, since the areas are in ratio 1:2:3). This allows us to solve for the value of a.


We see that a = 1/2. Since the area of the Red section must be three times that of the Yellow (which we determined above to be 8/3), we get Red = (8/3)*3 = 8.

The last step is to find the value of b. In the figure below, we know that the line y = 4 and the curve y = bx² intersect when bx² = 4 (i.e. when x = ±2/sqrt(b)).

 

Since we know the area of the Red section is 8 square units, that must be the difference between the entire area underneath the horizontal line at y = 4 and the curve y = bx² on the interval [-2/sqrt(b), 2/sqrt(b)]. We can then write the area of the Red section as an integral in terms of b and solve for the value of b, since we know the Red area is equal to 8.
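Written out explicitly, the two integrals (with the total area of 16 split in the ratio 1:2:3, so Yellow = 8/3 and Red = 8) are:

```latex
\int_{-2}^{2} a x^{2}\,dx = \frac{16a}{3} = \frac{8}{3}
  \;\Longrightarrow\; a = \frac{1}{2},
\qquad
\int_{-2/\sqrt{b}}^{2/\sqrt{b}} \bigl(4 - b x^{2}\bigr)\,dx
  = \frac{16}{\sqrt{b}} - \frac{16}{3\sqrt{b}}
  = \frac{32}{3\sqrt{b}} = 8
  \;\Longrightarrow\; \sqrt{b} = \frac{4}{3},\; b = \frac{16}{9}.
```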

Voila! Setting a = 1/2 and b = 16/9 ≈ 1.8 guarantees that the ratio of Yellow to Green to Red area within the square is 1:2:3, respectively. Note this is quite close to our original guess of a = 0.5 and b = 1.9. With a bit of algebra and solving a couple of integrals, we were able to solve a mathematics Olympiad problem!
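As a quick numerical sanity check, here is a small sketch using the closed-form areas worked out above (it assumes the curve y = ax² stays inside the square on [-2, 2], which holds for a = 1/2):

```python
def region_areas(a, b):
    """Areas of the three regions of the 4x4 square (x in [-2, 2]),
    taken from the closed-form integrals derived in the post."""
    yellow = 16 * a / 3            # area under y = a*x^2
    red = 32 / (3 * b**0.5)        # area between y = 4 and y = b*x^2
    green = 16 - yellow - red      # whatever remains of the square
    return yellow, green, red

yellow, green, red = region_areas(0.5, 16 / 9)
print(yellow, green, red)          # expect a 1:2:3 ratio
```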

Over the past year, I have spent a lot of time talking to educators, researchers, and engineers about AI. The feeling is almost universal: it is impressive, it is helpful, but you should absolutely not trust it with your math even if it sounds confident.

That tension between how capable AI feels and how accurate it actually is has been on my mind for months. AI is not going away. The challenge now is figuring out how to make it reliable.

That is where Maple MCP comes in.

Maple MCP (Model Context Protocol) connects large language models like ChatGPT, Claude, Cohere, and Perplexity to Maple’s world-class math engine.

When your AI encounters math, it can turn to Maple to handle the computation, so the results are ones you can actually trust.

It is a simple idea, but an important one: Maple does the math and the AI does the talking. Instead of guessing, the AI can be directed to call on Maple whenever accuracy matters.

Model Context Protocol (MCP) is an emerging open standard that allows AI systems to connect to external tools and data sources. It gives language models a structured way to request computations, pass inputs, and receive reliable outputs, rather than trying to predict everything in text form.
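Concretely, MCP messages are JSON-RPC 2.0. A tool invocation from the model to a server looks roughly like the request below; note that the tool name and arguments here are illustrative, not the actual Maple MCP schema:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "evaluate_expression",
    "arguments": { "expression": "int(sin(x) + x/2, x)" }
  }
}
```

The server replies with a structured result payload that the model can quote directly, instead of predicting the answer token by token.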

Here is a high-level view of how MCP fits into the broader ecosystem:

MCP Architecture Diagram

Figure 1. High-level architecture of the Model Context Protocol (MCP)
Source: modelcontextprotocol.io

MCP lets an AI system connect securely to specialized services, like Maple, that provide capabilities the model does not have on its own.

If you want to learn more about the MCP standard, the documentation is a great starting point: Model Context Protocol documentation

Here is a glimpse of what happens when Maple joins the conversation:

Examples of Maple MCP in action

Figure 2. Examples of Maple MCP in action

Depending on the prompt, Maple MCP can evaluate expressions symbolically or numerically, execute Maple code, expand or factor expressions, integrate or solve equations, and even generate interactive visualizations. If you ask for an exploration or an activity, it can create a Maple Learn document with the parameters and sliders already in place.

As an example of how this plays out in practice, I asked Maple MCP:

“I'd like to create an interactive math activity in Maple that allows my students to explore the tangent of a line for the function f(x) = sin(x) + 0.5x for various values of x.”

It generated a complete Maple Learn activity that was ready to use and share. You can open the interactive version here: interactive tangent line activity.

In full disclosure, I did have to go back and forth a bit to get the exact results I wanted, mostly because my prompt wasn’t very specific, but the process was smooth, and I know it will only get better over time.

What is exciting is that this does not replace the LLM; it complements it. The model still explains, reasons, and interacts naturally. Maple simply steps in to do the math—the part AI cannot reliably do on its own.

We have opened the Maple MCP public beta, and I would love for you to try it.

Sign up today and we will send you everything you need to get started!

There is still time to register for Maple Conference 2025, which takes place November 5-7, 2025.

The free registration includes access to three full days of presentations from Maplesoft product directors and developers, two distinguished keynote speakers, contributed talks by Maple users, and opportunities to network with fellow users, researchers, and Maplesoft staff.

The final day of the conference will feature three in-depth workshops presented by the R&D team. You'll get hands-on experience with creating professional documents in Maple, learn how to solve various differential equations more effectively using Maple's numerical solvers, and explore the power of the Maple programming language while solving interesting puzzles.

Access to the workshops is included with the free conference registration.

We hope to see you there!

Kaska Kowalska
Contributed Program Co-chair

Imagine standing 365 metres above Toronto on the CN Tower’s EdgeWalk and throwing a baseball. Could you actually land it on third base at Rogers Centre, about 263 metres away?

Sportsnet raised this question, and we decided to put it to the test in Maple Learn; check out this document to see the answer.


 

 

Also take a look at the Sportsnet video on the problem, to see why the answer may not be obvious.

In the Maple Learn document, you can adjust the initial speed and angle at which to throw the ball and then visualize its trajectory (without having to throw as hard as Addison Barger).

 

I was surprised that even in the simplified projectile motion model, which neglects air resistance, and assuming I could throw at 60 mph (a questionable assumption, to say the least), I wouldn't be able to hit the base myself.

I then used Maple to build a more realistic model that would account for air resistance. The equations below model the position of the ball, where y(0) = h0 is the initial height of 365m and v0 is the initial speed.

 

local h0, m, d, rho, g:
	h0  := 365:     # initial height of the throw (m)
	m   := 0.145:   # mass of the baseball (kg)
	d   := 0.072:   # diameter of the baseball (m)
	rho := 1.225:   # air density (kg/m^3)
	g   := 9.81:    # gravitational acceleration (m/s^2)

	# Equations of motion with quadratic drag opposing the velocity;
	# the group (Pi/16)*d^2*rho/m matches (1/2)*Cd*rho*A/m for a sphere
	# with drag coefficient Cd = 0.5 and cross-section A = Pi*d^2/4.
	local eqns, ics:
	eqns := diff(x(t), t) = u(t),
		    diff(y(t), t) = v(t),
		    diff(u(t), t) = -Pi/16 * d^2 * rho/m * sqrt(u(t)^2 + v(t)^2) * u(t),
		    diff(v(t), t) = -g - Pi/16 * d^2 * rho/m * sqrt(u(t)^2 + v(t)^2) * v(t):
	ics := x(0) = 0, y(0) = h0, u(0) = v_initial*cos(theta_initial), v(0) = v_initial*sin(theta_initial):

	# Solve numerically; listprocedure output returns callable procedures
	# for the position components
	local ans, xpos, ypos:
	ans := dsolve([eqns, ics], numeric, output = listprocedure):
	xpos := subs(ans, x(t));
	ypos := subs(ans, y(t));

 

In the Maple Learn document, you can visualize the difference between the models by comparing the trajectories. The trajectory from the simple model is shown in blue, and the trajectory after accounting for air resistance is modelled in red.

 

 

 

Accounting for air resistance, I’m no longer convinced even Addison Barger could accomplish this challenge.

Check out the Maple Learn document to try for yourself!

 

The full program for Maple Conference 2025 is now available. 

The agenda includes two full days of keynote speakers, presentations from Maplesoft product directors and developers, and contributed talks by Maple users all around the world. There will be opportunities to network with fellow users, researchers, and Maplesoft staff.

The final day of the conference will include three in-depth workshops presented by the R&D team.
The workshops will explore how to:

  • Create papers and reports in Maple
  • Solve various differential equations more effectively using Maple's numerical solvers
  • Solve Advent of Code challenges using Maple

Access to the workshops is included with the free conference registration.

We hope to see you there!

Kaska Kowalska
Program Co-chair

When we think about AI, most of us picture tools like ChatGPT or Gemini. However, the reality is that AI is already built into the tools we use every day, even something as familiar as a web search. And if AI is everywhere, then so are its mistakes.

A Surprising Answer from Google

Recently, I was talking with my colleague Paulina, Senior Architect at Maplesoft, who also manages the team that creates all the Maple Learn content. We were talking about Google’s AI Overview, and I said I liked it because it usually seemed accurate. She disagreed, saying she’d found plenty of errors. Naturally, I asked for an example.

Her suggestion was simple: search “is x + y a polynomial.”

So I did. Here’s what Google’s AI Overview told me:

“No, x + y is not a polynomial”

My reaction? HUH?!

The explanation correctly defined what a polynomial is but still failed to recognize that both x and y each have an implicit exponent of 1. The logic was there, but the conclusion was wrong.

Using It in the Classroom

This makes a great classroom example because it’s quick and engaging. Ask your students first whether x + y is a polynomial, then show them the AI result. The surprise sparks discussion: why does the explanation sound right but end with the wrong conclusion?

In just a few minutes, you’ve not only reviewed a basic concept but also reinforced the habit of questioning answers even when they look authoritative.

Why This Matters

As I said in a previous post, the real issue isn’t the math slip, it’s the habit of accepting answers without questioning them. It’s our responsibility to teach students how to use these tools responsibly, especially as AI use continues to grow. Critical thinking has always mattered, and now it’s essential.

 

On the very first day of class, a student once told math educator Sam Densley: “Your class feels safe.”

Open classroom door with students inside

Honestly, I can’t think of a better compliment for a teacher. I reflected on this in a LinkedIn post, and I want to share those thoughts here too.

A Story of Struggle

I rarely admit this, because it still carries a sting of shame. In my role at Maplesoft, people often assume I was naturally good at math. The truth is, I wasn’t. I had to work hard, and I failed along the way.

In fact, I failed my very first engineering course, Fundamentals of Electrical Engineering. Not once, but twice. The third time, I finally earned an A.

That second failure nearly crushed me. The first time, I told myself I was just adjusting to university life. But failing again, while my friends all passed easily, left me feeling stupid, ashamed, and like I didn’t belong.

When I got the news, I called my father. He left work to meet me, and instead of offering empty reassurances, he did something unexpected: he told me about his own struggles in school, the courses he failed, the moments he nearly gave up. Here was someone I admired, a successful engineer, admitting that he had stumbled too.

In that moment, the weight lifted. I wasn’t dumb. I wasn’t alone.

That experience has stayed with me ever since: the shame, the anxiety, the voice in my head whispering “I’m not cut out for this.” But also the relief of realizing I wasn’t the only one. And that’s why I believe vulnerability is key.

When teachers open up, something powerful happens:

  • Students stop thinking they’re the only ones who feel lost.
  • They see that failure isn’t the end; it’s part of the process.
  • It gives students permission to be honest about their own struggles.

That’s how you chip away at math anxiety and help students believe: “I can do this too.”

Why Vulnerability Matters

Abstract metallic mask with mathematical symbols

I can’t recall a single teacher in my own schooling who openly acknowledged their academic struggles. Why is that?

We tell students that “struggle is normal,” but simply saying the words isn’t enough. Students need to see it in us.

When teachers hide their struggles, students assume they’re the only ones who falter. That’s when math anxiety takes root. But when teachers are vulnerable, the cycle breaks. Students realize that struggle doesn’t mean they’re “bad at math.” It means they’re learning. Vulnerability builds trust, and trust is the foundation of a safe classroom.

What I Hear from Instructors

In my work at Maplesoft, I often hear instructors say: “Students don’t come to office hours — I wish they did.”

And I get it. Sometimes students are too anxious or hesitant to ask for help, even when a teacher makes it clear they’re available. That’s one of the reasons we built the Student Success Platform. It gives instructors a way to see where students are struggling without calling anyone out. Even if students stay silent, their struggles don’t stay invisible.

But tools can only go so far. They can reveal where students need support and even help illuminate concepts in new ways. What they can’t do is replace a teacher. Real learning happens when students feel safe, and that safety comes from trust. Trust isn’t built on flawless lectures or perfect answers. It grows when teachers are willing to be human, willing to admit they’ve struggled too.

That’s when students believe you mean it. And that’s when they’re more likely to walk through the door and ask for help.

The Real Lesson

Ultimately, what matters most in the classroom, whether in mathematics or any other subject, isn’t perfection. It’s effort.

As a new school year begins, it’s worth remembering:

  • Students don’t just need formulas.
  • They need to know struggle is normal.
  • They need to know questions are welcome.
  • They need to know the classroom is safe enough to try.

Because long after they move on, that’s what they’ll remember: not just what they learned, but how they felt.

The need to solve quadratic equations never seems to disappear. Whether it is completing a physics problem, solving a differential equation, or performing equilibrium calculations in chemistry, quadratic equations are an integral part of all STEM-based disciplines.

 

Depending on the complexity of the quadratic equation, the typical 'guess-and-check' method taught in most high school classes can often be frustrating and time-consuming. Professor of mathematics Dr. Po-Shen Loh, in his new method shown here, recognizes some important properties of solutions to quadratic equations and integrates them into a more intuitive approach that is far more likely to motivate students.

 

For example, consider the equation x^2 - 14x + 45 = 0. Most students are taught to first factor this equation by thinking of two numbers that multiply to 45 and add to -14. After trying multiple values, we would discover that those values are -5 and -9. We would use these values to factor the equation into the form (x-5)*(x-9) = 0. Setting each factor equal to zero, we would get x = 5 or x = 9. Equivalently, to solve for x more directly, we need two numbers that multiply to 45 and add to 14 (again, x = 5 and x = 9).

 

The only way to speed up this process of guess-and-check is to do enough similar problems until the guesses become second nature. Not to mention, this becomes exponentially more difficult as the coefficient on x^2 increases (for example, solving the equation 6x^2 + 7x - 20 = 0).

 

For the example above, Dr. Loh's method builds on a simple starting point:

 

(i) We know that the numbers (call them R and S) add to 14

(ii) We know that since the numbers add to 14, they must have a mean value of 14/2 = 7

(iii) If the two numbers have an average of 7, they must be an equal 'distance' (call this distance z) from 7

(iv) We can write the two numbers as R = 7+z and S = 7-z

(v) Since the numbers R and S multiply to 45, then (7+z)*(7-z) = 45 ⇒ 49 - z^2 = 45. In other words, z^2 = 4, so z = +2 or z = -2

(vi) The solution to the equation is then R = 7+2 = 9 and S = 7-2 = 5 (as we predicted)
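The six steps above translate almost line for line into code. The sketch below is my own implementation of the idea (it normalizes by the leading coefficient first, and assumes real roots so that z² ≥ 0):

```python
from fractions import Fraction
import math

def solve_quadratic_loh(a, b, c):
    """Solve a*x^2 + b*x + c = 0 via Dr. Loh's averaging idea:
    the roots sum to -b/a, so each root is mean +/- z, and z comes
    from matching the product of the roots, c/a."""
    mean = Fraction(-b, 2 * a)        # steps (i)-(iii): half the root sum
    product = Fraction(c, a)          # the roots multiply to c/a
    z2 = mean * mean - product        # step (v): (mean+z)(mean-z) = product
    z = math.sqrt(z2)                 # assumes real roots, i.e. z2 >= 0
    return float(mean) - z, float(mean) + z

print(solve_quadratic_loh(1, -14, 45))  # the worked example: roots 5 and 9
print(solve_quadratic_loh(6, 7, -20))   # the "harder" leading-coefficient case
```

Because the normalization happens up front, the awkward 6x² + 7x - 20 = 0 case from earlier needs no extra guessing at all.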

 

We can generalize this idea for any complex coefficients a, b and c in the equation ax^2 + bx + c = 0 to actually prove the quadratic formula. However, using Dr. Loh's method on specific examples (as above) helps build intuition for why the quadratic formula works in the first place. Other proof methods such as completing the square are just as mathematically sound, but they do not utilize the mathematical instinct that makes solving a problem in mathematics so gratifying.
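Carried out symbolically (dividing ax² + bx + c = 0 through by a first), those same steps reproduce the quadratic formula:

```latex
R + S = -\frac{b}{a}
  \;\Rightarrow\; R, S = -\frac{b}{2a} \pm z,
\qquad
RS = \frac{c}{a}
  \;\Rightarrow\; \frac{b^{2}}{4a^{2}} - z^{2} = \frac{c}{a}
  \;\Rightarrow\; z = \frac{\sqrt{b^{2} - 4ac}}{2a},
\qquad
x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}.
```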

 

Although I am currently a student working for Maplesoft, I had not used Maple Learn extensively beforehand. Dr. Loh's idea of creating a more intuitive way to solve such a conventional problem inspired me to create a document in Maple Learn, linked here, outlining the steps above.

 

Learning new ways to solve a problem in mathematics is exciting, but it is often difficult to present in a way that is clear, visually appealing and easy to create. Most online mathematical environments are difficult to navigate and typically lack visualizations to accompany an idea. With Maple Learn, it felt comforting to open a clean canvas where I was able to easily build a document in just a few hours that not only summarized the main ideas of this new method, but also showed the user why the method works using live animations and colour schemes (see some examples below).

 

 

I surprised myself (as well as my managers) by how quickly I was able to transfer all of my ideas into the document. I could also split related content into groups and use collapsible sections to keep the document uncluttered and easy to read.

 

I also took advantage of the freedom to explore other documents and directly reference them through hyperlinks.

 

Sometimes it can be difficult to follow a new concept without having some background information. Adding these references makes it simple for the reader to access supporting documents and ensure there are no knowledge gaps to be filled along the way. Once you make a document, you also have the option to publish it to your own gallery and make it public for others to use and learn from.

 

Maple Learn has been incredibly helpful for sharing the things that interest me the most. If you have something related to mathematics that excites you, try not to keep it to yourself. Consider using Maple Learn to share your ideas with the world and see your vision come to life!

With the launch of ChatGPT 5.0, many people are testing it out and circulating their results. In our “random” Slack channel, where we share anything interesting that crosses our path, Filipe from IT posted one that stood out. He’d come across a simple math problem, double-checked it himself, and confirmed it was real:

ChatGPT 5.0 Example

As you can see, the AI-generated solution walked through clean, logical-looking steps and somehow concluded:

x = –0.21

I have two engineering degrees, and if I hadn’t known there was an error, I might not have spotted it. If I’d been tired, distracted, or rushing, I would have almost certainly missed it because I would have assumed AI could handle something this simple.

Most of us in the MaplePrimes community already understand that AI needs to be used with care. But our students may not always remember, especially at the start of the school year if they’ve already grown used to relying on AI without question. 

And if we’re honest, trusting without double-checking isn’t new. Before AI, plenty of us took shortcuts: splitting up the work, swapping answers, and just assuming they were right. I remember doing it myself in university, sometimes without even thinking twice. The tools might be different now, but that habit of skipping the “are we sure?” step has been around for a long time.

The difference now is that general-purpose AI tools such as ChatGPT have become the first place we turn for almost anything we do. They respond confidently and are often correct, which can lead us to become complacent. We trust them without question. If students develop the habit of doing this, especially while they are still learning, the stakes can be much higher as they carry those habits into work, research, and other areas of their lives.

The example above is making its rounds on social media because it’s memorable. It’s a basic problem, yet the AI still got it wrong and in a way that’s easy to miss if you’re not paying attention.

Using it in the classroom can be a great way to help students remember that AI’s answers need to be checked. It’s not about discouraging them from using AI, but about reinforcing the habit of verifying results and thinking critically about what they see.

So here’s my suggestion:

  • Show this example in your class, no matter the subject. If your students are using AI, they’ll benefit from seeing it.
  • Spend 10 minutes discussing it.
  • Use it as a jumping-off point to talk about what’s OK and not OK when using AI for your course.
  • Share other examples like this throughout the year as small reminders, so “critical thinking” becomes second nature.

This isn’t just about catching an AI’s bad subtraction. It’s about building a culture of verification and reasoning in our students. The tools will keep improving, but so will the temptation to turn off our own thinking.

If we can help students get into the habit of checking, AI can be a powerful partner without putting them on autopilot.

To the MaplePrimes community: How do you talk to your students to help them build strong habits when working with AI? Do you bring in examples like this one, or use other strategies? I’d love it if you could share your thoughts, tips, and ideas.

 

We are pleased to announce that the registration for the Maple Conference 2025 is now open!

Like the last few years, this year’s conference will be a free virtual event. Please visit the conference page for more information on how to register.

This year we are offering a number of new sessions, including more product training options, and an Audience Choice session.
Also included in this year's registration is access to an in-depth Maple workshop day presented by Maplesoft's R&D members following the conference.  You can find an overview of the program on the Sessions page. Those who register before September 14th, 2025 will have a chance to vote for the topics they want to learn more about during the Audience Choice session.

We hope to see you there!

Back in 2017, when the concept of Maple Flow was first proposed at Maplesoft, we developed an aspirational brochure to ignite our creative energy. I still have a few printed copies – here’s one that’s sat behind my monitor.

At that time, the product that did not yet exist was called “Maple Whiteboard” and the brochure described what we had gradually come to appreciate that engineers wanted from a calculation tool:

  • simplicity at its beating heart – just learn a few basic game mechanics, and then everything else “flows” (ahem). 
  • units support from the get-go
  • documentation features to describe the analysis
  • connectivity with other software
  • engineering-focused math functions

The first working version of Maple Whiteboard was crude…but the basic building blocks were in place and the concept worked. This image dates from 2019.

We unveiled Maple Flow to the public in 2021 (coming up with the name was a trial in and of itself). Here’s what it looked like.

The target audience loved the new product—they liked what it could do now and were excited about its future potential. Our initial assumptions had been validated!

Maple Flow has evolved dramatically since the fever dream of the initial brochure and early prototypes. Even though it's much more powerful now, we've made sure it’s still simple to use.

Today, I’m delighted to announce the launch of Maple Flow 2025. This release is a major turning point for the product. You'll see a clean, new interface, faster performance, and more tools for documentation and moving your work from other programs.

Let me touch on my personal highlights.

A new interface headlines the release! It’s clean and simple, with logically ordered buttons in organized groups.

The ribbon is contextual; for example, click on an image, and you’ll see tools for adding shapes and text.

There's always room for improvement and refinement. Let me know what you think!

You can now insert a table of contents into your document. The page numbers automatically update, and headings are hyperlinked – just click and you jump to that part of the worksheet.

Hyperlinks in the table of contents are preserved when you export the worksheet to PDF – that’s an awesome navigation feature when you distribute your work.

This feature gives me a visual dopamine hit every time I use it. Look how easy it is to use!

We've decided to release a tool we’ve been using internally for some time. The Maple Flow Migration Assistant is a free add-on that helps you convert your Mathcad 13, 14 and 15 worksheets to Maple Flow.


You can convert single Mathcad worksheets or point to a folder for bulk conversion. You also get many function translations.

Automatically converting executable code between two different high-level math tools is difficult; some manual reworking is probably needed for anything that’s not simple arithmetic (we documented what the Migration Assistant does here). But if you’ve already decided to make the switch from Mathcad 13, 14 or 15, then the Migration Assistant is a great time saver.

Large worksheets now evaluate faster! These are benchmarks from our internal testing suite.

You can now run Maple Flow worksheets through Excel via a simple function call. You can change parameters and get updated results.

To help you set up that function call, an interface walks you through the process.

You can use this feature to develop a simple spreadsheet reporting dashboard or perform parameter sweeps on your Flow analyses.

Large analysis projects can be difficult to manage. 

  • The results of one worksheet might need to be used in another,
  • there may be equations that are reused everywhere, 
  • or you might need to split your project into small chunks that different people can work on separately

Well, now we’ve made that whole process easier! You can now treat Flow worksheets as “black box” functions that you can call from other Flow worksheets. You can even change parameter values and return updated results.

 

The AI Formula Assistant made its debut in Maple 2025 and it sparked a lot of interest (and some interesting conversations about the future of AI in math software). 

By popular demand, we've brought this feature into Flow. You can now look up an engineering formula with a simple natural language query.


 

That's enough of my personal highlights. If you want to know more, visit the What's New pages for a complete rundown and grab a trial.

If you haven’t tried Maple Flow yet, now is the right time to jump in. We have several time-limited launch offers to make the transition to Flow as frictionless as possible; these include offers for users who are

  • deploying a small suite of licenses
  • switching from other tools
  • in large organizations that need a full implementation plan.

As ever, we can only keep Maple Flow on track if you let me know what you want - send all your feedback my way.

We are excited to announce that the Maple Conference will be held November 5-7, 2025!

Please join us at this free virtual event as it will be an excellent opportunity to meet other members of the Maple community, get the latest news about our products, and hear from the experts about the challenges and opportunities that our technology brings to teaching, learning, and research. More importantly, it's a chance for you to share the work you've been doing with Maple and Maple Learn. 

The Call for Participation is now open. We are inviting submissions of presentation proposals on a range of topics related to Maple, including Maple in education, algorithms and software, and applications. We also encourage submission of proposals related to Maple Learn. 

You can find more information about the themes of the conference and how to submit a presentation proposal at the Call for Participation page. Applications are due July 25, 2025.

After the conference, all accepted presenters and invited speakers will be invited to submit related content to the Maple Transactions journal for consideration.

Registration for attending the conference will open in July.  Watch for further announcements in the coming weeks.

We hope all of you in the Maple Primes community will join us for this event!

Kaska Kowalska
Contributed Program Co-Chair

Last week, DeepMind announced their new AlphaEvolve software along with a number of new results improving on known constructions for optimization problems in several areas. One of those results is a new, smaller tensor decomposition for multiplying two 4 × 4 matrices in 48 scalar multiplications, improving on the previous best decomposition, which needed 49 multiplications (Strassen 1969). While the DeepMind paper was mostly careful about stating this, when writing the introduction the authors mistakenly/confusingly wrote:

For 56 years, designing an algorithm with rank less than 49 over any field with characteristic 0 was an open problem. AlphaEvolve is the first method to find a rank-48 algorithm to multiply two 4 × 4 complex-valued matrices

This gives the impression that the new result is the way to multiply 4 × 4 matrices over a field using the fewest field multiplications. However, it's not even the case that Strassen's 1969 paper gave the fastest known way to multiply 4 × 4 matrices. In fact, Winograd's 1968 paper on computing inner products faster can be applied to 4 × 4 matrix multiplication to get a formula using only 48 multiplications. Winograd uses the identity

    a1*b1 + a2*b2 = (a1 + b2)*(a2 + b1) - a1*a2 - b1*b2

which changes two multiplications of a*b terms into one multiplication of a*b terms plus two multiplications involving only a's or only b's. The latter can be precomputed once and reused when calculating all the inner products in a matrix multiplication:

    # i = 1..4: 4*2 = 8 multiplications
    p[i] := -A[i, 1]*A[i, 2] - A[i, 3]*A[i, 4];
    # j = 1..4: 4*2 = 8 multiplications
    q[j] := -B[1, j]*B[2, j] - B[3, j]*B[4, j];
    # i = 1..4, j = 1..4: 4*4*2 = 32 multiplications
    C[i, j] := p[i] + q[j]
            + (A[i, 1] + B[2, j])*(A[i, 2] + B[1, j])
            + (A[i, 3] + B[4, j])*(A[i, 4] + B[3, j]);

It is simple to verify that C[i,j] = A[i,..] . B[..,j] and that the above formula calculates all of the entries of C=A.B with 8+8+32=48 multiplications.
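The verification can also be done mechanically. Here is a short Python sketch (my own illustration, not code from the post; plain nested lists stand in for Maple matrices) that implements the formula above, counts the multiplications as it goes, and compares the result against the ordinary inner-product method:

```python
import random

def winograd_4x4(A, B):
    """Multiply 4x4 matrices with Winograd's 48-multiplication formula.

    Valid over commutative rings only: the derivation trades a*b
    products for products of a's with a's and b's with b's.
    """
    muls = 0
    # Row precomputations p[i]: 4 * 2 = 8 multiplications.
    p = [-A[i][0]*A[i][1] - A[i][2]*A[i][3] for i in range(4)]
    muls += 8
    # Column precomputations q[j]: 4 * 2 = 8 multiplications.
    q = [-B[0][j]*B[1][j] - B[2][j]*B[3][j] for j in range(4)]
    muls += 8
    C = [[0]*4 for _ in range(4)]
    for i in range(4):
        for j in range(4):
            # 2 multiplications per entry: 4 * 4 * 2 = 32.
            C[i][j] = (p[i] + q[j]
                       + (A[i][0] + B[1][j]) * (A[i][1] + B[0][j])
                       + (A[i][2] + B[3][j]) * (A[i][3] + B[2][j]))
            muls += 2
    return C, muls

def naive_4x4(A, B):
    # Standard inner-product multiplication (64 multiplications).
    return [[sum(A[i][k]*B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

A = [[random.randint(-9, 9) for _ in range(4)] for _ in range(4)]
B = [[random.randint(-9, 9) for _ in range(4)] for _ in range(4)]
C, muls = winograd_4x4(A, B)
assert C == naive_4x4(A, B) and muls == 48
```

Running this on random integer matrices confirms both the product and the 8 + 8 + 32 = 48 count.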

So, if Winograd achieved a formula with 48 multiplications in 1968, why is the DeepMind result still interesting? Well, the relative shortcoming of Winograd's method is that it only works if the entries of A and B commute, while tensor decomposition formulas for matrix multiplication work even over noncommutative rings. That may sound esoteric, but everyone's favorite noncommutative ring is the ring of n × n matrices. So if a formula applies to a matrix of matrices, it gives a recursive formula for matrix multiplication, and tensor decomposition formulas do just that.

The original 1969 Strassen result gave a tensor decomposition of 2 × 2 matrix multiplication using only 7 scalar multiplications (rather than the 8 required by the inner-product method), which leads to a recursive matrix multiplication algorithm using O(N^log[2](7)) scalar multiplications. Since then, it has been proved that 7 is the smallest rank of any tensor decomposition in the 2 × 2 case, so researchers have been interested in the 3 × 3 and larger cases. Alexandre Sedoglavic at the University of Lille catalogs the best known tensor decompositions at https://fmm.univ-lille.fr/ with links to a Maple file containing an evaluation formula for each.

The previous best 4 × 4 tensor decomposition came from using the 2 × 2 Strassen decomposition recursively (a 2 × 2 matrix of 2 × 2 matrices), which takes 7 2 × 2 matrix multiplications, each requiring 7 scalar multiplications, for a total of 49. The new DeepMind result reduces that to 48 scalar multiplications, which leads to a recursive algorithm using O(N^log[4](48)) scalar multiplications: O(N^2.7925) vs. O(N^2.8074) for Strassen. This is a theoretical improvement over Strassen, but as of 2024 the best known multiplication algorithm has much lower complexity: O(N^2.3713) (see https://arxiv.org/abs/2404.16349). Now, there might be some chance that the DeepMind result could be used in practical implementations, but its reduction in the number of multiplications comes at the cost of many more additions. Before doing any optimizations, I counted 1264 additions in the DeepMind tensor, compared to 318 in the unoptimized 4 × 4 Strassen tensor (which can be optimized to 12*7 + 4*12 = 132). Finally, the DeepMind tensor decomposition involves 1/2 and sqrt(-1), so it cannot be used over fields of characteristic 2, or fields without sqrt(-1). Of course, the restriction at characteristic 2 is not a big deal, since the DeepMind team already found a 4 × 4 tensor decomposition of rank 47 over GF(2) in 2023 (see https://github.com/google-deepmind/alphatensor).
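The exponents quoted above fall out of the recursion directly: a rank-r decomposition of k × k multiplication yields an O(N^log[k](r)) algorithm. A quick sanity check with Python's math module (an illustration, not part of the post):

```python
import math

def exponent(k, r):
    # Cost exponent of the recursive algorithm from a rank-r
    # decomposition of k x k matrix multiplication: log base k of r.
    return math.log(r, k)

print(round(exponent(2, 7), 4))   # Strassen: 2 x 2 in 7 multiplications
print(round(exponent(4, 48), 4))  # DeepMind: 4 x 4 in 48 multiplications
print(round(exponent(4, 49), 4))  # Strassen used recursively: 49
```

Note that exponent(4, 49) equals exponent(2, 7), since 49 = 7^2 and 4 = 2^2; recursing on the 4 × 4 scheme built from Strassen buys nothing over Strassen itself.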

Now, if one is only interested in 4 × 4 matrices over a commutative ring, the Winograd result isn't even the best possible. In 1970, Waksman (https://ieeexplore.ieee.org/document/1671519) demonstrated an improvement of Winograd's method that uses only 46 multiplications, as long as the ring allows division by 2. Waksman's method has since been improved by Rosowski to remove the divisions (see https://arxiv.org/abs/1904.07683), yielding a nice compact straight-line program that computes C = A · B in 46 multiplications.

Attached are five Maple worksheets that build the explicit formulas for each of the 4 × 4 matrix multiplication methods mentioned in this post, verify their operation counts, and check that they are correct.

Strassen_444.mw  Winograd.mw  DM_444.mw Waksman.mw Rosowski.mw

If you are interested in some of the 2022 results, I posted some code to my GitHub account for creating formulas from tensor decompositions, and verifying them on symbolic matrix multiplications: https://github.com/johnpmay/MapleSnippets/tree/main/FastMatrixMultiplication

The clear follow-up question to all of this is: does Maple use any of these fast multiplication schemes? The answer is mostly hidden in low-level BLAS code, but generally it is: no. The straightforward inner-product scheme for multiplying matrices optimizes memory access, so it usually ends up being the fastest choice in practice.
