C_R

MaplePrimes Activity


These are replies submitted by C_R

@janhardo 

Expr := Int(1 / (sqrt(-alpha*l^2 + 1) * %sqrt(-alpha*l^2 * (x + 1)/2 + 1)), alpha = 0 .. z);
Expr = value(Expr);

All by hand... not very practical for larger expressions.
Thank you

@janhardo 

Yes, Il-subs is the normal form I would like to achieve.

So far I could only achieve it by hand, making sqrt inert to prevent automatic simplification.

@nm 

I was about to ask if I am the only one in need of this functionality. Since MMA has it, there must be a need.
Thanks for checking

@janhardo

The first one, in this particular example. It is closer to the normalized form of the first incomplete elliptic integral.

Only for information:
Ideally, after further manipulation, I would like to get the following displayed:

However, automatic simplification kicks in and prevents this by automatically factoring out the 2.

@Kitonum 

That looks promising! A magic wand command (which I was hoping for but apparently does not exist) could apply the above applyop command over all subexpressions, provided that a listlist of all branches of subexpressions of the "main expression" (the equation in my example) existed.
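
Something along these lines is roughly what I imagine such a command doing (a sketch only; it assumes all square roots appear as explicit ^(1/2) powers, and pull_content is a hypothetical helper, not an existing command):

# Sketch: brush over an expression and split the content out of every square root
pull_content := e -> evalindets(e, 'anything^(1/2)',
    r -> sqrt(content(op(1, r)))*sqrt(primpart(op(1, r)))):
# pull_content(expr) would then visit every radicand in expr, including inside an equation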

@Kitonum 

I was not aware of these commands. Reading the top-level description of the help page, I would not have guessed that they also work on square roots of polynomials. Unfortunately, they are only effective on such objects.

In my original post I was looking for a command that "brushes" over a larger expression and does the extraction on every occurrence of a square root that allows for such an operation. I have added an example in which the commands you provided are not effective on either side of an equation.

I have tried to make a function out of your commands and somehow "map" it over all the subexpressions. This did not work well, but maybe someone can provide a high-level command that does it. That is where I stopped, with partial success on the rhs:

extract_prim := x -> content(x)*primpart(x);                     # split content from primitive part
applyop(extract_prim, {seq(1 .. nops(rhs(expr)))}, rhs(expr));   # apply it to every operand of the rhs

Too complicated IMO.

Anyway, thank you for the answer. I still hope for something better but I have doubts now. 

@sand15 @nm

It seems that the convention Maple uses (sin(phi) instead of the amplitude phi as the first argument of EllipticE) is the troublemaker when doing the integration. Without allsolutions, a wrapped representation of the integral is produced, which is not what we expect for this integral. It is unfortunate that allsolutions is needed to get the expected solution.

plot(int((1 + sin(x)^2)^(1/2), x = 0 .. x, allsolutions),x=0..10);
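
For reference, the call without allsolutions that produces the wrapped representation discussed above:

int((1 + sin(x)^2)^(1/2), x = 0 .. x);   # wrapped EllipticE form, valid only on a restricted domain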

Also, the piecewise expression behind the plot is very complicated. There might be other advantages of this convention, but here I do not see any. Who knows...
I am wondering whether some of the struggles I have would not happen on other CAS platforms.
IMO, Maple should not have returned the wrapped expression in the first place, or it should at least give the unprepared user a hint about the restricted domain/wrapped result (the definition section of FunctionAdvisor(EllipticE) is not helpful in this respect) or about the use of allsolutions.

@sand15 

X0, Y0, X and Y are column vectors.

The first error message is generated in Step 1 here:

Download Compare_outputs_2015_your_version_reply.mw

@sand15 

I did not know about GPE.
Interesting approach in general. It seems to allow tuning the fit fidelity in regions of interest.

In your comments you use L as a synonym for lambda. Correct?

I am impressed by how little information is required for such a good fit. Hard to believe.

Could you have a look at the attachment? Something is not working in the proc KrigingModel.

Download Iterative_Gaussian_Process_Emulator_2025.mw

@nm

I did not see your post. I just discovered that Windows had restarted my computer. When I now execute this help command

help("ExcelTools:-Export")

and click on an error message

the help page for the error message opens.

That is strange. Since my IP address has not changed, I expected the firewall to block again. I suspected my IP address of being part of the problem and wanted to try accessing the help page via VPN.

@dharr @sand15

Noise-free

I am impressed by the fidelity of the rational fit. It's much better than the modulated trig function without adding too many extra parameters. In the meantime, I have found a symbolic regression tool (TuringBot). I could run it in demo mode with 50 data points and a custom model y=f1(x)+cos(f2(x)). I am getting close to sand15's attempts.

The tool tells me that it has tested close to a billion "formulas" (models satisfying the custom model). Even in its demo mode the tool is interesting since it provides many of the options you have mentioned in your discussion. The "Search Metric" Mean Error seems to work better on this data set. While writing, I noticed that the tool has now stopped optimizing the model above (blue curve) and found something better:
 


So far (in its default mode) the tool has not come up with a rational model. Instead, a pretty good fit was achieved involving special functions: simple in terms of length but complicated in terms of the functions involved. That's not what I want. However, the tool is at least helpful for me (probably not for you) in exploring classes of models and finding initial conditions for subsequent optimization.

The data set:

The data set I have added to my original post describes, as an example, the error of an approximation to an exact solution of a non-linear differential equation.

My intention is to improve the approximation by an error compensating term. The approximation and the error compensation term(s) should be short and simple enough to be copied by hand. Processing by simple processors that do not come with sophisticated libraries should be possible in an efficient way.

From your comments I have learned that noise of an (unknown) process plays an important role in the selection of model optimization methods. With that knowledge I would have written my original post differently, mentioning that the data set is noise-free.

I also realized that only providing a data set is (in some instances) insufficient input for the selection of a model and of optimization methods. For instance, there could be an additional constraint of "convergence to a linearized solution" at some data points. This means that a fit should reproduce the value and/or slope at selected data points. For the exemplary data set this would mean no residual at x=0. Fitting to the solution of the non-linear DE rather than to the error additionally makes matching the slope a desirable constraint.
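
As an illustration of what I mean (a sketch only; the quadratic model, the data, and the parameter names are placeholders, not my actual data set):

# Sketch: least-squares fit with the "no residual at x = 0" condition as an equality constraint
with(Optimization):
X := [0., 0.5, 1.0, 1.5, 2.0]:  Y := [0., 0.11, 0.38, 0.71, 0.95]:   # placeholder data
model := c0 + c1*x + c2*x^2:                                         # placeholder model
obj := add((eval(model, x = X[i]) - Y[i])^2, i = 1 .. nops(X)):      # sum of squared residuals
sol := NLPSolve(obj, {eval(model, x = 0) = 0});                      # constrain the value at x = 0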

It looks like I have to bake all the extra constraints into the models before I can start optimizing and profiting from the code and the background you have provided.

I still think that your discussion and insights merit a larger audience. It was a pleasure studying it.

Thank you again for all your time!

@sand15 

A big thanks for your time. I will study this (including your last point) in depth over the weekend.

@sand15 

I had time to study your code. Great stuff, really!

Your code contains two essential elements required to automate model finding: generating models from a model type and determining residuals. I assume that dedicated symbolic regression software packages allow for further user input, such as (a sketch of how these ingredients could fit together follows the list):

- complexity (length of the model, leaf count, ...)

- accuracy (allowed error)

- objective function for the residuals (Maple's Optimization package has an option for that)

- model type selector
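
For illustration, a minimal sketch of such a loop (the data, the template list, and the complexity penalty are placeholders, not part of your code):

# Sketch: fit a list of candidate templates and rank them by residual plus a length penalty
with(Statistics):
X := Vector([0., 1., 2., 3., 4.]):  Y := Vector([0., 0.9, 1.7, 2.2, 2.4]):   # placeholder data
templates := [a*x + b, a*sqrt(x + 1) + b, a*ln(x + 1) + b]:                  # model type selector
score := proc(tmpl)
    local fit;
    fit := NonlinearFit(tmpl, X, Y, x);                                      # determine residuals
    add((eval(fit, x = X[i]) - Y[i])^2, i = 1 .. 5) + 0.01*length(tmpl);     # accuracy + complexity
end proc:
best := sort(templates, (s, t) -> score(s) < score(t))[1];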

Please make the reply above an answer so that I can vote it up.

Please also consider turning the reply with your last comments into a post so that more users can read it. I think symbolic regression is of general interest, and Maple already has many functions on board that a dedicated command or application would require.

@sand15 

Yes, I am very interested. Nothing I have tried so far was satisfying.

@dharr 

My requirement: An algorithm (not me) that searches for a model that fits a given data set best.

My best guess for the dataset above is a periodic function with a modulated argument. With that guess I could establish a few models and then try non-linear regression on them and compare the fidelity of the fits.

Symbolic regression is supposed to do this for me (including even models I do not imagine).
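
For instance, one hand-made candidate of that type could be fitted like this (a sketch only; the data are synthetic placeholders and the modulation A*cos(omega*x + c*x^2) + B is just an assumed example, not the model I expect):

# Sketch: non-linear regression on one "periodic function with modulated argument" candidate
with(Statistics):
X := Vector([seq(0.5*k, k = 0 .. 5)]):                                                     # placeholder abscissae
Y := Vector([seq(evalf(0.5*cos(1.2*t + 0.1*t^2) + 0.05), t = [seq(0.5*k, k = 0 .. 5)])]):  # synthetic placeholder data
model := A*cos(omega*x + c*x^2) + B:                                                       # assumed modulated argument
fit := NonlinearFit(model, X, Y, x, initialvalues = [A = 1, omega = 1, c = 0.1, B = 0]);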
