mmcdara

7891 Reputation

22 Badges

9 years, 55 days

MaplePrimes Activity


These are replies submitted by mmcdara

@Josolumoh 

 

There are different ways. The one I prefer is ExportMatrix (look at the corresponding help pages).
You will be able to export your matrix in CSV format and will have no difficulty reading it in R (I regularly use R, and I know Maple has some facilities to "communicate" with R [see CodeGeneration --> R], but I have never tested them).
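A minimal sketch of the export (the matrix `M` and the file name are placeholders, not from the original question):

```maple
M := Matrix([[1.0, 2.0], [3.0, 4.0]]):
# write M as comma-separated values; target = csv selects the CSV format
ExportMatrix("mydata.csv", M, target = csv);
```

On the R side the file can then be read with read.csv("mydata.csv", header = FALSE).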

 

Circles or cycles?

For cycles: in the GraphTheory package, CycleBasis returns a basis of cycles (generally non-unique), and its cardinality (the cyclomatic number) could be what you're looking for.
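A quick illustration on a small graph (the graph is made up for the example):

```maple
with(GraphTheory):
# 4 vertices and 5 edges: cyclomatic number = 5 - 4 + 1 = 2
G  := Graph({{1, 2}, {2, 3}, {3, 1}, {3, 4}, {4, 1}}):
CB := CycleBasis(G);
nops(CB);  # cardinality of the basis = cyclomatic number
```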

@acer 

Indeed, I get the same single result writing u > 0.
Strange.
 

My version:
Standard Worksheet Interface, Maple 2015.2, Mac OS X, December 21 2015 Build ID 1097895

Thank you acer.

@Josolumoh 

Finally I can give you the solution earlier than expected (sand15 at the office -->  mmcdara at home) 
 


 

restart

with(Statistics):

p := (b/2) / (1+d*x+b/2):
A := (x+r-1)!/(r-1)!/x! * p^r * (1-p)^x:

B := unapply(simplify(A / (1+d*x)), [r, b, d]);

# output: B = (r, b, d) -> GAMMA(x+r)*(b/(2*d*x+b+2))^r*2^x*((d*x+1)/(2*d*x+b+2))^x
#                          / (GAMMA(r)*GAMMA(x+1)*(d*x+1))

beta := Array(0..2, [0.25, 0.5, 3.2]);
N    := 10:

X__0 := 1:
X__1 := RandomVariable(Binomial(N, 4/10));
X__2 := RandomVariable(Normal(0, 1));


X__B := exp( add(beta[k] *~ X__||k, k=0..2) ) ;

# outputs (abridged): beta = Array(0..2, [.25, .5, 3.2]);
# X__1 and X__2 display as the random variables _R and _R0;
# X__B = exp(.25 + .5*_R + 3.2*_R0)

Sample_b := Sample(X__B, N);

# Sample_b (10 values): 13.537, 4.861, 81.641, 30.392, 2.453, 197.347, 2.168, 0.553, 0.675, 2.474

SUBS := n -> [r=3, b=Sample_b[n], d=2.5];


Xmax       := 100:  # assumed to be large enough; should be verified
NperB      := 25:
U          := RandomVariable(Uniform(0, 1)):
AllSamples := NULL:

for n from 1 to N do
   PointwiseB     := [ seq([x, B(rhs~(SUBS(n))[])], x in [$0..Xmax]) ]:
   ShouldBe1      := add(op~(2, PointwiseB)):
   PointwiseB     := [ seq([x, B(rhs~(SUBS(n))[]) / ShouldBe1], x in [$0..Xmax]) ]:
   c              := CumulativeSum(op~(2, PointwiseB));
   ApproximateCDF := zip((u, v) -> [u[1], v], PointwiseB, convert(c, list)):
   AllSamples     := AllSamples , map(u -> ListTools:-BinaryPlace(op~(2, ApproximateCDF), u), Sample(U, NperB));
end do:

AllSamples := < AllSamples >;

# output: AllSamples is a 10 x 25 Matrix (one row of NperB samples per value of b)

# if you want to generate samples in the range 0..UpperQuantile (here 10)

Xmax       := 100:  # assumed to be large enough; should be verified
NperB      := 25:
U          := P -> RandomVariable(Uniform(0, P)):
AllSamples := NULL:

UpperQuantile := 10:
for n from 1 to N do
   PointwiseB     := [ seq([x, B(rhs~(SUBS(n))[])], x in [$0..Xmax]) ]:
   ShouldBe1      := add(op~(2, PointwiseB)):
   PointwiseB     := [ seq([x, B(rhs~(SUBS(n))[]) / ShouldBe1], x in [$0..Xmax]) ]:
   c              := CumulativeSum(op~(2, PointwiseB));
   ApproximateCDF := zip((u, v) -> [u[1], v], PointwiseB, convert(c, list)):
   UpperProba     := op~(2, ApproximateCDF)[UpperQuantile];
   AllSamples     := AllSamples , map(u -> ListTools:-BinaryPlace(op~(2, ApproximateCDF), u), Sample(U(UpperProba), NperB));
end do:

AllSamples := < AllSamples >;

# output: AllSamples is again a 10 x 25 Matrix

 


 

Download B_Distribution_with_Random_b.mw



PS: add this at the end of the code to get a picture of the PDFs


for n from 1 to N do
   PointwiseB  := [ seq([x, B(rhs~(SUBS(n))[])], x in [$0..Xmax]) ]:
   ShouldBe1   := add(op~(2, PointwiseB)):
   pl||n       := plot(B(rhs~(SUBS(n))[]) / ShouldBe1, x=0..5);
end do:
plots:-display(seq(pl||n, n=1..N));

 

@Wavish 


Fitting raises two questions: "what is the technical way to do the fitting", but before all "what do I want to fit".
As long as one cannot describe the "thing to fit, or to fit to" otherwise than "by eye", the problem will remain unsolvable (it's not even a problem).

Your line is evidently not a regression line, except if the points above it have very high weights. So, first question: "do all your points have the same weights?"
Second question: your points have different colors, "what do they mean?"

Maybe you face a linear classification problem, for there are no points below the line that have the type (color and shape) of the points above.
If so, let me know or, if you use Maple 2018, try DNNClassifier.



 

@Christopher2222 


To complete my previous answer and show that there is no bug, just a wrong use of LinearFit:

Download LinearFit.mw
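For reference, a typical correct call (the data here are made up for illustration): LinearFit expects the model basis, the data, and the independent variable, in that order, and the basis must use that same variable.

```maple
with(Statistics):
X := Vector([1, 2, 3, 4, 5], datatype = float):
Y := Vector([2.1, 3.9, 6.2, 8.0, 9.9], datatype = float):
# fit Y = a + b*x: the basis [1, x] and the final argument x must be consistent
LinearFit([1, x], X, Y, x);
```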

@Jalale 

Animation is very time consuming: dropping P balls on an N-layer Galton board requires about N*P frames with my code.
So avoid trying to simulate a large number of balls to obtain a "beautiful" bell-shaped curve at the end.

To complete my previous answer, here is the file which contains my first work on your problem.
There is no animation, but a pretty final result with 10^4 balls dropped can be obtained in a few seconds.

Maybe it will help you to customize your own "Galton Board"?


 

restart:

see for instance https://en.wikipedia.org/wiki/Random_walk

with(Statistics):
with(plots):
with(plottools):

LT := 10:
T  := transform((x,y) -> [y, -x]):
GaltonBoard := [seq(seq([i, j], j in [seq(-i..i, 2)]), i=0..LT)]:
p := plot(GaltonBoard, style=point, symbol=solidcircle, symbolsize=20, scaling=constrained, gridlines=true, color=blue):
display(T(p))

 

# Discrete model of the true trajectory

R := rand(0 .. 1);
trajectory := proc(N)
   local depart, t, n:
   depart := [0, 0];
   t      := depart:
   for n from 1 to N do
      depart := depart +~ [1, (2*R()-1)];
      t      := t, depart
   end do:   
   return [t]
end proc:


t:= plot([seq(trajectory(LT), k=1..3)], color=[green, gold, red], thickness=[3$3]):
display(T(p), T(t))

 

# 10^4 trajectories

tt := NULL:
NT := 10^4:

LT          := 25:
GaltonBoard := [seq(seq([i, j], j in [seq(-i..i, 2)]), i=0..LT)]:
p           := plot(GaltonBoard, style=point, symbol=solidcircle, symbolsize=10, scaling=constrained, gridlines=true, color=blue):

for nt from 1 to NT do
   tt := tt, trajectory(LT)
end do:
tt := [tt]:

H := map(n -> tt[n][-1][2], [$1..NT]):
B := RandomVariable(Normal((Mean, StandardDeviation)(H)));

Kurtosis(Normal(mean, std));
Kurtosis(H);

# outputs: B displays as the random variable _R0;
# Kurtosis(Normal(mean, std)) = 3 (exact, for any normal distribution);
# Kurtosis(H) = 2.988..., close to 3 as expected

HH    := Histogram(H, discrete, thickness=5, frequencyscale=absolute):
MaxH  := max(rhs~(Tally(H))):
alpha := MaxH / eval(PDF(B, y), y=0):
P     := plot(alpha*PDF(B, y), y=min(H)..max(H), color=red):
TH    := transform((x,y) -> [x, y*(LT/MaxH)-(2.2*LT)]):
display(T(p), TH(HH), TH(P))

 

 


 

Download Galton-unanimated.mw

@Carl Love 

 

And so was I

@acer 

This way of proceeding seemed strange to me at first sight but, indeed, there are several arguments in favour of it.

@acer 

"There's not much benefit from rtable_elems returning (or lprint displaying) the indices that take on the default value.": that's a point of view.
And why not after all? In fact this is the same attitude as that adopted when indices(SomeMAtrix) does not return the indices of zeros
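A small sketch of that analogy (the exact display may vary between Maple versions): with sparse storage, only the explicitly stored entries are reported, not the default zeros.

```maple
M := Matrix(3, 3, {(1, 2) = 5, (3, 3) = -1}, storage = sparse):
# only the non-default (stored) entries appear; the zero entries are omitted
rtable_elems(M);
```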

Thank you, acer.

@acer 

I read among the replies to a previous question this same kind of remark concerning the memory used (I believe it was Carl replying to Jalale?).

You evoke the property of a command being thread-safe: I'm very interested in this point. There still remain things I do not really understand about this "thread-safe" property.
But I do not want to deviate from my initial question, and these concerns about  TT will be the subject of a dedicated question.

Thanks for your reply

@acer 

Right, I did the test and it was not very conclusive regarding the execution time.
I do not understand why CodeTools:-Usage sometimes returns very different results: running the CodeTools:-Usage(cot~(Sample(U, N))) command of my initial post again then returned a cpu time of 75 ms! (meaning Carl's tip would not be a big improvement)

But regarding the memory used, the inplace option Carl suggested halves it, the memory becoming practically the same as that used by the "natural" Statistics:-Sample method.
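As a side note on the timing variability, CodeTools:-Usage can average over several runs with its iterations option, which makes such comparisons less noisy (a sketch, with U and N as in the initial post):

```maple
with(Statistics):
U := RandomVariable(Uniform(0, 1)):
N := 10^6:
# report cpu time, real time and memory averaged over 10 repetitions
CodeTools:-Usage(cot~(Sample(U, N)), iterations = 10):
```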


 

@Carl Love 

Thanks for the tip, Carl

@radaar 

What do you want: an explanation of the stagnation you observe, or a way out?

If only the explanation matters to you, I can't do anything for you.

I do not know how the "envelope" strategy is implemented in Maple, but I know acceptance/rejection methods rather well. And I can tell you this: their efficiency is strongly dependent on the envelope distribution that is chosen. That said, choosing an envelope ranging from -1 to +1 to sample a Gaussian distribution with a standard deviation of 0.001 is stupid (no offence).
Nevertheless, this is a common-sense rule that does not explain the difficulties you have encountered, especially since the "envelope" method is said to be "adaptive" in the help pages.
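To make the point concrete, here is a minimal sketch of plain acceptance/rejection (not Maple's internal implementation; sigma and the counts are illustrative). With a Uniform(-1, 1) envelope g = 1/2 and constant M = 2*f(0), a proposal x0 is accepted with probability f(x0)/(M*g(x0)) = f(x0)/f(0), which is essentially zero outside a band of width ~sigma:

```maple
with(Statistics):
sigma := 0.001:
f := unapply(PDF(Normal(0, sigma), x), x):  # target density
r := rand(0 .. 10^9):
u := () -> evalf(r()/10^9):                 # uniform draws on (0, 1)
accepted := 0:
tries    := 10^4:
for k from 1 to tries do
   x0 := 2*u() - 1:                # proposal from the Uniform(-1, 1) envelope
   if u() < f(x0)/f(0) then        # accept with probability f(x0)/(M*g(x0))
      accepted := accepted + 1:
   end if:
end do:
evalf(accepted/tries);  # acceptance rate ~ sigma*sqrt(2*Pi)/2, i.e. about 0.001
```

Almost every proposal is wasted, which is why the envelope range must be matched to the target's scale.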


If you want to generate 1000 samples of Gaussian distributions with decreasing standard deviations efficiently, look again at my previous post.


To conclude, here is a remark: the following sequence works perfectly well (here again I do not know why this "epsilon" seems necessary)

Digits  := 10:
epsilon := 10.0^(-Digits);
for i from 1 to MaxIt do
   printf(
      "%a\n%a\n",
      Sample(XG(h[i]-epsilon), 5, method = [envelope, range = max(-1, -10*h[i])..min(1, 10*h[i])]),
      i
   );
end do:



 

@tomleslie  @Kitonum

Sorry, I just answered a new question from radaar, posted after this one but on the same subject, and I read your contributions only after my answer.
I hope I didn't interfere too much.
 
