Andrew Moylan
2006-11-26 09:27:57 UTC
Hi all,
Suppose I want to evaluate an expression at a given precision. What is
the difference between using N[expr, precision] and using
SetPrecision[expr, precision]?
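On a simple exact number the two appear to agree (a quick test on my machine; treat the digits as indicative):

N[1/3, 20]

and

SetPrecision[1/3, 20]

both give

0.33333333333333333333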
I've noticed that SetPrecision even seems to be equivalent to N in situations where N[Integrate[...]] automatically calls NIntegrate[...] because the integral can't be done exactly:
SetPrecision[Integrate[x^x, {x, 0, 1}], 20]
and
N[Integrate[x^x, {x, 0, 1}], 20]
both give
0.78343051071213440706
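(My guess is that SetPrecision turns the exact limits 0 and 1 into 20-digit approximate numbers, and Integrate then evaluates numerically when given inexact input, but I haven't verified this.)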
Are there important differences between SetPrecision and N that I
should be aware of?
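One difference I did stumble on (at least in the version I'm running): on a number that is already approximate, N won't raise the precision, whereas SetPrecision pads out the binary digits of the underlying machine double. The digits below are from my session, so please double-check:

N[0.1, 20]
(* 0.1 -- N leaves the machine number's precision alone *)

SetPrecision[0.1, 20]
(* 0.10000000000000000555 -- the binary value of the double 0.1, padded to 20 digits *)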
Cheers,
Andrew