Talk:Total circles area

::With 100 million points, the Python VdC area estimate is 21.565056432232886. --[[User:Paddy3118|Paddy3118]] 15:24, 22 September 2012 (UTC)
::: Maybe you should add some of this information to the page itself, rather than just here.
 
::: The comparisons are incorrect. Here are two modified examples: <lang python>from __future__ import division
from math import sqrt, atan2
from itertools import count
from pprint import pprint as pp
try:
    from itertools import izip as zip
except:
    pass

# Remove duplicates and sort, largest first.
circles = sorted(set([(0.001,0.523,1)]))

def vdcgen(base=2):
    'Van der Corput sequence generator'
    for n in count():
        vdc, denom = 0, 1
        while n:
            denom *= base
            n, remainder = divmod(n, base)
            vdc += remainder / denom
        yield vdc

def vdc_2d():
    'Two dimensional Van der Corput sequence generator'
    for x, y in zip(vdcgen(base=2), vdcgen(base=3)):
        yield x, y

def bounding_box(circles):
    'Return minx, maxx, miny, maxy'
    return (min(x - r for x,y,r in circles),
            max(x + r for x,y,r in circles),
            min(y - r for x,y,r in circles),
            max(y + r for x,y,r in circles)
            )

def circle_is_in_circle(c1, c2):
    x1, y1, r1 = c1
    x2, y2, r2 = c2
    return (sqrt((x2 - x1)**2 + (y2 - y1)**2) + r2) <= r1

def remove_covered_circles(circles):
    'Takes circles in decreasing radius order. Removes those covered by others'
    i, covered = 0, []
    while i < len(circles):
        c1 = circles[i]
        eliminate = []
        for c2 in circles[i+1:]:
            if circle_is_in_circle(c1, c2):
                eliminate.append(c2)
        if eliminate: covered += [c1, eliminate]
        for c in eliminate: circles.remove(c)
        i += 1
    #pp(covered)

def main(circles):
    print('Originally %i circles' % len(circles))
    print('Bounding box: %r' % (bounding_box(circles),))
    remove_covered_circles(circles)
    print(' down to %i due to some being wholly covered by others' % len(circles))
    minx, maxx, miny, maxy = bounding_box(circles)
    # Shift to 0,0 and compute r**2 once
    circles2 = [(x - minx, y - miny, r*r) for x, y, r in circles]
    scalex, scaley = abs(maxx - minx), abs(maxy - miny)
    pcount, inside, last = 0, 0, ''
    for px, py in vdc_2d():
        pcount += 1
        px *= scalex; py *= scaley
        for cx, cy, cr2 in circles2:
            if (px-cx)**2 + (py-cy)**2 <= cr2:
                inside += 1
                break
        if not pcount % 100000:
            area = (inside/pcount) * scalex * scaley
            print('Points: %8i, Area estimate: %r'
                  % (pcount, area))
            # Hack to check if precision OK
            this = '%.4f' % area
            if this == last:
                break
            else:
                last = this
    print('The value has settled to %s' % this)
    print('error = %e' % (area - 4 * atan2(1,1)))

if __name__ == '__main__':
    main(circles)</lang>
 
::: <lang python>from collections import namedtuple
from math import floor, atan2

Circle = namedtuple("Circle", "x y r")

circles = [ Circle(0.001, 0.523, 1) ]

def main():
    # compute the bounding box of the circles
    x_min = min(c.x - c.r for c in circles)
    x_max = max(c.x + c.r for c in circles)
    y_min = min(c.y - c.r for c in circles)
    y_max = max(c.y + c.r for c in circles)

    box_side = 512

    dx = (x_max - x_min) / float(box_side)
    dy = (y_max - y_min) / float(box_side)

    y_min = floor(y_min / dy) * dy
    x_min = floor(x_min / dx) * dx

    count = 0

    for r in xrange(box_side + 2):
        y = y_min + r * dy
        for c in xrange(box_side + 2):
            x = x_min + c * dx
            for circle in circles:
                if (x-circle.x)**2 + (y-circle.y)**2 <= (circle.r ** 2):
                    count += 1
                    break

    print "Approximated area:", count * dx * dy
    print "error = %e" % (count * dx * dy - 4 * atan2(1,1))

main()</lang>
::: Both examples have exactly one unit circle; the regular method's error is about 8 times smaller, and it runs 10 times faster. Both being more or less uniform sampling, there's no inherent reason one should be more accurate than the other, but regular sampling is just going to be faster due to its simplicity. Unless you can provide good reasoning or a meaningful error estimate, it's very inappropriate to claim that one method is better at this than the others. --[[User:Ledrug|Ledrug]] 08:30, 15 October 2012 (UTC)
 
::Hi Ledrug, are you saying that the results quoted may be correct for the programs as given on the task page, but are either:
::* Not due to the use of Van der Corput?
::* Or peculiar to the example circles chosen?
::I must admit that I only trawled the results stated for the other programs and did not do any extensive reruns with different circles as you have done. I just extracted accuracy vs. points. --[[User:Paddy3118|Paddy3118]] 08:56, 15 October 2012 (UTC)
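For reference, a minimal sketch (not from either example above) of the comparison being discussed, estimating the area of a single unit circle over its bounding box with a regular grid and with the base-2/base-3 Van der Corput pairing; the function names are only illustrative: <lang python>from __future__ import division
from math import pi

def vdc(n, base):
    'Van der Corput digit-reversal value of n in the given base'
    v, denom = 0, 1
    while n:
        denom *= base
        n, rem = divmod(n, base)
        v += rem / denom
    return v

def grid_estimate(side):
    'Regular side x side grid of midpoints over the box [-1,1] x [-1,1]'
    hits = 0
    for i in range(side):
        for j in range(side):
            x = -1 + (i + 0.5) * 2 / side
            y = -1 + (j + 0.5) * 2 / side
            if x * x + y * y <= 1:
                hits += 1
    return hits / (side * side) * 4

def halton_estimate(npoints):
    'Quasi-random (base 2, base 3) Van der Corput points over the same box'
    hits = 0
    for n in range(npoints):
        x = vdc(n, 2) * 2 - 1
        y = vdc(n, 3) * 2 - 1
        if x * x + y * y <= 1:
            hits += 1
    return hits / npoints * 4

for side in (100, 316, 1000):
    n = side * side
    print('n=%8i  grid error % e  Van der Corput error % e'
          % (n, grid_estimate(side) - pi, halton_estimate(n) - pi))</lang>
With a single circle both estimates converge toward <math>\pi</math>; which one looks better at a particular point count depends largely on where the sampling happens to be cut off.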
 
== C89 vs C99 ==
 
There are a few concerns about this algorithm: firstly, it's fast only if there are few circles. Its complexity is maybe O(N^3) with N = number of circles, while the normal scanline method is probably O(N * n) or less, with n = number of scanlines. Secondly, step 4 needs to be accurate; a small precision error there may cause an arc to remain or be removed by mistake, with disastrous consequences. Also, it's difficult to estimate the error in the final result. The scanline or Monte Carlo methods have errors mostly due to statistics, while this method's error is due to floating point precision loss, which is a very different can of worms. --[[User:Ledrug|Ledrug]] 01:24, 15 October 2012 (UTC)
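For reference, a rough sketch of the scanline idea referred to above, assuming circles are given as (x, y, r) tuples; the function name and the number of strips are illustrative, not taken from any of the task's programs: <lang python>from math import sqrt

def scanline_area(circles, nstrips=10000):
    'Approximate the union area by summing covered x-intervals on horizontal strips'
    ymin = min(y - r for x, y, r in circles)
    ymax = max(y + r for x, y, r in circles)
    dy = (ymax - ymin) / nstrips
    total = 0.0
    for i in range(nstrips):
        y = ymin + (i + 0.5) * dy              # midline of this strip
        intervals = []
        for cx, cy, r in circles:
            d = r * r - (y - cy) ** 2
            if d > 0:                          # circle crosses this scanline
                half = sqrt(d)
                intervals.append((cx - half, cx + half))
        # merge overlapping x-intervals and add up the covered length
        covered, right = 0.0, None
        for lo, hi in sorted(intervals):
            if right is None or lo > right:
                covered += hi - lo
                right = hi
            elif hi > right:
                covered += hi - right
                right = hi
        total += covered * dy
    return total

print(scanline_area([(0.001, 0.523, 1)]))      # single unit circle, roughly pi</lang>
Each strip costs at most a sort of N intervals plus a linear merge, which is where the roughly O(N * n) behaviour comes from.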
 
== Integral solution? ==
 
Does this solution with the [[wp:Heaviside step function|Heaviside step function]] (<math>H</math>) work?
 
:<math>\int_{(x,y)\in\mathbb{R}^2} \left(\prod_{i=1}^N H\left((r^{(i)})^2 - (x-c_x^{(i)})^2 - (y-c_y^{(i)})^2\right)\right)\, dx\, dy</math>
 
If it does, maybe we can then write <math>H(x) = \lim_{k\to\infty}\frac{1}{1+e^{-2kx}}</math> and hope for things to simplify somehow?
--[[User:Grondilu|Grondilu]] ([[User talk:Grondilu|talk]]) 12:51, 3 October 2023 (UTC)
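For what it's worth, here is a minimal numerical sketch of the proposed integrand using the smoothed step <math>H_k(x) = \frac{1}{1+e^{-2kx}}</math> with a finite <math>k</math>; the helper names and the two test circles are mine, not from the question. Note that the plain product of the step functions is 1 only where a point lies inside ''every'' circle, so as written the integral appears to measure the intersection of the disks rather than their union; <math>1-\prod_i(1-H_i)</math> would give the union. <lang python>from math import exp

def Hk(t, k=1000.0):
    'Smoothed Heaviside step; approaches H(t) as k grows (argument clamped to avoid overflow)'
    return 1.0 / (1.0 + exp(max(min(-2.0 * k * t, 700.0), -700.0)))

def integrate(circles, form, n=400):
    'Midpoint rule over the bounding box of the circles'
    xmin = min(x - r for x, y, r in circles)
    xmax = max(x + r for x, y, r in circles)
    ymin = min(y - r for x, y, r in circles)
    ymax = max(y + r for x, y, r in circles)
    dx, dy = (xmax - xmin) / n, (ymax - ymin) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            x = xmin + (i + 0.5) * dx
            y = ymin + (j + 0.5) * dy
            hs = [Hk(r * r - (x - cx) ** 2 - (y - cy) ** 2)
                  for cx, cy, r in circles]
            total += form(hs) * dx * dy
    return total

def prod(hs):
    p = 1.0
    for h in hs:
        p *= h
    return p

two = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]       # two overlapping unit circles
print('product form   : %f' % integrate(two, prod))                                      # intersection area
print('1 - prod(1 - H): %f' % integrate(two, lambda hs: 1 - prod([1 - h for h in hs])))  # union area</lang>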