How to resolve the algorithm Van der Corput sequence step by step in the Delphi programming language

Published on 12 May 2024 09:40 PM

Problem Statement

When counting integers in binary, if you put a (binary) point to the right of the count then the column immediately to the left denotes a digit with a multiplier of $2^{0}$; the digit in the next column to the left has a multiplier of $2^{1}$; and so on. For example, the binary number "10" is $1\times 2^{1}+0\times 2^{0}$.

You can also have binary digits to the right of the "point", just as in the decimal number system. In that case, the digit in the place immediately to the right of the point has a weight of $2^{-1}$, or $1/2$. The weight for the second column to the right of the point is $2^{-2}$, or $1/4$. And so on.

If you take the integer binary count and reflect its digits about the binary point, you end up with the van der Corput sequence of numbers in base 2. The third member of the sequence, binary 0.01, is therefore $0\times 2^{-1}+1\times 2^{-2}$, or $1/4$.

This sequence is also a superset of the numbers representable by the "fraction" field of an old IEEE floating point standard. In that standard, the "fraction" field represented the fractional part of a binary number beginning with "1.", e.g. 1.101001101.

Hint

A hint at a way to generate members of the sequence is to modify a routine used to change the base of an integer. For example, 11 in decimal is binary 1011, that is $1\times 2^{3}+0\times 2^{2}+1\times 2^{1}+1\times 2^{0}$. Reflected about the binary point this becomes .1101, or $1\times 2^{-1}+1\times 2^{-2}+0\times 2^{-3}+1\times 2^{-4}$.
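To make the reflection concrete, that last sum can be evaluated with plain arithmetic:

$$1\times 2^{-1}+1\times 2^{-2}+0\times 2^{-3}+1\times 2^{-4}=\frac{1}{2}+\frac{1}{4}+0+\frac{1}{16}=\frac{13}{16}=0.8125$$

so reflecting decimal 11 (binary 1011) about the binary point gives 13/16 = 0.8125.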

Let's start with the solution:

Step-by-step solution for the Van der Corput sequence algorithm in the Delphi programming language

Source code in the Delphi programming language

function VanDerCorput(N,Base: integer): double;
{Return the N-th Van der Corput number in the given base by
 reflecting the base-"Base" digits of N about the radix point.
 Requires the Math unit (uses Math) for the Power function.}
var Value,Exponent,Digit: integer;
begin
Value:= N; Result:= 0; Exponent:= -1;
{Result = D1 * Base^-1 + D2 * Base^-2 + D3 * Base^-3 + ...}
while Value > 0 do
	begin
	{Get the least significant digit in the specified base}
	Digit:=Value mod Base;
	{Add Digit * Base^Exponent (Exponent is negative)}
	Result:=Result + Digit * Power(Base,Exponent);
	{Divide by the base to move on to the next digit}
	Value:= Value div Base;
	{Next (more negative) exponent}
	Dec(Exponent);
	end;
end;
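
To try the routine outside of a form, a minimal console sketch along the following lines could be used; the program name, the console directive and the Writeln loop are illustrative assumptions, while the function body is the same routine shown above (it needs the Math unit for Power):

program VdCConsoleTest;
{$APPTYPE CONSOLE}

uses
  SysUtils, Math;

function VanDerCorput(N,Base: integer): double;
{Same routine as above: reflect the base-"Base" digits of N about the radix point}
var Value,Exponent,Digit: integer;
begin
Value:= N; Result:= 0; Exponent:= -1;
while Value > 0 do
	begin
	Digit:=Value mod Base;
	Result:=Result + Digit * Power(Base,Exponent);
	Value:= Value div Base;
	Dec(Exponent);
	end;
end;

var N: integer;
begin
{Print the first ten base-2 members: 0.50000 0.25000 0.75000 ...}
for N:=1 to 10 do
	Writeln(Format('%d: %1.5f',[N, VanDerCorput(N,2)]));
end.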


procedure ShowVanDerCorput(Memo: TMemo);
{Show Van der Corput numbers for bases 2..8 and items 1..10.
 CRLF is assumed to be a line-break constant defined elsewhere
 (e.g. #13#10 or sLineBreak).}
var Base,N: integer;
var V: double;
var S: string;
begin
S:='';
for Base:=2 to 8 do
	begin
	S:=S+Format('Base %D:',[Base]);
	for N:=1 to 10 do
		begin
		V:=VanDerCorput(N,Base);
		S:=S+Format(' %1.5f',[V]);
		end;
	S:=S+CRLF;
	end;
Memo.Lines.Add(S);
end;
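
For reference, and assuming a period as the decimal separator in the current format settings, the base-2 line of the memo output should read:

Base 2: 0.50000 0.25000 0.75000 0.12500 0.62500 0.37500 0.87500 0.06250 0.56250 0.31250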


  

You may also check: How to resolve the algorithm Knuth's power tree step by step in the Julia programming language
You may also check: How to resolve the algorithm Comments step by step in the Perl programming language
You may also check: How to resolve the algorithm Bitwise operations step by step in the F# programming language
You may also check: How to resolve the algorithm Digital root step by step in the Befunge programming language
You may also check: How to resolve the algorithm Letter frequency step by step in the ACL2 programming language