How to resolve the algorithm Benford's law step by step in the Go programming language

Published on 12 May 2024 09:40 PM
#Go

Problem Statement

Benford's law, also called the first-digit law, refers to the frequency distribution of digits in many (but not all) real-life sources of data. In this distribution, the digit 1 occurs as the first digit about 30% of the time, while larger digits occur in that position less frequently: 9 appears as the first digit less than 5% of the time. This distribution of first digits is the same as the widths of gridlines on a logarithmic scale. Benford's law also concerns the expected distribution for digits beyond the first, which approach a uniform distribution.

This result has been found to apply to a wide variety of data sets, including electricity bills, street addresses, stock prices, population numbers, death rates, lengths of rivers, physical and mathematical constants, and processes described by power laws (which are very common in nature). The law tends to be most accurate when values are distributed across multiple orders of magnitude.

A set of numbers is said to satisfy Benford's law if the leading digit d (with d ∈ {1, ..., 9}) occurs with probability

    P(d) = log10(1 + 1/d)

For this task, write a routine (or routines) to calculate the distribution of first significant (non-zero) digits in a collection of numbers, then display the actual vs. expected distribution in the way most convenient for your language (table / graph / histogram / whatever). Use the first 1000 numbers from the Fibonacci sequence as your data set. There is no need to show how the Fibonacci numbers are obtained; you can generate them or load them from a file, whichever is easiest. Display your actual vs. expected distribution.
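
As a quick check on that formula, the following standalone Go snippet (separate from the full solution below) prints the expected Benford probabilities; P(1) comes out to about 0.301 and P(9) to about 0.046.

package main

import (
    "fmt"
    "math"
)

func main() {
    // Expected Benford probability for each leading digit d: log10(1 + 1/d).
    for d := 1; d <= 9; d++ {
        fmt.Printf("P(%d) = %.3f\n", d, math.Log10(1+1/float64(d)))
    }
}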

For extra credit: Show the distribution for one other set of numbers from a page on Wikipedia. State which Wikipedia page it can be obtained from and what the set enumerates. Again, no need to display the actual list of numbers or the code to load them.

Let's start with the solution:

Step-by-step solution to the Benford's law algorithm in the Go programming language

This Go program generates the first 1000 Fibonacci numbers and compares the distribution of their leading digits against Benford's law. Here is a detailed breakdown of the code:

  1. Package and Imports:

    • The program starts with the package main declaration, marking it as an executable program.
    • It imports the fmt package for formatted output and the math package for the math.Log10 function.
  2. Fib1000 Function:

    • This function generates the first 1000 Fibonacci numbers and returns them as a slice.
    • It uses the classic iterative calculation: starting with a = 0 and b = 1, each step records b as the next Fibonacci number and then advances the pair to b and a + b.
    • The loop iterates 1000 times to fill the r array, which is returned as a slice with r[:].
    • The numbers are held as float64 values, which comfortably cover the magnitude of the 1000th Fibonacci number and preserve its leading digit.
  3. main Function:

    • This function is the program's entry point.
    • It calls Fib1000 to generate the Fibonacci numbers and passes the resulting slice to show, along with a title.
  4. show Function:

    • This function takes a slice of numbers c and a title string title as input.
    • It declares an array f of length 9 to count the occurrences of each leading digit (1-9).
    • It iterates over c; for each value it formats the number with fmt.Sprintf("%g", v), takes the first byte of the result (the leading significant digit), and increments the matching counter in f (a trick illustrated in the sketch after this list).
    • It then prints the title and a table with three columns: "Digit" (the digits 1-9), "Observed" (the observed proportion of each leading digit), and "Predicted" (the proportion predicted by Benford's law).
    • The predicted proportions are computed as math.Log10(1+1/float64(i+1)), where i is the zero-based digit index, i.e. P(d) = log10(1 + 1/d) for digit d = i + 1.
  5. Program Execution:

    • The program begins in the main function.
    • main calls Fib1000 to generate the first 1000 Fibonacci numbers and passes the slice to show.
    • show tallies the leading digits of these numbers and prints the observed vs. predicted distribution.
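
The digit extraction in step 4 is the compact trick worth isolating. Here is a minimal, self-contained sketch of just that idea (firstDigit is a hypothetical helper name, not part of the original program): formatting a positive float64 with %g produces a string whose first byte is the leading significant digit, whether or not the value is printed in exponent form.

package main

import "fmt"

// firstDigit isolates the %g trick used by show: the first byte of the
// formatted string is the leading significant digit of a positive value.
func firstDigit(v float64) int {
    return int(fmt.Sprintf("%g", v)[0] - '0')
}

func main() {
    fmt.Println(firstDigit(1.0))     // 1
    fmt.Println(firstDigit(987.65))  // 9
    fmt.Println(firstDigit(4.2e208)) // 4
}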

This program demonstrates how to generate and analyze large sequences of numbers using loops, slices, and mathematical functions in Go. It also illustrates how to count and compare observed and predicted frequencies of leading digits in a dataset.

Source code in the Go programming language

package main

import (
    "fmt"
    "math"
)

// Fib1000 returns the first 1000 Fibonacci numbers as float64 values.
func Fib1000() []float64 {
    a, b, r := 0., 1., [1000]float64{}
    for i := range r {
        // Store the current Fibonacci number, then advance the pair.
        r[i], a, b = b, b, b+a
    }
    return r[:]
}

func main() {
    show(Fib1000(), "First 1000 Fibonacci numbers")
}

// show tallies the leading significant digit of each number in c and
// prints the observed proportions next to those predicted by Benford's law.
func show(c []float64, title string) {
    var f [9]int
    for _, v := range c {
        // %g formats v so that its first byte is the leading digit.
        f[fmt.Sprintf("%g", v)[0]-'1']++
    }
    fmt.Println(title)
    fmt.Println("Digit  Observed  Predicted")
    for i, n := range f {
        // Observed proportion vs. predicted P(d) = log10(1 + 1/d).
        fmt.Printf("  %d  %9.3f  %8.3f\n", i+1, float64(n)/float64(len(c)),
            math.Log10(1+1/float64(i+1)))
    }
}
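
The float64 version works because the rounding error accumulated over a thousand additions is far too small to disturb the leading digit. If exact arithmetic is preferred, here is an alternative sketch (not part of the original solution) that generates the Fibonacci numbers with math/big and reads the leading digit straight from the decimal string:

package main

import (
    "fmt"
    "math"
    "math/big"
)

func main() {
    // Alternative to the float64 approach: exact Fibonacci numbers via
    // math/big, counting leading digits from the decimal representation.
    var f [9]int
    a, b := big.NewInt(0), big.NewInt(1)
    for i := 0; i < 1000; i++ {
        f[b.String()[0]-'1']++ // first byte of the decimal string
        a.Add(a, b)            // a now holds the next Fibonacci number
        a, b = b, a            // advance the pair
    }
    fmt.Println("Digit  Observed  Predicted")
    for i, n := range f {
        fmt.Printf("  %d  %9.3f  %8.3f\n", i+1, float64(n)/1000,
            math.Log10(1+1/float64(i+1)))
    }
}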



You may also check:How to resolve the algorithm Terminal control/Ringing the terminal bell step by step in the R programming language
You may also check:How to resolve the algorithm MD4 step by step in the Python programming language
You may also check:How to resolve the algorithm User input/Text step by step in the Lua programming language
You may also check:How to resolve the algorithm The Twelve Days of Christmas step by step in the ALGOL 68 programming language
You may also check:How to resolve the algorithm Jensen's Device step by step in the Delphi programming language