iterative ref: implement and benchmark methods to normalize half maps #25

@geoffwoollard

Description

Insertion of slices results in uneven coverage of the voxels. The slices are added in, but just because certain voxels are covered by more slices does not mean that their values should be larger; it only means that the certainty in those voxels is higher. Computing the expectation requires "dividing out" the coverage after doing the sum.

One method to "average in" the slices inserts ones and the particle values in parallel, and then divides the "half map" by the "count map". This "dividing out" can be done in a few ways to protect against numerical instability and noise blow-up.
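
A minimal sketch of the count-map normalization, assuming NumPy arrays for the accumulated half map and count map (function and parameter names here are illustrative, not from the codebase):

```python
import numpy as np

def normalize_half_map(half_map_sum, count_map, method="epsilon", eps=1e-6):
    """Divide an accumulated half map by its voxel-wise count map.

    half_map_sum : summed voxel values from the inserted slices
    count_map    : parallel accumulation of ones, i.e. per-voxel coverage
    method       : strategy to guard against division by small counts
    """
    if method == "epsilon":
        # Add a small constant so empty voxels divide to ~0 instead of inf/NaN.
        return half_map_sum / (count_map + eps)
    if method == "threshold":
        # Only divide where coverage is sufficient; leave poorly sampled voxels at 0.
        out = np.zeros_like(half_map_sum)
        mask = count_map > eps
        out[mask] = half_map_sum[mask] / count_map[mask]
        return out
    if method == "max_one":
        # Clamp counts to at least 1, so isolated single observations pass through unchanged.
        return half_map_sum / np.maximum(count_map, 1.0)
    raise ValueError(f"unknown method: {method}")
```

The three branches are only examples of the kinds of regularization that could be benchmarked; other schemes (e.g. Wiener-style filtering of the counts) would slot in the same way.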

TODO:

  1. Implement various methods to be benchmarked (see the sketch after this list)
  2. Do a survey of what other reconstruction software does
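
For item 1, a hedged sketch of how a benchmark could look, building on the hypothetical `normalize_half_map` above: simulate an accumulated half map from a known volume and coverage pattern, then score each normalization method against the ground truth.

```python
import numpy as np

def benchmark_normalization(true_volume, count_map, noise_sigma=0.1, seed=0):
    """Compare normalization methods on a synthetic insertion experiment.

    Simulates an accumulated half map as count * truth + noise, then measures
    how well each method recovers the ground-truth volume (RMSE per method).
    """
    rng = np.random.default_rng(seed)
    half_map_sum = count_map * true_volume + rng.normal(
        scale=noise_sigma, size=true_volume.shape
    )
    results = {}
    for method in ("epsilon", "threshold", "max_one"):
        recon = normalize_half_map(half_map_sum, count_map, method=method)
        results[method] = float(np.sqrt(np.mean((recon - true_volume) ** 2)))
    return results

# Example usage with a toy volume and Poisson-distributed coverage:
shape = (32, 32, 32)
true_volume = np.random.default_rng(1).normal(size=shape)
count_map = np.random.default_rng(2).poisson(lam=2.0, size=shape).astype(float)
print(benchmark_normalization(true_volume, count_map))
```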

See
https://github.com/compSPI/reconstructSPI/pull/21/files/33e3229b1dd193f1cecdc7334668b3e991cdf4de..01ceaf5f7f80614fae628a89649908ca11d1a9a0#r836914864
