Diminishing the effects of digital noise in software

Discussion in 'OT Technology' started by CodeX, Aug 5, 2009.

  1. CodeX

    CodeX Guest

    So, a lot of you have probably seen me post about my work before and know what I do, so I won't explain the whole thing again, but suffice it to say we use an analog-to-digital converter to turn the output from a photodiode into a linear 32-bit value. This value then represents the amount of "light" (not visible light) being collected by the photodiode.

    We sample the ADC over a period of time and come up with data that is graphed to the screen on a dB/distance scale.
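
    For concreteness, the conversion from the raw linear counts to the dB scale we plot is roughly this (a minimal Python sketch; the full-scale and reference values here are placeholders, not our actual constants):

    Code:
    import numpy as np

    FULL_SCALE = float(2**32 - 1)   # placeholder: linear 32-bit ADC range
    REF_LEVEL = FULL_SCALE / 2.0    # placeholder reference level for 0 dB

    def samples_to_db(samples):
        """Convert linear ADC counts to a dB scale for plotting."""
        linear = np.asarray(samples, dtype=float)
        linear = np.clip(linear, 1.0, None)   # avoid log of zero on dead samples
        return 10.0 * np.log10(linear / REF_LEVEL)

    # e.g. a 4000-point trace, one point per sample interval
    raw = np.random.uniform(1, FULL_SCALE, size=4000)
    trace_db = samples_to_db(raw)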

    The problem is that there is repetitive digital noise in the signal, which we assume is from the data and address buses interfering with the voltage level on the analog line from the photodiode to the ADC input during the data acquisition time.

    Previously, I was asked to reduce this in software by coming up with a set of adjustable offsets that could be additively applied to the signal level at these points. This wasn't too difficult: because of the repetitive nature of the noise, I only needed to use 12 adjustable points, which would repeat over the entire set of data.
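
    For anyone curious, the old correction was basically just this (a rough sketch, not the production code; the offset values are made up for illustration):

    Code:
    import numpy as np

    # 12 hand-tuned additive offsets, repeated across the whole trace
    # (values here are invented for illustration)
    OFFSETS = np.array([3, -5, 0, 2, -1, 4, 0, -2, 1, 0, -3, 2], dtype=float)

    def correct_repeating(raw):
        """Apply the 12-point pattern additively over the full data set."""
        raw = np.asarray(raw, dtype=float)
        pattern = np.resize(OFFSETS, raw.shape[0])   # tile the pattern to full length
        return raw + pattern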

    NOW... for whatever reason the noise is much more complicated, and I guess fixing it in hardware is not an option. There may be a pattern but it is extremely complex and I doubt I could use a small set of repeated offsets to correct it.

    So, what I am thinking of is creating some automatic calibration process that sets a unique offset value for each and every point of data. I plan to do this by taking an "ideal" data set, one that should represent the result of a scan in a particular circumstance with no noise. Then, using that as the reference for an "optimal" representation of the data, I would take a scan with the instrument and go through it point by point, determining the deviation between the optimal set and the actual data. These deviation values would be stored away as the "calibrated offsets" for each of the 4k data points.
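
    Roughly, I'm picturing something like this (nothing implemented yet, just a Python sketch of the idea; averaging a few calibration scans is my own guess at keeping random, non-repetitive noise out of the table):

    Code:
    import numpy as np

    def build_offsets(ideal, calibration_scans):
        """Per-point offsets = ideal trace minus what the instrument actually read."""
        measured = np.mean(np.asarray(calibration_scans, dtype=float), axis=0)
        return np.asarray(ideal, dtype=float) - measured

    def apply_offsets(raw_scan, offsets):
        """Correct a later scan with the stored calibration table."""
        return np.asarray(raw_scan, dtype=float) + offsets

    # fake 4000-point example
    ideal = np.linspace(30.0, 10.0, 4000)                  # "ideal" dB trace
    scans = ideal + np.random.normal(0, 0.2, (8, 4000))    # 8 noisy calibration scans
    offsets = build_offsets(ideal, scans)                  # stored away, one per point
    corrected = apply_offsets(scans[0], offsets)           # applied to every scan after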

    Anyway... this is still just an idea at this point, so my question is: will this work? Has anyone done anything like this, or does anyone see anything wrong with my plan? I am asking because this might take a decent amount of time to implement, and I would hate to blow several days of development time and have nothing to show for it; I doubt the boss would be too happy regardless of my intentions :mamoru:
     
  2. deusexaethera

    deusexaethera OT Supporter

    I think this is basically the short version of what you're saying: feed it a sine wave and record it, then subtract the sine wave from the recording, then invert the remaining waveform and apply it to future inputs. Or something like that.

    Yes, it works; Pro Tools does it quite well, if you're willing to pay DigiDesign thousands of dollars.
     
  3. CodeX

    CodeX Guest

    I would have to generate a sine wave signal with a laser source then...

    Maybe it will help if I get more specific: basically, we fire a laser into a length of optical fiber. The laser is "on" for a specific amount of time, which we refer to as the pulse width. This pulse of laser light travels down the fiber, which is up to 256 kilometers long, and every 1/100,000th of a second we sample the amount of laser energy being reflected back into the photodiode. There is a tiny bit of energy reflected back at all times, due to photons colliding with the edge of the glass fiber at just the right angle to be reflected backward; this is called backscatter.
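
    For a sense of scale, the distance each sample covers just falls out of the round-trip time (rough numbers, taking the sample interval above at face value and using the usual group velocity of light in glass):

    Code:
    C = 3.0e8                     # speed of light in vacuum, m/s
    GROUP_INDEX = 1.5             # roughly, for silica fiber
    v = C / GROUP_INDEX           # ~2.0e8 m/s in the glass

    sample_interval = 1.0 / 100000.0      # 10 microseconds, the figure quoted above
    # the pulse goes out and the backscatter comes back, so divide by two
    metres_per_sample = v * sample_interval / 2.0
    print(metres_per_sample)              # ~1000 m of fiber per sample at that rate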

    So, given a perfect length of fiber, what you would see is a graph of signal level versus time (distance), starting very high on the left side and diminishing at a steady rate from the left to the right side of the graph due to the natural attenuation of the fiber. The farther from the source you get, the less energy is reflected back, because there is less total energy at that point, due to the energy lost before it.

    Except, instead of the nice straight line you should see given perfect optical characteristics of the fiber, we see a noisy, jagged line that trends down at the predicted rate, meaning that if you ran a linear regression on it you would likely come up with something close to the perfect, ideal straight line.

    So what I was thinking of doing was generating that ideal data set, either artificially or by applying a "smoothing" algorithm to an actual set of collected data, then determining the deviation of the actual data from the ideal for each of the 4000 sample points, storing those offsets away, and then applying them to every scan after that.
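
    Generating the ideal trace either way would look something like this (sketch only; the window size and the straight-line fit in dB are assumptions on my part), and either version would plug straight into the per-point offsets from my first post:

    Code:
    import numpy as np

    def ideal_by_fit(trace_db):
        """Least-squares straight line through the dB trace (the linear-regression idea)."""
        x = np.arange(len(trace_db))
        slope, intercept = np.polyfit(x, trace_db, 1)
        return slope * x + intercept

    def ideal_by_smoothing(trace_db, window=51):
        """Moving-average smoothing of an actual collected scan as the reference."""
        kernel = np.ones(window) / window
        return np.convolve(trace_db, kernel, mode='same')

    # per-point deviations against whichever ideal is used
    measured = np.linspace(30.0, 10.0, 4000) + np.random.normal(0, 0.3, 4000)
    offsets = ideal_by_fit(measured) - measured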

    It's basically what you said with the sine wave, except that a sine wave is not a natural signal for this instrument to read, so I would have to make a laser source that pulsed like a sine wave and feed it into the input. Instead, I am just going to use an "ideal" data set for the type of data the instrument naturally records.
     
  4. deusexaethera

    deusexaethera OT Supporter

    I understood what you said, but now I'm lost as to the point of the exercise. Isn't the goal to get the light to the other end of the fiber? Who cares what gets reflected back?
     
  5. CodeX

    CodeX Guest

    No, this measures reflection of the laser energy over distance. It can detect bad splices, breaks, or crimps in the fiber, or other problems, and give their location (as distance from the source) down to a quarter meter. It's a diagnostic tool to find problems or to verify a good connection; you can determine the exact percentage of the signal that actually reaches the other end of the link, and therefore the signal-to-noise ratio.
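
    (The percentage part is just the usual dB conversion, e.g. something like:)

    Code:
    def percent_through(loss_db):
        """Fraction of the launched power that reaches the far end, given end-to-end loss in dB."""
        return 100.0 * 10.0 ** (-loss_db / 10.0)

    print(percent_through(3.0))   # ~50% of the light makes it through 3 dB of loss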

    These appear as sudden drops, or spikes followed by drops, on the graph, as you can see in the lower-right screenshot here. You can also kind of see the noise I'm talking about, particularly at the lower right, but that screen is zoomed all the way out, showing only 1/16th of the total data, so you can't really see how bad the noise is.
    [image: instrument screenshot showing the trace]
     
  6. deusexaethera

    deusexaethera OT Supporter

    Oh, okay, I get it now. Reminds me of something NASA installed in the Space Shuttles to detect structural damage -- supposedly it's the only reason they knew what happened to Columbia when it broke apart, because it was the prototype and it had a few hundred miles of fiber running through it.

    So let me make sure I understand what you're doing: you fire a burst into the fiber, then you read the backscatter coming out of the near end of the fiber every few millionths of a second, and because the speed of light becomes a factor in that short a timeframe, each reading tells you the permissiveness/reflectiveness of the fiber a couple of feet farther down its length than the previous reading. Is that right?

    If so, then I'm curious to know how you can be sure that the test fiber you're using is in absolutely perfect condition from one end to the other, which would be necessary for establishing a baseline. The only alternative I can think of is to test thousands of fibers and calculate a 99% boundary around all of the readings (or whatever other percentage suits you), within which any other fiber you test is 99% likely to be in good shape.
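
    Something along those lines, roughly (just a sketch; the 99% band and the number of reference scans are arbitrary):

    Code:
    import numpy as np

    def envelope(reference_scans, pct=99.0):
        """Per-point bounds from a pile of known-good scans; points outside the band are suspect."""
        scans = np.asarray(reference_scans, dtype=float)
        lo = np.percentile(scans, (100.0 - pct) / 2.0, axis=0)
        hi = np.percentile(scans, 100.0 - (100.0 - pct) / 2.0, axis=0)
        return lo, hi

    # fake example: 1000 known-good scans of 4000 points each
    good = np.random.normal(20.0, 0.5, (1000, 4000))
    lo, hi = envelope(good)
    test_scan = np.random.normal(20.0, 0.5, 4000)
    suspect_points = np.where((test_scan < lo) | (test_scan > hi))[0]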

    Another idea is to take multiple readings with different colors of light -- a cut or fracture should show the same dropoff at the same spot with every color you test with, because the fracture will act like a mirror (IIRC), whereas any other noise introduced by safe bends in the fiber will respond differently according to the wavelength of the light you're testing with, so it should average out when you put the readings together.
     
  7. CodeX

    CodeX Guest

    I'll post more later, I'm at work, but here is the outcome of my first stab at this. I'm quite pleased :big grin:

    [image: screenshot of the result of the first calibration attempt]
     
