Okay, so at work I have been developing a communication protocol between the PC software bundled with our hardware and the hardware itself. The hardware is a handheld fiber optic tester that displays a graph of signal level in dB versus distance, but that's not really important here.

I have developed a virtual instrument for the PC, so that when the two are connected via the supplied USB cable they can communicate and transfer files, and you can now also control the instrument directly from the PC and see its exact screen mirrored there. To do this I transfer the screen data to the PC every time user input changes something. The frame is 320x240 at 8 bpp, i.e. 76,800 bytes. This is working, and I am not asking how to implement it, but it is SLOW.

The real problem is that our "USB" cable is not really USB at all: it uses an RS232 bridge chip to piggyback ordinary UART traffic over the USB interface, and the fastest rate the chip supports is 920 kbaud. With one start and one stop bit per byte, that works out to roughly 92,000 bytes per second, so a raw 76,800-byte frame should take about 0.8 s on the wire at worst. In practice each screen update takes closer to half a second, which is far too sluggish for interactive use.

In an attempt to improve this I implemented real-time compression using lossless RLE, which I have observed shrink the 76,800 bytes to as little as 18,000 (it varies with the complexity of the screen data). At line rate that should cut the transfer to roughly 0.2 s. The puzzling thing is that it had NO impact on speed: the display still takes roughly half a second to update, and I have confirmed the time is spent waiting for the data to be received by the PC, not in drawing the screen or anything silly like that.

I know I am grasping at straws here, but does anyone have any idea what might be going on? In particular, why did reducing the amount of data sent by 75% have no impact on performance? I am at a loss, and this needs to be done this week.
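For reference, the compression is essentially byte-oriented RLE along the lines of the sketch below (illustrative only, not the exact production encoder; the names are mine):

```c
#include <stddef.h>
#include <stdint.h>

/* Byte-oriented RLE: each run of identical pixels is emitted as a
 * (count, value) pair. Worst case (no repeats at all) doubles the
 * data, but screens with large flat regions shrink dramatically.
 * Illustrative sketch only -- not the exact production encoder.
 * dst must be able to hold up to 2 * len bytes. */
size_t rle_encode(const uint8_t *src, size_t len, uint8_t *dst)
{
    size_t out = 0;
    size_t i = 0;

    while (i < len) {
        uint8_t value = src[i];
        size_t run = 1;

        /* Extend the run up to the 255 a single count byte can hold. */
        while (i + run < len && src[i + run] == value && run < 255)
            run++;

        dst[out++] = (uint8_t)run;
        dst[out++] = value;
        i += run;
    }
    return out; /* number of bytes written to dst */
}
```

Screens with large flat areas are what get a frame down to ~18,000 bytes; busier graphs compress less.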
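And for what it's worth, this is roughly how I confirmed the time is spent waiting on the serial receive itself rather than in decompression or drawing: time nothing but the blocking reads for one frame. This is a POSIX-style sketch, not the actual application code; the device path and frame size are illustrative, and it assumes the port has already been configured for 920 kbaud raw mode.

```c
#include <fcntl.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>

#define FRAME_BYTES 76800   /* one raw 320x240x8bpp screen */

int main(void)
{
    static unsigned char buf[FRAME_BYTES];
    int fd = open("/dev/ttyUSB0", O_RDONLY);  /* illustrative device path */
    if (fd < 0) { perror("open"); return 1; }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    /* Read exactly one frame, timing only the blocking reads. */
    size_t got = 0;
    while (got < FRAME_BYTES) {
        ssize_t n = read(fd, buf + got, FRAME_BYTES - got);
        if (n <= 0)
            break;  /* error or port closed */
        got += (size_t)n;
    }

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("received %zu bytes in %.3f s (%.1f KB/s)\n",
           got, secs, secs > 0 ? got / secs / 1024.0 : 0.0);
    return 0;
}
```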