So, I've managed to put together a very simple test program that demonstrates this problem. It appears that Shinobi has an internal limit on the number of significant digits it preserves in a double-precision value. I've made a GitHub repository (minus the shared libraries and jar files for Shinobi) that shows the phenomenon. The application renders a sine curve whose X values come from converting the current time to a double-precision number. As the numbers get larger (> 200 days), the chart renders increasingly poorly: multiple data-source values are drawn at the same Y coordinate, and the gridlines 'slide' strangely behind the plotted values.
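For context, the stair-stepping is consistent with the values being squeezed through single-precision floats somewhere in the rendering pipeline. I don't know Shinobi's internals, so this is only an assumption on my part; the snippet below is plain JDK code with no Shinobi dependency, showing how coarse float resolution becomes at time-like millisecond magnitudes:

```java
// Standalone demo of floating-point resolution at time-like magnitudes.
// No chart library involved; Math.ulp reports the gap between a value
// and the next representable value of the same type.
public class PrecisionDemo {
    public static void main(String[] args) {
        double ms200days = 200.0 * 24 * 60 * 60 * 1000; // 1.728e10 ms
        double ms2days = 2.0 * 24 * 60 * 60 * 1000;     // 1.728e8 ms

        System.out.println("double resolution at 200 days: "
                + Math.ulp(ms200days) + " ms");         // microsecond scale
        System.out.println("float resolution at 200 days:  "
                + Math.ulp((float) ms200days) + " ms"); // 2048 ms (~2 s)
        System.out.println("float resolution at 2 days:    "
                + Math.ulp((float) ms2days) + " ms");   // 16 ms
    }
}
```

A double still resolves microseconds at 200-day magnitudes, but a float can only resolve roughly 2-second steps there, which would collapse many consecutive samples onto the same rendered coordinate.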
In MainActivity.java, you will see some lines that look like the following.
```java
// Setting this date closer to the actual date fixes a resolution issue
// with the internal chart representation of the data. In this demo, setting
// the value to -200 causes a stair-stepping effect, but -100 does not.
// The smaller this 'delta' becomes, the more closely the chart represents
// the values.
Calendar c = Calendar.getInstance();
c.add(Calendar.DAY_OF_YEAR, -200); // the 'delta' described above
```
If you change the -200 value to -2, for example, the chart renders properly and panning updates much more smoothly. Setting it to a value like -1000 makes things far worse.
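The -200 versus -2 behavior fits the precision theory: shifting the origin closer to the data shrinks the magnitudes the chart has to handle. Below is a minimal sketch of the same rebasing idea using only the JDK; the variable names are my own, not anything from the demo or from Shinobi's API:

```java
// Sketch: rebase epoch-millisecond timestamps to a recent origin before
// plotting, so the magnitudes survive a round trip through single precision.
public class RebaseDemo {
    public static void main(String[] args) {
        long base = 1_700_000_000_000L;  // a fixed recent origin, in ms
        long sample = base + 123_456L;   // a data point ~2 minutes later

        // Pushing the raw epoch value through a float loses the low bits:
        float raw = (float) sample;
        // Rebasing first keeps the small difference exactly:
        float rebased = (float) (sample - base);

        System.out.println("raw float round-trip error (ms): "
                + Math.abs((double) raw - (double) sample));
        System.out.println("rebased value (ms): " + rebased);
    }
}
```

The rebased value (123456) is well under 2^24, so it is represented exactly in a float, while the raw epoch value is not.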
Why does Shinobi force me to reduce the significant digits of my double-precision values in this way?