swampthing, interesting question; here is my take on it.
The slower arrow MAY have a marginally shorter point-on distance, but point-on distance is more a function of where you anchor on your face than of arrow speed.
I think you have to factor in the arrow's time of flight: a faster arrow takes a certain amount of time to reach its point-on distance, and that is also the time gravity is acting on it. A slower arrow will take the same amount of time to reach its (shorter) point-on distance. For each arrow, that flight time covers crossing the line of sight going up, arcing through the air, and dropping back to the line of sight, which happens at the point-on distance.
So each arrow will climb about the same distance above the line of sight and then drop back to it; the faster arrow will just be somewhat farther downrange when it does.
At mid range, which I believe is your question, each arrow will be a certain distance above the line of sight. If you put your arrow tip on the target at half your point-on distance and shoot, the arrow will hit above the spot you held the tip on. Measure that distance and you will know how high above the line of sight your arrow flies on the way to its real point-on distance. Said another way, that is the highest point of the arrow's trajectory above the line of sight.
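If it helps, here is a rough back-of-the-envelope sketch of that idea. It uses a no-drag ("vacuum") trajectory on a level shot, and the 220 fps speed and 60 yd point-on distance are just made-up example numbers, so treat it as a ballpark, not gospel:

```python
import math

G = 32.174  # gravity, ft/s^2 (arrow speeds are usually quoted in feet per second)

def vacuum_trajectory(speed_fps, point_on_yd):
    """Level shot, no air drag: find the launch angle that brings the arrow
    back down to the line of sight at the point-on distance, then report
    flight time, peak height, and the height at half the point-on distance."""
    d = point_on_yd * 3.0                                # point-on distance in feet
    angle = 0.5 * math.asin(d * G / speed_fps ** 2)      # from d = v^2 * sin(2a) / g
    t = 2.0 * speed_fps * math.sin(angle) / G            # time to drop back to the sight line
    peak = (speed_fps * math.sin(angle)) ** 2 / (2.0 * G)  # highest point above the sight line
    x = d / 2.0                                          # half the point-on distance
    mid = x * math.tan(angle) - G * x ** 2 / (2.0 * (speed_fps * math.cos(angle)) ** 2)
    return t, peak, mid

t, peak, mid = vacuum_trajectory(220.0, 60.0)            # example numbers only
print(f"flight time {t:.2f} s, peak {peak*12:.1f} in, height at mid range {mid*12:.1f} in")
# The mid-range height comes out the same as the peak, and the peak is just
# g*t^2/8 -- which is why time of flight is what really sets how high the arrow arcs.
```

So in that simplified model, the tip-on-target shot at half range measures exactly the highest point of the trajectory, which is what I was describing above.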
If you shoot the slower arrow at the faster arrow's point-on distance, the slower arrow will arc substantially higher than the fast arrow, and you will be holding well above the target with it.
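To put a rough number on that last point, same no-drag math as above, with both (made-up) arrow speeds shot to the same 60 yd mark:

```python
# Same vacuum-trajectory math as above, but both arrows aimed at the same 60 yd mark.
import math

G = 32.174  # ft/s^2

def peak_height_ft(speed_fps, distance_yd):
    """Peak height above the line of sight for a level, no-drag shot to distance_yd."""
    d = distance_yd * 3.0
    angle = 0.5 * math.asin(d * G / speed_fps ** 2)
    return (speed_fps * math.sin(angle)) ** 2 / (2.0 * G)

for v in (220.0, 180.0):                                 # example speeds only
    print(f"{v:.0f} fps arrow: peaks about {peak_height_ft(v, 60.0)*12:.0f} in above the line of sight")
# For small angles the peak grows like g*d^2/(8*v^2), so a 180 fps arrow shot to a
# 220 fps arrow's point-on distance arcs roughly (220/180)^2, or about 1.5x, higher.
```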
As I said, interesting question, and one that can pretty easily be experimented with and measured. My math and physics major is 40+ years in the past and largely unused, so some of the more current physics types out there may explain it better. If you do the experiment, report back and let us know the results.