Hi everybody,

We have several AXIS M3044-V IP cameras watching the same open space, and we would like to get the timestamp of the exact moment each camera captures its frames. The cameras need to stay aligned over time because we geometrically triangulate points across views. We read the camera streams with OpenCV and then process them (roughly as in the first sketch at the end of this post).

As far as we know there is no way to embed the capture timestamp in the stream itself (regardless of whether it is MJPEG or MP4). The only solution we found is to synchronize the cameras against the same local NTP server and overlay the timestamp on the frame itself. This way we know the time of capture, not the time of receipt (good). Up to 10 fps this approach seems to work quite well at first glance. The problems arise when we increase the frame rate. These are the main issues:

1) We couldn't find a proper way to trigger the camera shutter over the network. (I know it's not the right way of doing this because of network transmission delay, but even a 2-3 ms delay due to the network communication would be gold for us.) Did you find a nice way to do this?

2) Even though we configured the cameras to take their time from the same local NTP server, their internal clocks still drift on the order of 10 ms. Is this something that can be fixed from a Linux kernel point of view? (See the second sketch below for the kind of server setup we mean.)

3) (This is the most unbelievable one.) If the frame rate is higher than 10 fps, the camera may overlay the same timestamp on successive frames. That is, two consecutive frames captured 40-60 ms apart can carry the same overlaid timestamp, even though the timestamp has a granularity of hundredths of a second. (The last sketch below shows the kind of check that makes this visible.)

Do you have any idea how we could fix this situation?
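For reference, this is roughly how we read the streams. A minimal sketch, assuming one camera and OpenCV's FFmpeg backend; the RTSP URL, credentials, and IP address are placeholders, not our real ones:

```python
import time
import cv2

# Placeholder RTSP URL for one AXIS camera (user, password and IP are made up).
URL = "rtsp://user:password@192.168.1.10/axis-media/media.amp"

cap = cv2.VideoCapture(URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # time.time() here is only the *receive* time on our machine; that is
    # exactly why we rely on the timestamp the camera overlays on the frame
    # to recover the *capture* time instead.
    recv_ts = time.time()
    # ... undistort, match, triangulate across views ...
cap.release()
```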
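On point 2, for context: by "the same local NTP server" we mean a single NTP instance on the LAN that all cameras point to. An illustrative chrony config, not our exact one (the subnet and stratum values are placeholders):

```
# /etc/chrony.conf on the local time server (illustrative values)
# Upstream public pool; the cameras never talk to the internet directly.
pool pool.ntp.org iburst
# Serve time to the camera subnet (placeholder subnet).
allow 192.168.1.0/24
# Keep answering LAN clients even when the upstream is unreachable.
local stratum 10
```

Even with a setup like this, the cameras' clocks still wander by ~10 ms between corrections, which is what the question is about.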
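On point 3, a rough sketch of the kind of check that makes the duplication visible mechanically: it compares the overlay region of two consecutive frames pixel-by-pixel. The ROI coordinates are placeholders for wherever the timestamp sits in your frames:

```python
import cv2
import numpy as np

def overlay_changed(prev_frame, frame, roi=(0, 0, 300, 40), tol=1.0):
    """Return True if the overlaid-timestamp region differs between frames.

    roi is (x, y, w, h) of the timestamp overlay; the default is a placeholder.
    """
    x, y, w, h = roi
    a = cv2.cvtColor(prev_frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    b = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    # Mean absolute pixel difference over the overlay region; near zero means
    # the rendered timestamp is pixel-identical on both frames.
    return float(np.mean(cv2.absdiff(a, b))) > tol
```

Above 10 fps, consecutive frames that we know were captured 40-60 ms apart come back pixel-identical in the timestamp region, which is the behavior described in point 3.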