Even in the early days of film, they knew that humans really need more than 24 frames per second. After all, Edison suggested, I believe, 46 as the absolute minimum to prevent eye strain. That's why cinema projectors used multi-blade shutters to flash each frame two or three times, pushing the flicker rate to 48 or 72 Hz even though new images still arrived at only 24 per second. They probably stuck with such a low frame rate because of film cost. Televisions used 50 Hz in Europe and 60 Hz in the US, with interlacing to double the field rate (the difference comes down to grid frequency).
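The shutter trick above is just multiplication, but it's worth making explicit: flashing each frame multiple times raises the flicker frequency without adding any new images. A minimal sketch (the helper name is mine, not a real API):

```python
# Hypothetical helper: flicker rate the audience sees when a projector
# flashes each film frame several times via a multi-blade shutter.
def flicker_hz(frames_per_second: float, flashes_per_frame: int) -> float:
    """The light pulses at fps * flashes, even though genuinely new
    images still arrive at only fps per second."""
    return frames_per_second * flashes_per_frame

# 24 fps film behind a three-blade shutter: new images at 24 Hz,
# but flicker at 72 Hz, above most viewers' flicker-fusion threshold.
print(flicker_hz(24, 3))  # -> 72.0
```

This is why a 24 fps film could look smooth(ish) in a theater: flicker and motion updates are separate problems, and the shutter only fixed the first one.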
It's also been known for a long while that humans can perceive events even in the 600 Hz region. That might not be enough time to process the image in full, but it's enough for us to notice a significant change. Try picking up a book and letting the pages slip from under your thumb. Or, have you ever ridden a bicycle or driven a car past a tall picket fence? Have you noticed how easily you can see through it - but only when you move fast enough? Our brain is analog. I highly doubt it samples the output of our eyes at discrete intervals the way we do with cameras. We're just limited by the laws of physics and the processing capacity of our brain. I imagine we perceive a stream of images as motion because they arrive too fast and blur together - the image changes before our brain can finish processing the previous one. Our brain evolved to process a constantly changing picture. It stands to reason that the process is stream-based and might focus on changes.