You can't compare video games with movies in the framerate debate.
Movies benefit from something known as natural motion blur, something that games don't have.
Every frame of a movie is exposed over a short stretch of time, so it visually blends into the previous frame; in games, every frame is a perfect snapshot of a single instant. As an effect, in the linked image you can see the hills in the background move gradually on the left (movies), while they basically jump from frame to frame on the right (video games).
It's possible to simulate natural motion blur in CGI, but as of now it is too demanding for video games, which is why they go for higher frame rates instead: to minimize the "jumping distance" of objects between frames.
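To put rough numbers on that "jumping distance" (a back-of-the-envelope sketch with a made-up pan speed, not figures from any actual game):

```python
# Hypothetical example: an object panning across a 1920 px wide screen
# in one second. How far does it "jump" between two consecutive frames?
screen_width_px = 1920

for fps in (24, 30, 60):
    jump_px = screen_width_px / fps
    print(f"{fps} fps -> {jump_px:.0f} px per frame")

# 24 fps -> 80 px, 30 fps -> 64 px, 60 fps -> 32 px.
# Film at 24 fps smears each 80 px step with shutter blur; a game shows
# each step as a hard jump, so raising the fps is how it shrinks the jump.
```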
Ru Weerasuriya has some good points about breaking with the past and the use of MSAA, but neither is a replacement for 60 Hz (or more). The truth is that the PS4 simply doesn't have the power to do 60 fps, MSAA, and the content streaming the game's pace requires, all at the same time.
If they had really chosen 30 fps for the cinematic effect*, they would have spent a great part of their effects development on a good natural motion blur shader for the game. A shader that, to my knowledge, doesn't exist as of now, since all current motion blur effects still show steps and jumps in the result due to the discrete nature of real-time rendering.
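For the curious, here's roughly what a "natural" blur has to do, sketched in Python rather than an actual shader (this is my own brute-force illustration, not any engine's implementation): render many sub-frame snapshots inside one frame's shutter interval and average them, which is exactly why it's too expensive in real time. The scene, resolution, and sample count are all made up:

```python
import numpy as np

WIDTH, SUBSAMPLES = 64, 16           # hypothetical resolution / samples per frame

def render_snapshot(t):
    """Render a 1D scanline at instant t (object position in pixels)."""
    frame = np.zeros(WIDTH)
    frame[int(t) % WIDTH] = 1.0      # perfect snapshot: one hard, unblurred dot
    return frame

def render_blurred(t0, t1):
    """Average snapshots taken across the shutter interval [t0, t1)."""
    times = np.linspace(t0, t1, SUBSAMPLES, endpoint=False)
    return np.mean([render_snapshot(t) for t in times], axis=0)

sharp = render_snapshot(10.0)          # what a game frame shows: one crisp dot
blurred = render_blurred(10.0, 18.0)   # what film shows: energy smeared over 8 px
print(np.nonzero(sharp)[0], np.nonzero(blurred)[0])
```

Doing that for real means rendering the scene many times per displayed frame; the cheap post-process blurs games use instead sample along a velocity vector and are where the visible steps and banding come from.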
*Let's be honest, cinema is not something you should aim for in visual quality. Cinema makes the best of the technology it has at hand, but that doesn't mean it's the best in the world, or universally the best. As I hope I've shown, video games especially should not strive to follow cinema in every aspect, but remember that they are their own medium.