If you want to do any multi-pass rendering or post-processing technique in OpenGL you need to set up a secondary framebuffer. Your first pass renders the normal 3d scene into that framebuffer, where you typically store the fragment output in a colour texture sized to fit your viewport. In a subsequent pass you might add a blur or some other image-processing effect: switching back to the default framebuffer (the one normally in use), you render a quad covering the screen and sample the texture written in the first pass.
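As a rough sketch of that first-pass setup (variable names like `vp_width` are mine, and error checking is omitted):

```c
// example viewport size -- match this to your actual window
int vp_width = 800, vp_height = 600;

// create a secondary framebuffer with a colour texture attached
GLuint fbo = 0, colour_tex = 0;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

glGenTextures(1, &colour_tex);
glBindTexture(GL_TEXTURE_2D, colour_tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, vp_width, vp_height, 0, GL_RGBA,
             GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       colour_tex, 0);

// first pass: render the scene with fbo bound
// second pass: bind the default framebuffer (0), bind colour_tex,
//              and draw a full-screen quad that samples it
```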
To make sure that you still have depth testing in your secondary framebuffer (so that farther-away things aren't drawn over the top of closer things), you need to attach an additional depth buffer to it, because it's no longer using the default depth buffer. There are 2 apparent options for this in OpenGL: attach a texture with a depth format, or attach a Renderbuffer with a depth format.
The OpenGL wiki suggests using the texture when you need to sample the depth values later on (such as in shadow mapping), and using the Renderbuffer for everything else. From memory, I think David Wolff's book does this too. Anyway, I always thought "renderbuffer" was a bit odd-sounding for something that was basically a depth buffer, and frankly I spend most of my time with students explaining the weird naming conventions of GL, so I asked about it on Twitter this morning.
Apparently the 'renderbuffer' name is a throwback to support for an earlier implementation of MSAA (multi-sample anti-aliasing). As suggested, I tried both options. I took a simple framebuffer switching demo as described here and put an if statement in to try each one, right after where I build my secondary framebuffer:
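Something like this, roughly (the flag name `use_depth_texture` and the size variables are mine, not from the original demo, and the secondary framebuffer is assumed to be already bound):

```c
// example viewport size -- should match the colour attachment
int vp_width = 800, vp_height = 600;
bool use_depth_texture = true; // flip to try the renderbuffer path

GLuint depth_tex = 0, depth_rb = 0;
if (use_depth_texture) {
  // option 1: a depth-format texture -- can be sampled in a later pass
  glGenTextures(1, &depth_tex);
  glBindTexture(GL_TEXTURE_2D, depth_tex);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32, vp_width, vp_height, 0,
               GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D,
                         depth_tex, 0);
} else {
  // option 2: a renderbuffer -- write-only as far as your shaders are concerned
  glGenRenderbuffers(1, &depth_rb);
  glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
  glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT32, vp_width,
                        vp_height);
  glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                            GL_RENDERBUFFER, depth_rb);
}
```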
I set up a scene with 2 spheres, one behind the other, to make sure that depth testing was working, and swapped the rendering order around to be sure - turns out I had forgotten to enable depth testing first... easy to fix!
I expected the texture might be a bit slower, but it wasn't - no big difference. In newer OpenGL (4.2, I think?) you can use the generic texture storage functions instead, which are a bit less verbose, so the set-up cost isn't a big deal either. It's pretty much the same sort of thing as the Renderbuffer, except that the name is less ambiguous than "renderbuffer" (what buffer isn't used in rendering?), and if you want to you can sample the texture later. Apparently OpenGL 3.2 and newer can handle multisampled depth textures too. I don't think WebGL has depth textures yet - I recall doing something gross to store depths in a colour texture to get shadows working.

What I would really prefer (to keep it simple) would be something like: glEnableDepthTesting (my_fbo, 32); and just not expose the messy bits at all. It would be trivial to write this for myself of course, but if it were like this to begin with it would be easier to learn.
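A minimal sketch of that kind of wrapper - to be clear, glEnableDepthTesting is a wished-for name, not a real GL function, and this version hides a renderbuffer behind it:

```c
// hypothetical convenience wrapper -- NOT part of OpenGL.
// attaches a depth buffer of (roughly) the requested bit depth to an FBO
// and turns depth testing on. width/height should match the colour attachment.
void glEnableDepthTesting(GLuint fbo, int bits, int width, int height) {
  GLenum format = (bits >= 32) ? GL_DEPTH_COMPONENT32 : GL_DEPTH_COMPONENT24;
  GLuint rb = 0;
  glGenRenderbuffers(1, &rb);
  glBindRenderbuffer(GL_RENDERBUFFER, rb);
  glRenderbufferStorage(GL_RENDERBUFFER, format, width, height);
  glBindFramebuffer(GL_FRAMEBUFFER, fbo);
  glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                            GL_RENDERBUFFER, rb);
  glEnable(GL_DEPTH_TEST);
}
```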