As far as I am aware, OpenGL ES 2.0's shading language (GLSL ES 1.0) doesn't guarantee texture sampling in the vertex shader; an implementation can legally report zero vertex texture units. This means that using vertex displacement maps as keyframes (a very efficient method for doing morph animation) is not going to work. Calculating it on the CPU is out of the question. I found a post that suggested loading the whole lot into a massive set of vertex buffers, but with 20 key-frames your geometry buffers occupy 20x more GPU memory. At the calculation level, you only need 2 key-frames at any one time, and really these can just be a set of differences from the original mesh. Maybe it would be cheaper to send an array of key-frames in a uniform just after the shader is linked, rather than using per-vertex attributes. It seems more intuitive... will try.
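The two-bracketing-keyframes idea can be sketched on the CPU like this (hypothetical names; in practice the blend would run in the vertex shader, with the deltas supplied as per-vertex attributes or a uniform array):

```python
def morph(base, delta_a, delta_b, t):
    """Blend two keyframe deltas against the base mesh.

    base, delta_a, delta_b: flat lists of vertex coordinates;
    t: blend factor in [0, 1] between keyframe A and keyframe B.
    Only the two keyframes bracketing the current time are needed;
    the rest of the animation never has to be resident at once.
    """
    return [b + (1.0 - t) * da + t * db
            for b, da, db in zip(base, delta_a, delta_b)]

# Two vertices; keyframe B lifts vertex 0 by one unit.
base    = [0.0, 0.0, 0.0,  1.0, 0.0, 0.0]
delta_a = [0.0, 0.0, 0.0,  0.0, 0.0, 0.0]   # keyframe A = base mesh
delta_b = [0.0, 1.0, 0.0,  0.0, 0.0, 0.0]   # keyframe B's offsets

halfway = morph(base, delta_a, delta_b, 0.5)
print(halfway)  # -> [0.0, 0.5, 0.0, 1.0, 0.0, 0.0]
```

Storing deltas rather than full positions is what keeps the memory cost down: only the base mesh plus two small difference arrays are live per frame.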
Note to self - think about using Ambient Occlusion to improve face rendering...
So, I got a basic morph target demo working, with a couple of minor issues.
You can see my demo here (click image). I'll leave the bug in so you can see what happens (triangles near the top of the door are affected); otherwise it's fine. I'm relying on Blender3D to export my mesh here. It does have an "export animations" option, which might solve this consistency problem while still giving me separate .obj files. Edit: I fixed the vertex order problem by associating each animation with a bone in Blender. This will be a pain for facial animation, and it's a bit gross, but it might be the easiest way around the problem. It looks like the proper way to do this is to use the F-Curve editor in Blender, but I'm not sure its output would be picked up by the .obj exporter. I could also look at modifying the Python export script. Lots of work, very little pay-off.
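Since the bug came down to Blender exporting the keyframe .obj files inconsistently, a quick sanity check before loading could fail fast; a minimal sketch (hypothetical helper, assuming plain `v x y z` position lines in the Wavefront format):

```python
def obj_vertex_count(obj_text):
    """Count 'v' (vertex position) lines in Wavefront .obj text."""
    return sum(1 for line in obj_text.splitlines()
               if line.split() and line.split()[0] == "v")

# Keyframe exports of the same mesh must have matching vertex counts
# (and, for morphing, matching order), or blending tears triangles
# apart, like the ones near the top of the door.
frame_a = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"
frame_b = "v 0 0 0\nv 1 0 0\nf 1 2 1\n"

if obj_vertex_count(frame_a) != obj_vertex_count(frame_b):
    print("keyframe meshes disagree; re-export from Blender")
```

A matching count doesn't prove the ordering matches, but a mismatched count catches the worst exports before they reach the GPU.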