These are internally stored as 3D image textures, but accessible like e.g.
UV coordinates through the attribute node and getattribute().
This is convenient for rendering e.g. smoke objects, where data like density is
really a property of the mesh, and it avoids having to specify the smoke object
in a texture node; instead, the material will work with any smoke domain.
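
A minimal sketch of what such a lookup amounts to, with hypothetical C++
names standing in for the real attribute node / getattribute() path:

  /* Hypothetical sketch: resolving a named smoke attribute to a 3D image
   * texture lookup. Names and types here are illustrative, not the actual
   * Cycles API. */
  #include <map>
  #include <string>

  struct float3 { float x, y, z; };

  /* Stand-in for the renderer's 3D image texture fetch. */
  static float sample_texture_3d(int slot, float3 p)
  {
    (void)slot; (void)p;
    return 0.0f;
  }

  struct Mesh {
    /* attribute name -> slot of the 3D image texture storing its data */
    std::map<std::string, int> volume_attributes;
  };

  /* The material asks for e.g. "density" by name, so it works with any
   * smoke domain; no texture node points at a specific object. */
  static bool get_volume_attribute(const Mesh &mesh, const std::string &name,
                                   float3 texture_coord, float &value)
  {
    std::map<std::string, int>::const_iterator it =
        mesh.volume_attributes.find(name);
    if (it == mesh.volume_attributes.end())
      return false; /* attribute not stored on this mesh */

    value = sample_texture_3d(it->second, texture_coord);
    return true;
  }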
This does not support staying fixed while the surface deforms, but for static
meshes it should match up with the surface texture coordinates. It is
implemented as a matrix transform from object space to mesh texture space.
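
As a rough illustration, assuming texture space is the unit cube over the
object-space bounding box (the helper below is a sketch, not the actual code):

  /* Sketch: object space -> mesh texture space, assuming texture space is
   * the unit cube over the object-space bounds. In practice this folds
   * into a single matrix applied per lookup. */
  struct float3 { float x, y, z; };

  static float3 object_to_texture(float3 p, float3 bb_min, float3 bb_max)
  {
    /* componentwise (p - bb_min) / (bb_max - bb_min) */
    float3 r;
    r.x = (p.x - bb_min.x) / (bb_max.x - bb_min.x);
    r.y = (p.y - bb_min.y) / (bb_max.y - bb_min.y);
    r.z = (p.z - bb_min.z) / (bb_max.z - bb_min.z);
    return r;
  }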
Making this work for deforming surfaces would be quite complicated; you might
need something like harmonic coordinates, as used in the Mesh Deform modifier,
so this probably will not be possible anytime soon.
There should be no functional changes yet. UV, tangent and intercept are now
stored as attributes, with the intention to later add more, such as multiple
UVs, vertex colors, generated coordinates and motion vectors.
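
The storage this implies could look roughly like the sketch below; the types
and enum values are hypothetical, not the actual classes:

  /* Sketch: generic attribute storage, keyed by a standard attribute type
   * or a custom name. Hypothetical types. */
  #include <string>
  #include <vector>

  enum AttributeStandard {
    ATTR_STD_NONE,
    ATTR_STD_UV,
    ATTR_STD_TANGENT,
    ATTR_STD_CURVE_INTERCEPT
  };

  struct Attribute {
    std::string name;
    AttributeStandard std;
    std::vector<float> data; /* flat storage; stride depends on data type */
  };

  struct AttributeSet {
    std::vector<Attribute> attributes;

    Attribute *find(AttributeStandard std)
    {
      for (size_t i = 0; i < attributes.size(); i++)
        if (attributes[i].std == std)
          return &attributes[i];
      return nullptr;
    }
  };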
Things got a bit messy due to having both triangle and curve data in the same
mesh data structure, which also gives us two sets of attributes. This will get
cleaned up when we split the mesh class.
Most of the changes are related to adding support for motion data throughout
the code. There's some code for actual camera/object motion blur raytracing,
but it's unfinished (it badly slows down the raytracing kernel even when the
option is turned off), so that code is still disabled.
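
The core operation that support implies is blending motion data by the ray's
time within the shutter interval; a simplified sketch:

  /* Sketch: evaluating an object's transform at a ray's shutter time t in
   * [0, 1] by blending shutter-open and shutter-close matrices. A real
   * implementation would decompose into translation/rotation/scale so
   * rotation interpolates correctly; entrywise blending just shows the
   * idea. */
  struct Transform { float m[3][4]; };

  static Transform transform_blend(const Transform &pre, const Transform &post,
                                   float t)
  {
    Transform r;
    for (int i = 0; i < 3; i++)
      for (int j = 0; j < 4; j++)
        r.m[i][j] = pre.m[i][j] * (1.0f - t) + post.m[i][j] * t;
    return r;
  }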
Motion vector export from Blender tries to avoid computing derived meshes
when the mesh does not have a deforming modifier, and it also won't store
motion vectors for every vertex if only the object or camera is moving.
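
That decision reduces to something like the following sketch; all helper
names are hypothetical, not the actual exporter API:

  /* Sketch of the export decision: per-vertex motion only when the mesh
   * actually deforms; pure object/camera motion needs only matrices. */
  struct Object { bool has_deforming_modifiers; };
  struct Mesh { };

  static void export_vertex_motion(Mesh &, const Object &)
  {
    /* compute derived meshes at shutter times, store per-vertex vectors */
  }

  static void export_transform_motion(Mesh &, const Object &)
  {
    /* store pre/post object (or camera) matrices only */
  }

  static void export_motion(Mesh &mesh, const Object &ob)
  {
    if (ob.has_deforming_modifiers)
      export_vertex_motion(mesh, ob);
    else
      export_transform_motion(mesh, ob); /* skips derived mesh entirely */
  }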