Commit Graph

9 Commits

Author SHA1 Message Date
Campbell Barton
8b57f7502b code cleanup: gpl header update (formatting) 2012-11-18 00:30:06 +00:00
Campbell Barton
9ecc6fdcc7 style cleanup 2012-08-23 07:10:48 +00:00
Campbell Barton
80dca0a06d style cleanup: consistent names for header guards. 2012-03-09 19:17:19 +00:00
Campbell Barton
89a963fb7f style cleanup: comment blocks 2012-03-09 18:28:30 +00:00
Campbell Barton
4a04f72069 remove $Id: tags after discussion on the mailing list: http://markmail.org/message/fp7ozcywxum3ar7n 2011-10-23 17:52:20 +00:00
Nathan Letwory
1f4fc992ef doxygen: bge scenegraph and videotexture 2011-02-22 19:30:37 +00:00
Benoit Bolsee
71f7e50451 VideoTexture: optional arguments to ImageBuff constructor.
ImageBuff([width,height[,color[,scale]]])

width, height: size of the buffer in pixels.
               default: buffer not allocated.
color: initial value of the RGB channels. The alpha channel is set to 255.
       Possible values: 0 (black, default) -> 255 (white)
scale: True or False to enable or disable fast scaling.
       default: False

This constructor eliminates the need to use the load function
when you just want to initialize the image buffer to black or white.
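For illustration, a minimal sketch of the new constructor (the buffer size and grey level are arbitrary, and the GameLogic attribute name is a hypothetical placeholder):

import VideoTexture

# Sketch only: create a 64x64 buffer pre-filled with mid-grey (128)
# and fast scaling enabled; no load() call is needed.
GameLogic.imgbuff = VideoTexture.ImageBuff(64, 64, 128, True)

In practice the buffer would then be assigned as the source of a VideoTexture.Texture object, like any other image source.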
2010-02-26 22:14:31 +00:00
Benoit Bolsee
445d077cf4 BGE: Add plot method to VideoTexture.ImageBuff class.
Synopsis: plot(brush,width,height,x,y,mode)
          plot(imgbuff,x,y,mode)

The first form uses a byte array containing the brush shape.
The second form uses another ImageBuff object as a brush.
The ImageBuff object must be initialized before you can call
these methods. Use the load(rgb_buffer, sizex, sizey) method to create
an image buffer of a given size (with the alpha channel set to 255).
The brush is plotted directly in the image buffer. The texture
is updated only when the VideoTexture.Texture parent object is
refreshed; this uploads the image buffer to the GPU.

brush:  Byte array containing RGBA data to be plotted in the image buffer.
        The data must be contiguous in memory, organized row by row
        starting from the lower left corner of the image. Each pixel is
        4 bytes representing RGBA data in that order.
width:  Horizontal size in pixels of the image in brush.
height: Vertical size in pixels of the image in brush.
imgbuff: Another ImageBuff object that is used as a brush. The object
        must have been initialized first with load().
x:      Horizontal position in pixels from the left side of the image buffer
        where the brush will be plotted. The brush is plotted on pixel
        positions x -> x+width-1. Clipping is performed if the brush falls
        partially outside the image buffer.
y:      Vertical position in pixels from the bottom side of the image buffer
        where the brush will be plotted.
mode:   Mode of drawing. Use one of the following values:
        0 : MIX
        1 : ADD
        2 : SUB
        3 : MUL
        4 : LIGHTEN
        5 : DARKEN
        6 : ERASE ALPHA
        7 : ADD ALPHA
        1000 : COPY RGBA (default)
        1001 : COPY RGB
        1002 : COPY ALPHA

        Modes 0 to 7 are 'blend' modes: the brush pixels are combined
        with the image pixels in various ways. Refer to the Blender
        documentation to learn more about these modes.
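For illustration, a sketch along these lines (the buffer and brush sizes are arbitrary, and load() is assumed to accept raw RGB bytes as described above):

import VideoTexture

# Sketch only: create a black 64x64 buffer, then plot an 8x8 white
# square at (10, 20) using mode 1000 (COPY RGBA).
imgbuff = VideoTexture.ImageBuff()
imgbuff.load(b'\x00' * (64 * 64 * 3), 64, 64)  # RGB data, alpha set to 255
brush = b'\xff' * (8 * 8 * 4)                  # 8x8 RGBA brush, all white
imgbuff.plot(brush, 8, 8, 10, 20, 1000)

The plotted result only becomes visible once the parent VideoTexture.Texture object is refreshed, as noted above.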
2009-12-08 10:02:22 +00:00
Benoit Bolsee
a8c4eef326 VideoTexture module.
The only compilation system that is known to work is the MSVC project files. I've tried my best to
update the other compilation systems, but I count on the community to check and fix them.
 
This is Zdeno Miklas' video texture plugin ported to trunk.
The original plugin API is maintained (it can be found here: http://home.scarlet.be/~tsi46445/blender/blendVideoTex.html)
EXCEPT for the following:

The module name is changed to VideoTexture (instead of blendVideoTex).

A new (and currently the only) video source is available: VideoFFmpeg()
You must pass 1 to 5 arguments when you create it (you can use named arguments):

VideoFFmpeg(file) : play a video file
VideoFFmpeg(file, capture, rate, width, height) : start a live video capture

file:
In the first form, file is a video file name, relative to the startup directory.
It can also be a URL; FFmpeg will happily stream a video from a network source.
In the second form, file is empty or is a hint for the format of the video capture.
On Windows, file is ignored and should be empty or not specified.
On Linux, FFmpeg supports two types of devices: VideoForLinux and DV1394.
The user specifies the type of device with the file parameter:
   [<device_type>][:<standard>]
   <device_type> : 'v4l' for VideoForLinux, 'dv1394' for DV1394; defaults to 'v4l'
   <standard>    : 'pal', 'secam' or 'ntsc'; defaults to 'ntsc'
The driver name is constructed automatically from the device type:
   v4l   : /dev/video<capture>
   dv1394: /dev/dv1394/<capture>
If you have a different driver name, you can specify the driver name explicitly
instead of the device type. Examples of valid file parameter values:
   /dev/v4l/video0:pal
   /dev/ieee1394/1:ntsc
   dv1394:ntsc
   v4l:pal
   :secam

capture: 
Defines the index number of the capture source, starting from 0. The first capture device is always 0.
The VideoTexture module knows that you want to start a live video capture when you set this parameter to a number >= 0. Setting this parameter to a number < 0 indicates video file playback. The default value is -1.

rate: 
The capture frame rate, 25 frames/sec by default.

width: 
height: 
Width and height of the video capture in pixels; default value 0.
On Windows you must specify these values, and they must match the capture device's capabilities.
For example, if you have a webcam that can capture at 160x120, 320x240 or 640x480,
you must specify one of these pairs of values or opening the video source will fail.
On Linux, default values are provided by the VideoForLinux driver if you don't specify width and height.
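For illustration, a live capture source might be created along these lines (the device string, rate and dimensions are assumptions for a hypothetical Linux webcam; GameLogic.video is a VideoTexture.Texture set up as in the example below):

import VideoTexture

# Sketch only: open the first VideoForLinux capture device in PAL,
# at 25 frames/sec and 640x480 (the size must match the device).
GameLogic.vidSrc = VideoTexture.VideoFFmpeg('v4l:pal', 0, 25, 640, 480)
GameLogic.video.source = GameLogic.vidSrc
GameLogic.vidSrc.play()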

Simple example
**************
1. Texture definition script:

import VideoTexture

contr = GameLogic.getCurrentController()
obj = contr.getOwner()
if not hasattr(GameLogic, 'video'):
	matID = VideoTexture.materialID(obj, 'MAVideoMat')
	GameLogic.video = VideoTexture.Texture(obj, matID)
	GameLogic.vidSrc = VideoTexture.VideoFFmpeg('trailer_400p.ogg')
	# Streaming is also possible:
	#GameLogic.vidSrc = VideoTexture.VideoFFmpeg('http://10.32.1.10/trailer_400p.ogg')
	GameLogic.vidSrc.repeat = -1
	# If the video dimensions are not a power of 2, scaling must be done before
	# sending the texture to the GPU. This is done by default with gluScaleImage()
	# but you can also use a faster, but less precise, scaling by setting scale
	# to True. Best approach is to convert the video offline and set the dimensions right.
	GameLogic.vidSrc.scale = True
	# FFmpeg always delivers the video image upside down, so flipping is enabled automatically
	#GameLogic.vidSrc.flip = True

if contr.getSensors()[0].isPositive():
	GameLogic.video.source = GameLogic.vidSrc
	GameLogic.vidSrc.play()


2. Texture refresh script:

obj = GameLogic.getCurrentController().getOwner()
if hasattr(GameLogic, 'video'):
	GameLogic.video.refresh(True)

You can download this demo here: 
http://home.scarlet.be/~tsi46445/blender/VideoTextureDemo.blend
http://home.scarlet.be/~tsi46445/blender/trailer_400p.ogg
2008-10-31 22:35:52 +00:00