/*
VideoTexture module.

The only build system that is known to work for sure is the MSVC project files.
I have done my best to update the other build systems, but I count on the
community to check and fix them.

This is Zdeno Miklas' video texture plugin ported to trunk.
The original plugin API is maintained (it can be found at
http://home.scarlet.be/~tsi46445/blender/blendVideoTex.html)
EXCEPT for the following:

The module name is changed to VideoTexture (instead of blendVideoTex).

A new (and only) video source is now available: VideoFFmpeg()
You must pass 1 to 4 arguments when you create it (you can use named arguments):
  VideoFFmpeg(file)                               : play a video file
  VideoFFmpeg(file, capture, rate, width, height) : start a live video capture

file:
  In the first form, file is a video file name, relative to the startup directory.
  It can also be a URL; FFmpeg will happily stream a video from a network source.
  In the second form, file is empty or is a hint for the format of the video capture.
  On Windows, file is ignored and should be empty or not specified.
  On Linux, FFmpeg supports two types of device: Video4Linux and DV1394.
  The user specifies the type of device with the file parameter:
    [<device_type>][:<standard>]
    <device_type> : 'v4l' for Video4Linux, 'dv1394' for DV1394; defaults to 'v4l'
    <standard>    : 'pal', 'secam' or 'ntsc'; defaults to 'ntsc'
  The driver name is constructed automatically from the device type:
    v4l    : /dev/video<capture>
    dv1394 : /dev/dv1394/<capture>
  If you have a different driver name, you can specify the driver name explicitly
  instead of the device type. Examples of valid file parameters:
    /dev/v4l/video0:pal
    /dev/ieee1394/1:ntsc
    dv1394:ntsc
    v4l:pal
    :secam

capture:
  Defines the index number of the capture source, starting from 0. The first
  capture device is always 0. The VideoTexture module knows that you want to
  start a live video capture when you set this parameter to a number >= 0;
  setting it to a number < 0 indicates video file playback. The default value is -1.

rate:
  The capture frame rate, 25 frames/sec by default.

width:
height:
  Width and height of the video capture in pixels, 0 by default.
  On Windows you must specify these values and they must match one of the capture
  device's capabilities. For example, if you have a webcam that can capture at
  160x120, 320x240 or 640x480, you must specify one of these pairs of values or
  opening the video source will fail.
  On Linux, default values are provided by the Video4Linux driver if you do not
  specify width and height.

Simple example
**************

1. Texture definition script:

  import VideoTexture

  contr = GameLogic.getCurrentController()
  obj = contr.getOwner()
  if not hasattr(GameLogic, 'video'):
      matID = VideoTexture.materialID(obj, 'MAVideoMat')
      GameLogic.video = VideoTexture.Texture(obj, matID)
      GameLogic.vidSrc = VideoTexture.VideoFFmpeg('trailer_400p.ogg')
      # Streaming is also possible:
      #GameLogic.vidSrc = VideoTexture.VideoFFmpeg('http://10.32.1.10/trailer_400p.ogg')
      GameLogic.vidSrc.repeat = -1
      # If the video dimensions are not a power of 2, scaling must be done before
      # sending the texture to the GPU. This is done by default with gluScaleImage(),
      # but you can also use a faster, less precise scaling by setting scale to True.
      # The best approach is to convert the video offline with the right dimensions.
      GameLogic.vidSrc.scale = True
      # FFmpeg always delivers the video image upside down, so flipping is enabled
      # automatically:
      #GameLogic.vidSrc.flip = True

  if contr.getSensors()[0].isPositive():
      GameLogic.video.source = GameLogic.vidSrc
      GameLogic.vidSrc.play()

2. Texture refresh script:

  obj = GameLogic.getCurrentController().getOwner()
  if hasattr(GameLogic, 'video'):
      GameLogic.video.refresh(True)

You can download this demo here:
  http://home.scarlet.be/~tsi46445/blender/VideoTextureDemo.blend
  http://home.scarlet.be/~tsi46445/blender/trailer_400p.ogg

-----------------------------------------------------------------------------
This source file is part of VideoTexture library

Copyright (c) 2006 The Zdeno Ash Miklas

This program is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the Free Software
Foundation; either version 2 of the License, or (at your option) any later
version.

This program is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with
this program; if not, write to the Free Software Foundation, Inc., 59 Temple
Place - Suite 330, Boston, MA 02111-1307, USA, or go to
http://www.gnu.org/copyleft/lesser.txt.
-----------------------------------------------------------------------------
*/

/** \file Exception.h
 *  \ingroup bgevideotex
 */

#if !defined EXCEPTION_H
#define EXCEPTION_H

#include <exception>
#include <vector>
#include <string>
#include <algorithm>

#include "Common.h"


#define CHCKHRSLTV(fnc,val,err) \
{ \
	HRESULT macroHRslt = (fnc); \
	if (macroHRslt != val) \
		throw Exception (err, macroHRslt, __FILE__, __LINE__); \
}

#define THRWEXCP(err,hRslt) throw Exception (err, hRslt, __FILE__, __LINE__);


#if defined WIN32
#define CHCKHRSLT(fnc,err) \
{ \
	HRESULT macroHRslt = (fnc); \
	if (FAILED(macroHRslt)) \
		throw Exception (err, macroHRslt, __FILE__, __LINE__); \
}
#else
#define CHCKHRSLT(fnc,err) CHCKHRSLTV(fnc,S_OK,err)
#endif
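
/* Usage sketch for the macros above (illustrative only; someCOMCall(),
 * someCall() and expectedValue are hypothetical, ErrGeneral and ErrNotFound
 * are the exception identifiers declared below):
 *
 *   // throw ErrGeneral if the call does not succeed (does not return S_OK
 *   // outside WIN32)
 *   CHCKHRSLT(someCOMCall(), ErrGeneral);
 *
 *   // throw ErrGeneral if the call does not return a specific expected value
 *   CHCKHRSLTV(someCall(), expectedValue, ErrGeneral);
 *
 *   // throw unconditionally, forwarding an explicit result code
 *   THRWEXCP(ErrNotFound, S_OK);
 */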


// forward declarations
class ExceptionID;
class Exception;


// exception identifiers
extern ExceptionID ErrGeneral, ErrNotFound;


// result type
typedef long RESULT;


// class ExceptionID for exception identification
class ExceptionID
{
public:
	// constructor and destructor
	ExceptionID (void) {}
	~ExceptionID (void) {}

private:
	// not allowed
	ExceptionID (const ExceptionID & obj) throw() {}
	ExceptionID & operator= (const ExceptionID & obj) throw() { return *this; }
};


// class ExpDesc for exception description
class ExpDesc
{
public:
	// constructor and destructor
	ExpDesc (ExceptionID & exp, const char * desc, RESULT hres = S_OK);
	~ExpDesc (void);

	// comparison function
	// returns 0 if the exception identification doesn't match at all
	// returns 1 if only the exception identification matches
	// returns 2 if both the exception identification and the result match
	int isExp (ExceptionID * exp, RESULT hres = S_OK) throw()
	{
		// check exception identification
		if (&m_expID == exp)
		{
			// check result value
			if (m_hRslt == hres) return 2;
			// only identification matches
			if (m_hRslt == S_OK) return 1;
		}
		// no match
		return 0;
	}

	// get exception description
	void loadDesc (std::string & desc) throw()
	{
		desc = m_description;
	}

	// register this description in the global list (duplicates are skipped)
	void registerDesc(void)
	{
		if (std::find(m_expDescs.begin(), m_expDescs.end(), this) == m_expDescs.end())
			m_expDescs.push_back(this);
	}

	// list of exception descriptions
	static std::vector<ExpDesc*> m_expDescs;

private:
	// exception ID
	ExceptionID & m_expID;
	// result
	RESULT m_hRslt;
	// description
	const char * m_description;

	// not allowed
	ExpDesc (const ExpDesc & obj) : m_expID (ErrNotFound) {}
	ExpDesc & operator= (const ExpDesc & obj) { return *this; }
};
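
/* A minimal sketch of how an identifier/description pair is meant to be set up
 * (ErrFileOpen is a hypothetical name; the real identifiers and descriptions
 * are declared elsewhere in this header and defined in the VideoTexture sources):
 *
 *   ExceptionID ErrFileOpen;
 *   ExpDesc ErrFileOpenDesc (ErrFileOpen, "Cannot open the requested file");
 *
 *   // make the description findable for exception reporting, normally done
 *   // once at module initialization (see registerAllExceptions() below)
 *   ErrFileOpenDesc.registerDesc();
 */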


// class Exception
class Exception : public std::exception
{
public:
	// constructor
	Exception ();
	// destructor
	virtual ~Exception () throw();
	// copy constructor
	Exception (const Exception & xpt);
	// assignment operator
	Exception & operator= (const Exception & xpt);
	// get exception description
	virtual const char * what(void);

	// debug version of constructor
	Exception (ExceptionID & expID, RESULT rslt, const char * fil, int lin);
	// set source file and line of exception
	void setFileLine (const char * fil, int lin);

	// get description string
	std::string & getDesc (void) throw() { return m_desc; }

	// report exception
	virtual void report (void);

	// get exception id
	ExceptionID * getID (void) throw() { return m_expID; }

	/// last exception description
	static std::string m_lastError;

	/// log file name
	static const char * m_logFile;

protected:
	// exception identification
	ExceptionID * m_expID;
	// RESULT code
	RESULT m_hRslt;

	// exception description
	std::string m_desc;

	// set exception description
	virtual void setXptDesc (void);

	// copy exception
	void copy (const Exception & xpt);

	// file name where the exception was thrown
	std::string m_fileName;
	// line number in the file
	int m_line;
};
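
/* A minimal sketch of the intended throw/report pattern (doSomething() is a
 * hypothetical call; everything else uses declarations from this header):
 *
 *   try
 *   {
 *       if (!doSomething())
 *           THRWEXCP(ErrGeneral, S_OK);
 *   }
 *   catch (Exception & exp)
 *   {
 *       // report() presumably records the description (see m_lastError and
 *       // m_logFile above) and prints or logs it
 *       exp.report();
 *   }
 */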

extern ExpDesc MaterialNotAvailDesc;
extern ExpDesc ImageSizesNotMatchDesc;

/* VideoTexture: improvements to the image data access API.
 *  - Use a BGL buffer instead of a string for image data.
 *  - Add a buffer interface to image sources.
 *  - Allow customization of the pixel format.
 *  - Add a 'valid' property to check whether the image data is available.
 *
 * The image property of all Image source objects now returns a BGL 'buffer'
 * object. Previously it returned a string, which did not work at all with
 * Python 3.1. The BGL buffer type allows sequence access to bytes and is
 * directly usable in BGL OpenGL wrapper functions. The buffer is formatted as
 * a one-dimensional array of bytes with 4 bytes per pixel in RGBA order.
 * BGL buffers are also accepted in the ImageBuff load() and plot() functions.
 *
 * It is possible to customize the pixel format by using the
 * VideoTexture.imageToArray(image, mode) function: the first argument is an
 * Image source object, the second optional argument is a format string using
 * the R, G, B, A, 0 and 1 characters. For example "BGR" means that each pixel
 * will be 3 bytes, corresponding to the Blue, Green and Red channels in that
 * order. Use 0 for a fixed hex 00 value, 1 for hex FF. The default mode is
 * "RGBA".
 *
 * All Image source objects now support the buffer interface, which allows
 * creating memoryview objects for direct access to the image's internal
 * buffer without a memory copy. The buffer format is a one-dimensional array
 * of bytes with 4 bytes per pixel in RGBA order. The buffer is writable,
 * which allows custom modification of the image data:
 *
 *   v = memoryview(source)
 *
 * A bug in the Python 3.1 buffer API will cause a crash if the memoryview
 * object cannot be created. Therefore, you must always check that image data
 * is available before creating a memoryview object. Use the new 'valid'
 * attribute for that:
 *
 *   if source.valid:
 *       v = memoryview(source)
 *       ...
 *
 * Note: the BGL buffer object itself does not yet support the buffer interface.
 * Note: the 'valid' attribute makes sense only if you use an image source in
 * conjunction with a texture object, like this:
 *
 *   # refresh the texture but keep the image data in memory
 *   texture.refresh(False)
 *   if texture.source.valid:
 *       v = memoryview(texture.source)
 *       # process the image
 *       ...
 *   # invalidate the image for the next texture refresh
 *   texture.source.refresh()
 *
 * Limitation: while memoryview objects exist, the image cannot be resized.
 * Resizing occurs with ImageViewport objects when the viewport size is
 * changed, or with ImageFFmpeg when a new image is loaded, for example. Any
 * attempt to resize will cause a runtime error. Delete the memoryview objects
 * if you want to resize an image source object.
 */
extern ExpDesc ImageHasExportsDesc;
extern ExpDesc InvalidColorChannelDesc;
extern ExpDesc SceneInvalidDesc;
extern ExpDesc CameraInvalidDesc;

/* VideoTexture: new ImageMirror class for easy mirror (and portal) creation.
 *
 * The class VideoTexture.ImageMirror() performs automatic mirror rendering.
 *
 * Constructor:
 *   VideoTexture.ImageMirror(scene, observer, mirror, material)
 *   scene:    reference to the scene that will be rendered.
 *             Both observer and mirror must be part of that scene.
 *   observer: reference to a game object used as the view point for mirror
 *             rendering: the scene will be rendered through the mirror as if
 *             the active camera was at the observer location. Usually the
 *             observer is the active camera, but you can use any game object.
 *   mirror:   reference to the mesh object holding the mirror.
 *   material: material ID of the mirror texture as returned by
 *             VideoTexture.materialID(). The mirror is formed by the polygons
 *             mapped to that material.
 *
 * There are no specific methods or attributes. ImageMirror inherits all
 * methods and attributes from ImageRender. You must refresh the parent
 * VideoTexture.Texture object regularly to update the mirror rendering.
 *
 * Guidelines on how to create a working mirror:
 * - Use a texture that is specific to the mirror so that the mirror rendering
 *   only appears on the mirror.
 * - The mirror must be planar; the algorithm works well only for planar or
 *   quasi-planar mirrors. For a spherical mirror, you will get better results
 *   with ImageRender and a camera at the center of the mirror. ImageMirror
 *   automatically computes the mirror orientation and position. The mirror
 *   doesn't need to be rectangular; it can be circular or take any form,
 *   provided it is planar.
 * - The mirror up direction must be along the Z axis in local mesh
 *   coordinates. If the mirror is not vertical, ImageMirror will compute the
 *   up direction as the projection of the Z axis onto the mirror plane.
 * - UV mapping must be set up correctly to get a correct mirror rendering:
 *   - make a planar projection of the mirror polygons (Unwrap or project from view)
 *   - if necessary, rotate the projection so that the UV up direction
 *     corresponds to the mesh Z axis
 *   - scale the projection so that the extreme points touch the border of the texture
 *   - flip the UV projection horizontally (scale -1 on the X axis). This is
 *     needed because the mirror texture is rendered from the back of the
 *     mirror and thus is reversed from the view point of the observer.
 *     A horizontal flip in the UV map restores the correct orientation.
 *
 * Besides these simple rules, the mirror rendering is completely automatic.
 * In particular, you don't need to allocate a camera for the rendering;
 * ImageMirror dynamically creates a camera for that. The reflection is
 * correct even at large angles. The mirror can be a dynamic, moving object;
 * the algorithm always computes the correct camera position based on the
 * observer's relative position. You don't have to worry about the mirror
 * position in the scene: the algorithm automatically computes the camera
 * frustum so that any object behind the mirror is not rendered.
 *
 * Warnings:
 * - observer and mirror are references to game objects. ImageMirror keeps a
 *   pointer to them but does not increment the reference count. You must
 *   ensure that these game objects are not deleted as long as you refresh()
 *   the ImageMirror object, and you must release the ImageMirror object
 *   before you delete the game objects. To release the ImageMirror object
 *   (normally stored in GameLogic), just assign it to None.
 * - Mirror rendering is automatically skipped when the observer is behind the
 *   mirror, but it is not disabled when the mirror is out of sight of the
 *   observer. You should only refresh the mirror when you know that the
 *   observer is likely to see it. For example, there is no need to refresh a
 *   car's inner mirror when the player is not in the car.
 *
 * Example:
 *
 *   contr = GameLogic.getCurrentController()
 *   # object holding the mirror
 *   mirror = contr.getOwner()
 *   scene = GameLogic.getCurrentScene()
 *   # the observer will be the active camera
 *   camera = scene.getObjectList()['OBCamera']
 *   matID = VideoTexture.materialID(mirror, 'IMmirror.png')
 *   GameLogic.mirror = VideoTexture.Texture(mirror, matID)
 *   GameLogic.mirror.source = VideoTexture.ImageMirror(scene, camera, mirror, matID)
 *   # to render the mirror, just call GameLogic.mirror.refresh(True) on each frame
 *
 * You can download a demo game (with a video file) here:
 *   http://home.scarlet.be/~tsi46445/blender/VideoTextureDemo.zip
 * For those who have already downloaded the demo, you can just update the blend file:
 *   http://home.scarlet.be/~tsi46445/blender/MirrorTextureDemo.blend
 */
extern ExpDesc ObserverInvalidDesc;
extern ExpDesc MirrorInvalidDesc;
extern ExpDesc MirrorSizeInvalidDesc;
extern ExpDesc MirrorNormalInvalidDesc;
extern ExpDesc MirrorHorizontalDesc;
extern ExpDesc MirrorTooSmallDesc;
extern ExpDesc SourceVideoEmptyDesc;
extern ExpDesc SourceVideoCreationDesc;

void registerAllExceptions(void);
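
/* registerAllExceptions() presumably calls registerDesc() on every ExpDesc
 * declared above; a plausible (hypothetical) call site is the module
 * initialization code:
 *
 *   void initVideoTextureExceptions (void)   // hypothetical init hook
 *   {
 *       registerAllExceptions();
 *   }
 */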

#endif