blender/source/gameengine/VideoTexture/CMakeLists.txt

# ***** BEGIN GPL LICENSE BLOCK *****
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
# The Original Code is Copyright (C) 2006, Blender Foundation
# All rights reserved.
#
# The Original Code is: all of this file.
#
# Contributor(s): Jacques Beaurain.
#
# ***** END GPL LICENSE BLOCK *****
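
# Build configuration for the Blender Game Engine VideoTexture module, a
# port of Zdeno Miklas' blendVideoTex plugin. The module exposes video
# playback and live capture (VideoFFmpeg), image sources and filters, and
# Texture objects to game scripts; DeckLink capture/keying support is
# optional (see WITH_GAMEENGINE_DECKLINK below).
#
# Typical runtime usage, taken from the original module notes:
#   GameLogic.video = VideoTexture.Texture(obj, matID)
#   GameLogic.vidSrc = VideoTexture.VideoFFmpeg('trailer_400p.ogg')
#   GameLogic.video.source = GameLogic.vidSrc
#   GameLogic.vidSrc.play()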
set(INC
  .
  ../BlenderRoutines
  ../Expressions
  ../GameLogic
  ../Ketsji
  ../Rasterizer
  ../Rasterizer/RAS_OpenGLRasterizer
  ../SceneGraph
  ../../blender/blenkernel
  ../../blender/blenlib
  ../../blender/editors/include
  ../../blender/gpu
  ../../blender/imbuf
  ../../blender/makesdna
  ../../blender/python
  ../../blender/python/generic
  ../../../intern/container
  ../../../intern/ffmpeg
  ../../../intern/glew-mx
  ../../../intern/guardedalloc
  ../../../intern/string
  ../../../intern/decklink
  ../../../intern/gpudirect
  ../../../intern/atomic
)
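
# Include paths whose headers this module does not own; INC_SYS entries are
# presumably passed as SYSTEM include directories by blender_add_lib, so
# compiler warnings from these headers are suppressed.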
set(INC_SYS
  ../../../intern/moto/include
  ${GLEW_INCLUDE_PATH}
)
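
# GL_DEFINITIONS presumably carries the OpenGL/GLEW preprocessor flags
# chosen by the platform configuration.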
add_definitions(${GL_DEFINITIONS})
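
# Sources and headers; the headers are presumably listed so that IDE
# project generators pick them up, the usual convention in Blender's
# CMake files.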
set(SRC
  Exception.cpp
  FilterBase.cpp
  FilterBlueScreen.cpp
  FilterColor.cpp
  FilterNormal.cpp
  FilterSource.cpp
  ImageBase.cpp
  ImageBuff.cpp
  ImageMix.cpp
  ImageRender.cpp
  ImageViewport.cpp
  PyTypeList.cpp
  Texture.cpp
  DeckLink.cpp
  VideoBase.cpp
  VideoFFmpeg.cpp
  VideoDeckLink.cpp
  blendVideoTex.cpp
  BlendType.h
  Common.h
  Exception.h
  FilterBase.h
  FilterBlueScreen.h
  FilterColor.h
  FilterNormal.h
  FilterSource.h
  ImageBase.h
  ImageBuff.h
  ImageMix.h
  ImageRender.h
  ImageViewport.h
  PyTypeList.h
  Texture.h
  DeckLink.h
  VideoBase.h
  VideoFFmpeg.h
  VideoDeckLink.h
)
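
# Optional FFmpeg support: add its include paths, define WITH_FFMPEG for
# the sources, and relax strict warning flags for the files that include
# FFmpeg headers; remove_strict_flags_file is assumed to match the file
# names listed below against this directory.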
if(WITH_CODEC_FFMPEG)
  list(APPEND INC_SYS
    ${FFMPEG_INCLUDE_DIRS}
    ${PTHREADS_INCLUDE_DIRS}
  )
  add_definitions(-DWITH_FFMPEG)
  remove_strict_flags_file(
    VideoFFmpeg.cpp
    VideoDeckLink.cpp
    DeckLink.cpp
  )
endif()
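
# A minimal configure sketch enabling both optional features handled in
# this file (the source path is hypothetical):
#   cmake -DWITH_CODEC_FFMPEG=ON -DWITH_GAMEENGINE_DECKLINK=ON /path/to/blender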
if(WITH_GAMEENGINE_DECKLINK)
  add_definitions(-DWITH_GAMEENGINE_DECKLINK)
endif()
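
# blender_add_lib is assumed to wrap add_library() and the include-path
# handling; this registers everything above as the ge_videotex library.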
blender_add_lib(ge_videotex "${SRC}" "${INC}" "${INC_SYS}")