blender/source/gameengine/Ketsji/KX_TouchEventManager.cpp

/**
* $Id$
* ***** BEGIN GPL LICENSE BLOCK *****
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software Foundation,
* Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
*
* The Original Code is Copyright (C) 2001-2002 by NaN Holding BV.
* All rights reserved.
*
* The Original Code is: all of this file.
*
* Contributor(s): none yet.
*
* ***** END GPL LICENSE BLOCK *****
*/
#include "KX_TouchEventManager.h"
#include "SCA_ISensor.h"
#include "KX_TouchSensor.h"
#include "KX_GameObject.h"
#include "PHY_IPhysicsEnvironment.h"
#include "PHY_IPhysicsController.h"
#ifdef HAVE_CONFIG_H
#include <config.h>
#endif
KX_TouchEventManager::KX_TouchEventManager(class SCA_LogicManager* logicmgr,
	PHY_IPhysicsEnvironment* physEnv)
	: SCA_EventManager(TOUCH_EVENTMGR),
	  m_logicmgr(logicmgr),
	  m_physEnv(physEnv)
{
	// not used: m_scene->addTouchCallback(STATIC_RESPONSE, KX_TouchEventManager::collisionResponse, this);
	//m_scene->addTouchCallback(OBJECT_RESPONSE, KX_TouchEventManager::collisionResponse, this);
	//m_scene->addTouchCallback(SENSOR_RESPONSE, KX_TouchEventManager::collisionResponse, this);
	m_physEnv->addTouchCallback(PHY_OBJECT_RESPONSE, KX_TouchEventManager::newCollisionResponse, this);
	m_physEnv->addTouchCallback(PHY_SENSOR_RESPONSE, KX_TouchEventManager::newCollisionResponse, this);
	m_physEnv->addTouchCallback(PHY_BROADPH_RESPONSE, KX_TouchEventManager::newBroadphaseResponse, this);
}
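
// Record a collision reported by the physics engine: the colliding controller pair is
// queued in m_newCollisions and dispatched to the touch sensors of both objects in
// NextFrame().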
bool KX_TouchEventManager::NewHandleCollision(void* object1, void* object2, const PHY_CollData *coll_data)
{
	PHY_IPhysicsController* obj1 = static_cast<PHY_IPhysicsController*>(object1);
	PHY_IPhysicsController* obj2 = static_cast<PHY_IPhysicsController*>(object2);

	m_newCollisions.insert(std::pair<PHY_IPhysicsController*, PHY_IPhysicsController*>(obj1, obj2));

	return false;
}
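
// Static callback registered with the physics environment for PHY_OBJECT_RESPONSE and
// PHY_SENSOR_RESPONSE; client_data is the manager instance, which queues the collision
// via NewHandleCollision().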
bool KX_TouchEventManager::newCollisionResponse(void *client_data,
						void *object1,
						void *object2,
						const PHY_CollData *coll_data)
{
	KX_TouchEventManager *touchmgr = (KX_TouchEventManager *) client_data;
	touchmgr->NewHandleCollision(object1, object2, coll_data);
	return false;
}
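
// Static callback registered for PHY_BROADPH_RESPONSE: asks the touch sensors attached
// to the controller's client object whether the broadphase overlap between object1 and
// object2 is of interest; the result feeds the broadphase filter that decides whether
// collision points are computed for the pair.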
bool KX_TouchEventManager::newBroadphaseResponse(void *client_data,
						void *object1,
						void *object2,
						const PHY_CollData *coll_data)
{
	PHY_IPhysicsController* ctrl = static_cast<PHY_IPhysicsController*>(object1);
	KX_ClientObjectInfo* info = (ctrl) ? static_cast<KX_ClientObjectInfo*>(ctrl->getNewClientInfo()) : NULL;
	// This callback should only be called for the controllers of Near and Radar sensors
	if (!info)
		return true;

	switch (info->m_type)
	{
	case KX_ClientObjectInfo::SENSOR:
		if (info->m_sensors.size() == 1)
		{
			// only one sensor for this type of object
			KX_TouchSensor* touchsensor = static_cast<KX_TouchSensor*>(*info->m_sensors.begin());
			return touchsensor->BroadPhaseFilterCollision(object1, object2);
		}
		break;
	case KX_ClientObjectInfo::OBSENSOR:
	case KX_ClientObjectInfo::OBACTORSENSOR:
		// this object may have multiple collision sensors,
		// check if any of them is interested in this object
		for (std::list<SCA_ISensor*>::iterator it = info->m_sensors.begin();
		     it != info->m_sensors.end();
		     ++it)
		{
			if ((*it)->GetSensorType() == SCA_ISensor::ST_TOUCH)
			{
				KX_TouchSensor* touchsensor = static_cast<KX_TouchSensor*>(*it);
				if (touchsensor->BroadPhaseSensorFilterCollision(object1, object2))
					return true;
			}
		}
		return false;
	}
	return true;
}
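
// Add a touch sensor to the manager's list; RegisterSumo() is only called when the
// sensor was effectively inserted (AddBack() reports whether the insertion happened).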
void KX_TouchEventManager::RegisterSensor(SCA_ISensor* sensor)
{
	KX_TouchSensor* touchsensor = static_cast<KX_TouchSensor*>(sensor);
	if (m_sensors.AddBack(touchsensor))
		// the sensor was effectively inserted, register it
		touchsensor->RegisterSumo(this);
}
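
// Counterpart of RegisterSensor(): unlink the sensor from the list and, if it was
// effectively removed, unregister it with UnregisterSumo().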
void KX_TouchEventManager::RemoveSensor(SCA_ISensor* sensor)
{
	KX_TouchSensor* touchsensor = static_cast<KX_TouchSensor*>(sensor);
	if (touchsensor->Delink())
		// the sensor was effectively removed, unregister it
		touchsensor->UnregisterSumo(this);
}
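
// Forward the end-of-frame notification to every registered touch sensor.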
void KX_TouchEventManager::EndFrame()
{
	SG_DList::iterator<KX_TouchSensor> it(m_sensors);
	for (it.begin(); !it.end(); ++it)
	{
		(*it)->EndFrame();
	}
}
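
// Per-frame update: synchronize the sensors' transforms, dispatch the collisions queued
// by NewHandleCollision() to the touch sensors on both colliding controllers, then
// activate the sensors for the logic manager.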
void KX_TouchEventManager::NextFrame()
{
	if (!m_sensors.Empty())
	{
		SG_DList::iterator<KX_TouchSensor> it(m_sensors);
		for (it.begin(); !it.end(); ++it)
			(*it)->SynchronizeTransform();
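		// Dispatch each queued collision to the touch sensors of both colliding objects;
		// NewHandleCollision() is called symmetrically with (first, second) and (second, first).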
		for (std::set<NewCollision>::iterator cit = m_newCollisions.begin(); cit != m_newCollisions.end(); ++cit)
		{
			PHY_IPhysicsController* ctrl1 = (*cit).first;
			// PHY_IPhysicsController* ctrl2 = (*cit).second;
			// KX_GameObject* gameOb1 = ctrl1->getClientInfo();

			KX_ClientObjectInfo *client_info = static_cast<KX_ClientObjectInfo *>(ctrl1->getNewClientInfo());
			std::list<SCA_ISensor*>::iterator sit;
			if (client_info) {
				for (sit = client_info->m_sensors.begin(); sit != client_info->m_sensors.end(); ++sit) {
					static_cast<KX_TouchSensor*>(*sit)->NewHandleCollision((*cit).first, (*cit).second, NULL);
				}
			}

			client_info = static_cast<KX_ClientObjectInfo *>((*cit).second->getNewClientInfo());
			if (client_info) {
				for (sit = client_info->m_sensors.begin(); sit != client_info->m_sensors.end(); ++sit) {
					static_cast<KX_TouchSensor*>(*sit)->NewHandleCollision((*cit).second, (*cit).first, NULL);
				}
			}
		}

		m_newCollisions.clear();
		for (it.begin(); !it.end(); ++it)
			(*it)->Activate(m_logicmgr);
	}
}