One of our important goals is to improve open source tools so they better fit the professional gamedev industry.
I’d like to invite you to discuss your ideas here: FORUM
-One thing that would be nice would be some improvement to the logic bricks and a better message brick. The near sensor is also extremely buggy and needs to be rewritten.
-If it’s possible, real-time shadows? I doubt it would be worth it, but it’d be a nice tool to work with.
-Skinned ragdoll physics. I know there was an example done somewhere on the forums; however, it was obviously very limited.
-I noticed that the IPO actuator needs some work; for instance, the character does not stop moving until the animation is done. It just looks a little “off”.
-One added brick or feature would be a velocity collision controller. For example, if an object hits the ground at a certain speed, it blows up.
These ideas are probably all fantasies, and they all try to keep the least trained users in mind. Thank you for listening.
nick,
I think you misunderstood. This is not a call for wishes of the internal game engine, but instead for wishes related to creating content and exporting it to other game engines.
LetterRip
nick – every idea is a good idea and worth noting, but yes – in Apricot we don’t use the internal Blender game engine. This is preparation for working with external engines, especially Crystal Space, but some ideas can apply to both (like that ragdoll).
=Extend the Python scripting so that game developers can add ‘game object’ types and additional menus – ‘export_map()’, ‘compile_map()’ – all specified in a single script file, similar to Half-Life FGD scripts (a rough sketch of what such a script could look like follows this list).
=Improve Verse so that level designers can build maps in real time.
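To make the first request more concrete, here is a rough sketch of what such a single-file game definition script could look like. Everything in it – the entity table, the menu hooks, the function names – is hypothetical; none of this is an existing Blender API, it only illustrates the kind of interface being asked for.

```python
# Hypothetical single-file "game definition" script, in the spirit of a
# Half-Life FGD file: declare the custom game object types a game needs
# and the extra menu entries the tool should expose. These hooks do not
# exist in Blender today; the names are purely illustrative.

GAME_OBJECTS = {
    'trigger_volume': {'target': ('STRING', ''), 'once': ('BOOL', True)},
    'spawn_point':    {'team':   ('INT', 0)},
}

MENU_ENTRIES = {
    'Export Map':  'export_map',
    'Compile Map': 'compile_map',
}

def export_map():
    print 'exporting map...'

def compile_map():
    print 'compiling map...'
```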
I have a few wishes. One, I’d like the FBX exporter script to work on the new MacBook Pros. (In 2.45, pressing export doesn’t work; I have to manually export from the command line.)
Secondly, I’d like exporter defaults that get saved with Control+U. I always use the same settings for OBJ and FBX exports, so why the extra settings window every time?
Lastly, I’d like a global exporter scale. I always have to scale everything from Blender up by 100 in Cinema 4D or Unity.
Well then, I guess it was too general for me. Sorry, on to the important stuff.
Import/export fixes.
Universal UVs.
Open source programs need to be extended into Java and Flash, or be compatible with HTML. I’m pretty sure that would allow the creation of online games and would work better than a plugin. Alternatively, open source programs could run on the online Unity plugin, because that plugin actually works pretty well – but that is probably asking too much. I would stick with Java and Flash, since that is obviously much more practical and actually possible.
One of the things I noticed while creating game content in Blender is that the Set Smooth/Set Solid controls apply to the entire model. Back when I was using Maya, you could select certain edges/faces and make some of them soft and the rest hard. Such a feature in Blender would let users create polygon objects that mix soft areas and hard edges.
I think Crystal Space should have a map editor that directly uses the Crystal Space engine; that way level designers would have a WYSIWYG editor. I have used the Blender2Crystal script (and it’s good stuff), but I had a hard time getting what I saw in CS to look like what I created in Blender (most of that could have been my newbieness). With a CS world builder, the update-and-test cycle would be fast, due to the fact that I would not have to export and load my level to see my changes. And of course CEL could be plugged into it. So I guess what I am saying is that it might be a good idea to port the Blender2Crystal script to a pure Crystal Space application.
In regards to the map editor, I personally think it’s a good idea to make Blender more map-editor friendly. That way it would also benefit the users who create games inside of Blender itself, and less coding would be required on your part.
You could make it so that geometry or empties (for 3D sound positioning?) can link to Python scripts. If any of this stuff already exists, then let me know. I would think that features like these would make Blender more game-map-making friendly.
And as for making it easier to edit maps in Blender, you could add a camera control that pans the camera across the landscape (a feel a bit similar to moving the camera in an FPS game, except that the camera would focus on a certain point instead). This would make large-scale world editing easier for users. And I can understand about the content having two different looks: you just have to make sure that Blender and CS read UV maps the same way, and ensure that the geometry isn’t altered after export. Overall, I think some development effort could go towards making sure that content made in Blender looks the same inside the engine. Even if you decide to make a full-blown WYSIWYG editor, it would be a good idea to ensure that meshes come out the same way.
Not that it can’t be done, but it is difficult to write a WYSIWYG editor from scratch for any engine. WYSIWYG editors are generally made for a particular genre of game. For an engine like Crystal Space, which can be used for a vast variety of games, writing a WYSIWYG editor will be very difficult indeed. Having an editor like Blender is probably much more intuitive. Being “open”, it could then be tailored for any game.
As for the rest of my comments and suggestions, they are under the handle _neutrin0_ here
http://www.blender.org/forum/viewtopic.php?p=65083#65083
This is indeed a nice initiative from you guys.
I’d love to have materials that behave the same way in the GE as in the render.
What’s the point then if this work is not going into the Blender GE? Relying on working builds of CS, or having to build your own?
Eh, just a little confused is all. In Mac land we have
http://unity3d.com/
which for some reason imports Blender files really well, and for all intents and purposes outperforms any need for the Blender GE. But Blender also still lacks GLSL shading… What we really need first is some sort of real-time shading before we export to a different engine.
spaceseel,
see the Edge Split modifier – that is how you set hard edges in Blender.
trip – Blender has GLSL shading for the GE; it is done via Python.
LetterRip
– Extending VERSE for more robust use in a game development environment.
Spaceseel, you can use Set Smooth in Edit Mode to smooth only certain edges. I do it all the time, e.g. make an aircraft wing round on the front but knife-sharp at the back.
A standard for UV mapping and animation cycles.
Nowadays it is very difficult to export models and their animation cycles to other open source engines. Everything appears flipped or stretched.
Really cross-platform executables to demonstrate character development and controls (a viewer able to load animated UV-mapped meshes, with their animations and sounds, the same way under Linux, Win and Mac).
A stencil shadow “prototyping tool” for architecture development (like the one from Google SketchUp; in other words, it could be fake).
I wrote an API to integrate external renderers into Blender. This project is known as BEEI (http://projects.blender.org/projects/beei/).
At the moment I have no time to test and integrate other game engines like Irrlicht, Ogre, etc. There is only a simple test engine from me, written in Free Pascal. Through BEEI you can load dll/so files in Blender, and the API lets them execute Python scripts, create GUIs, and gives you redraw and event callbacks. The advantage of dll/so files is that you can directly access the Blender data.
I have to agree with tripdragon. What’s the point if the blender GE doesn’t benefit? It’s a fantastic piece of work, suitable for many different genres of games (and non-games). Seriously, there’s nothing else like it, and I for one would love to see it grow and evolve.
nomenclature: the point is to test Blender as a program suitable for use with external game engines. That’s important too, because not everyone is making games using the built-in GE. Blender has to be good for use as a game modeller in combination with external engines, and that’s one of the goals of this project.
Greetings,
I’d like to try the Blender–Crystal Space toolchain. What’s missing for me as a Blender user are some step-by-step “hello world” type tutorials for Blender–Crystal Space integration. As a Crystal Space newbie, I went to the Crystal Space web site and quickly became confused: there is Crystal Space and Crystal Core and CEL and… I understand that for a quick standalone game engine I should use CEL or CELstart? Are there binaries / install packages for these on OS X? …
Jonathan: unfortunately not, due to a lack of MacOS/X programmers on our team.
It would be really great if Blender had light-mapping capabilities.
indrusty? lol… Anyhow, real-time IK/bones preview in the GE for demo animations, for export to whatever. It’s hard to know what is needed without knowing how Blender exports to Crystal Space.
Some general things, like an easy way to demo GLSL shaders on your objects and then export that info with your model, etc.
An easy way of transferring UVs from a high-poly to a low-poly object.
Better painting tools, i.e. proper undo, being able to see the paint brush, painting on multiple axes, mirror painting, etc.
An editor that lets you manage multiple light bakes, etc. A light manager?
Easier ways of setting edge smoothness/hardness – it must be fast and interactive.
The main point about any game tools developed is that they must be fast and allow batch editing workflows. And when I say fast I mean lightning fast – the amount of time you have to spend modelling is very short, so robust and fast must be a priority.
Being able to model using QWERTY keys – a faster interface. Or at least being able to set custom hotkeys.
Mirror editing. Um, that’s all I can think of right now.
Ooh – viewing the UVs of multiple selected objects in the UV editor!
Because we develop 3D web games using Director/Shockwave, I miss a Blender exporter for the W3D file format. W3D is the interchange format used by Director for importing 3D data from external applications. It supports geometry, shaders, textures, motion, bones, cameras and lights.
… A “target connection” is a fantastic thing to have…
RenderWare Studio used to do it, as do several of EA’s in-house tools, and some of Sony’s tools too…
i.e., live edits to data in Blender change the content in an external game engine or viewer in real time.
Not always possible these days with the increasing use of XML as an intermediate format, i.e. authoring tool (Blender) exports to → XML, which is exported to → game engine (Crystal Space in this case).
TAK2004’s idea of running external files in a window is great; live update through a target connection would make it perfect!
Kirado is bang on the money about batch workflows… I haven’t investigated what’s doable through Python yet, though, so it may not be too hard…
Again, on the authoring → XML → target chain, being able to batch all of this is a massive boost…
Exporting a world should be one click in Blender (background export to XML, background compiling/conditioning, then launching the target game engine…).
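As a rough illustration of that one-click flow, the sketch below (run from inside Blender 2.4x) strings the steps together. The exporter module, the ‘levelcompiler’ command and the ‘engine_viewer’ binary are made-up names standing in for whatever a real toolchain provides, not existing tools.

```python
# Rough sketch of a "one click" export pipeline script, run from inside
# Blender: export the scene to XML with a hypothetical exporter module,
# condition it, then launch the target engine.
import subprocess
import Blender

def one_click_export(xml_path='/tmp/level.xml', bin_path='/tmp/level.bin'):
    # hypothetical exporter module shipped with the toolchain
    import my_xml_exporter
    my_xml_exporter.export_scene(Blender.Scene.GetCurrent(), xml_path)

    # background compile/condition step (hypothetical command line tool)
    subprocess.call(['levelcompiler', xml_path, '-o', bin_path])

    # launch the target engine viewer on the result
    subprocess.Popen(['engine_viewer', bin_path])

one_click_export()
```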
Lamberto,
you can export to XSI, then use the XSI-to-W3D converter that is open source at SourceForge.
LetterRip
In seven years of using Blender in the industry, lots of its features have addressed major issues in the toolchain. Two things stand out for me.
1. Meta data. Real game data has bags of metadata attached (from sound effects for character movement on materials, to AI empties, physics properties [those designed for Bullet don’t cut it], and so on and so on). Blender needs a concrete mechanism for specifying and accessing metadata on almost anything.
2. Integrate scripting into the core UI. Having to bugger about with a separate script-specific UI is a usability nightmare for content creators; I’ve had to battle serious antagonism from content creators on this. Allowing scripts to be completely integrated with the UI (e.g. new object types in the create menu, new overlay popups, keypress binding) would help workflow no end and dramatically reduce training/cross-training costs.
Live on-target content viewing is possible with current Blender – we have it running on multiple target platforms over the network (create the Python to do it: set up a socket in the engine and send new data to it – that’s how RW, Havok, and everyone else do it).
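For anyone curious, a minimal sketch of that socket approach from the Blender 2.4x side might look like the following; the port, the text protocol, and the engine-side receiver are assumptions for illustration, not part of any shipped tool.

```python
# Minimal sketch of a live "target connection" sender in Blender 2.4x,
# assuming the engine listens on TCP port 9000 and accepts a simple
# text protocol (object name followed by vertex positions).
import socket
import Blender

def push_selected(host='localhost', port=9000):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, port))
    for ob in Blender.Object.GetSelected():
        if ob.getType() != 'Mesh':
            continue
        me = ob.getData(mesh=True)
        lines = ['object %s' % ob.name]
        for v in me.verts:
            lines.append('v %f %f %f' % (v.co.x, v.co.y, v.co.z))
        s.send('\n'.join(lines) + '\n')
    s.close()

push_selected()
```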
For hobbyists I can see that exporter quality etc. is important, but for serious users, don’t worry so much about the tools – it is reasonably easy to create tools, exporters, new entity types, scene graph compilers, etc. (at least compared to creating a full game). I’d love to see the infrastructure supporting the toolchain as a whole improved, so making game content in Blender isn’t so disjointed and hacky.
One of the best features of Blender is object properties. On every Blender object you can set properties like “name” with type string and value “Hans”, or DirX, DirY, DirZ with type float and values 1.0, 0.0, 0.0. If a programmer could define structures (name, lvl, items, …) and a designer could use a button to attach such a structure and edit the properties of the selected object, it would be a powerful tool. At the moment this is only possible with self-written Python scripts: one for the attach button and one for a custom UI form. You also need a third Python script to export the properties, or you have to add this to your own customized exporter. I use it for waypoints and entities.
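A rough sketch of the kind of “attach structure” script described here, using the Blender 2.4x game-property API; the WAYPOINT structure and its fields are made up for illustration.

```python
# Attach a predefined "entity structure" as game properties to the
# selected objects in Blender 2.4x. The structure definition below is
# purely illustrative; a real tool would load these from a game-side file.
import Blender

WAYPOINT = [('name',   'wp',  'STRING'),
            ('radius', 1.0,   'FLOAT'),
            ('next',   '',    'STRING')]

def attach(structure):
    for ob in Blender.Object.GetSelected():
        existing = [p.name for p in ob.getAllProperties()]
        for name, value, ptype in structure:
            if name not in existing:
                ob.addProperty(name, value, ptype)
    Blender.Redraw()

attach(WAYPOINT)
```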
I would like access to more of the Blender “interface” through Python. When we export our models we would like to do some texture baking, optimization, etc. Some of these steps cannot be done from Python, hence we cannot make a fully automated export script for our models.
I don’t know if this is possible, nor if it’s really useful, but in my opinion it would be a nice feature:
Allow a dynamic link between Blender-made resources and Crystal Space objects. I mean, when you create content in Blender and use it in CS, modifications done later in Blender are reflected in real time in CS.
Of course, it would save time, making the export/modify/re-export/re-modify/re-re-export cycle unnecessary.
Mindmapping module…
One thing I haven’t seen mentioned before is improvement of the polygon/triangle counter.
When making 3D models for real-time use, you of course have to pay attention to how many polygons you use. Right now you have to convert the mesh to triangles, and also apply the mirror modifier if one is used, to be able to see the final/real triangle count of the model.
What I’m proposing here is an option to see the model’s triangle count without having to apply various modifiers or convert the polygons to triangles.
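As a stopgap, something like the following Blender 2.4x script can approximate that count without applying anything: quads are counted as two triangles, and a Mirror modifier is handled by crude doubling (it ignores other modifiers and any merging at the mirror plane).

```python
# Rough sketch: estimate the "real" triangle count of the first selected
# object in Blender 2.4x without applying modifiers or triangulating.
import Blender
from Blender import Modifier

ob = Blender.Object.GetSelected()[0]
me = ob.getData(mesh=True)

# a quad renders as two triangles
tris = sum(2 if len(f.verts) == 4 else 1 for f in me.faces)

for mod in ob.modifiers:
    if mod.type == Modifier.Types.MIRROR:
        tris *= 2  # crude: assumes nothing is merged at the mirror plane

print 'approximate triangle count:', tris
```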
Better (free) static code analysis tools are something that’s needed to help produce games of better quality. There are commercial tools like CodeSonar and Coverity Prevent, but we really need high-quality free tools to fill this niche.
Fixing the logic bricks is a major issue. At least partially allow them to be used with external engines. The current logic brick system is very poor; there have been many replacement designs that were never developed. A working replacement that is game-engine neutral is critical for rapid prototyping, to demo a game to funding parties. Saying it’s part of the core Blender game engine is wrong – somehow it has to be made neutral and improved.
Ian,
1. Meta data. – Can’t you do this via properties? They were specifically designed for that sort of thing, I believe.
2. Integrate scripting into core UI. – Coming with the tool API refactor, I think. You can already add objects to the main object menu, though.
LetterRip
If it’s gonna be an example game, then perhaps add examples of:
– The type of blur you see when looking through steam or fire.
– Meshes with layers of glow around them. I really wonder how that’s done.
It’s time to post some updates.. something good, if ya ever wanna make those 1000. :/
I paid for my copy of the preorder last week, but I’m with Paul and looking forward to the updates!
Hello! Happy New Year!
A working Collada exporter that does more than static meshes – animations, bounding boxes and physics, you know, like other DCC tools can.
I agree – an update to the blog would be great.
Full GLSL support in the render window, including parallax attenuation if possible.
Also, and probably more important, work in the texture paint department: namely layers for starters, perhaps cut and paste, and move/scale… and if I were to dream, normal map painting. :)
For Jorrit: that 40-bone limit is a monster of a problem for anything more than a basic character. Also, a way to use low-poly meshes in place of higher-poly meshes when they are farther away, perhaps by paging them in and out of memory.
Oops, better explanation of that last bit… :)
If you are using .dds files, which I know CS can use, then each file has all the mipmapped textures pre-generated. If one were to map to these lower-resolution textures within the .dds file, then touch it up and repack the file, you should only be pulling the factory for the object into memory. Texture space remains the same. I know I can’t do it, but it seems like it’s possible.
Better plugins for procedural content creation. MakeHuman, for example, is going in the right direction, but it would be great to have hair and an easier way of animating the models.
Premade animation armatures, ragdoll templates, the ability to export to DirectX.
Things I’ve got in mind:
-Easy sound implementation. Add the possibility to set the spatial position (free or attached to an object…) and other parameters (volume, speed, loop, etc.) of SFX and music in the 3D editor.
-To be able to use the VSE (or another sequencer) to arrange sounds in cutscenes.
-Be able to set collision boxes in the 3D editor.
-Intuitive multi-scene management. To be able to organize and sort scenes, go back and forth between them, and easily search for any game object, asset or script and re-use it with a simple drag and drop.
-To be able to use the node editor to achieve real-time shaders.
-Add built-in shaders and FX (e.g. glow, ambient occlusion, full-screen post-processing) to drag and drop.
-Create contextual help, i.e. a box that gives info and help about anything on the screen by simply mousing over it (à la Ableton Live).
-Detailed docs, with tutorials & samples.
-To be able to easily generate executables for each major OS (Win, Mac, Lin), and be able to publish on the web.
-Tools for font editing and integration.
Hi all!
First, a lot of thanks for all the feedback – it is incredibly valuable. I made a list and am now trying to merge similar requests and classify them, so expect some kind of list soon.
I hope it will help to get a picture of all the stuff that would make Blender the perfect game editor :).
Pablo from the Apricot team
I would suggest looking at other 3D tools – game interfaces, game engines and game editors – to see where we can improve each. The whole idea should be to make the new features as organized and effective as possible as they relate to game development. My team does force-on-force simulation modeling in Blender, using “actors” with autonomous behaviors and actions; these are used to test various tactical strategies. I think this is classified as what is known as “serious” game development.
Hopefully game editors will be available for non-techies, though I expect the results will not be a fraction of the excitement of the pro products.