View Full Version : ATTN: Aspiring game developers / hobby modders



Kornman00
November 13th, 2009, 02:09 PM
Back story: I've been researching and toying with the upcoming .NET 4.0 and the new technologies it brings with it (both in .NET itself and in overall development). One of the things that caught my interest is the new multi-touch development support.

My question: For those of you who want to become (or already are) game developers, or who are just hobby modders: if you got to see such technology implemented into your tool pipeline, where exactly would you want it and/or what features would you like it to boast? How would you want it to behave?

Please try to separate your input into two categories: needs and wants. Needs are things which are basic to your usage, i.e. 1.0 features. Wants are things which are nifty but aren't exactly needed to get things done. For example:

I need the game editor to be able to turn rendering off/on for different object types or object definitions, thus giving visual input for comparison with the basic underlying world.

I want the game editor to further allow me to assign color shading to different object types or definitions instead of just not rendering them at all, thus giving enhanced visual input for comparison with other object types and the underlying world.

Even though most implementations of multi-touch tech in the game development process should live in the 3rd party tools (it's more productive to do things in, say, 3dsmax than to export, with extra steps, to your engine and then do whatever it is you're trying to do), sometimes those tools won't be available (or there's a specific engine interaction you want), so it's up to the tool monkeys to churn out a wrench for the artists to whack their visuals with.


Input doesn't have to be specific to any game engine or 3rd party tool, but I do ask that you provide reference material for whatever you're trying to explain. So if it's Unreal related and you're talking about some proprietary tool in the SDK, give a visual or a related article reference (link) so those of us who aren't versed in it can try to understand.

Also, if you can think of any other external input device that isn't typical computer input (e.g., a digital drawing pad) that could bind with a multi-touch environment (or even just work by itself), feel free to chime in with thoughts on it too.

Dwood
November 13th, 2009, 02:15 PM
This may not be multitouch, however:
(want?)
I have a Wacom tablet, and it would be really awesome to be able to take advantage of its pressure sensitivity in the tools surrounding game development.

Ex: if you're using a game's lightmap editor of some sort, being able to make the lightmap lighter or darker via pressure sensitivity while editing would be so awesome.
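Something like this is what I'm picturing, as a very rough sketch only: it assumes .NET/WPF stylus events and that the tablet shows up as a stylus device, and LightmapCanvas / PaintTexel are just made-up names standing in for whatever a real lightmap editor would expose.

// Hypothetical sketch: mapping pen pressure to a lightmap lighten/darken brush
// using WPF's stylus events. Names here are invented for illustration.
using System.Windows;
using System.Windows.Controls;
using System.Windows.Input;

public class LightmapCanvas : Canvas
{
    // Picked by a modifier key or toolbar toggle in a real editor.
    public bool LightenMode { get; set; }

    protected override void OnStylusMove(StylusEventArgs e)
    {
        base.OnStylusMove(e);

        foreach (StylusPoint p in e.GetStylusPoints(this))
        {
            // PressureFactor is 0.0..1.0 on pressure-capable tablets.
            float strength = p.PressureFactor * (LightenMode ? +1f : -1f);
            PaintTexel(p.ToPoint(), strength);
        }
    }

    // Stub: a real editor would offset the luminance of the lightmap texel
    // under 'where' by 'strength' times some brush hardness.
    private void PaintTexel(Point where, float strength)
    {
    }
}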

Kornman00
November 13th, 2009, 02:47 PM
Yeah, like I said, the input you provide doesn't have to be limited to multi-touch; it's open to any external input device. Just waiting until we have interfaces which employ all four limbs or, even better, our brain. Of course, if a device is too proprietary, with no open standards or even a specialized SDK, then it isn't cost effective to craft a method of implementing it into the pipeline or artist environment.

With your idea, I could see something where you have a visual grid which encases the lightmap view (with the main view acting as the touch device, or possibly an external touch device if the main view source doesn't support it), then another grid for sensitivity input. That would allow me to use my left hand (or right hand) to move the lightmap cursor around and my right hand (or left) to input the pressure data for that selected region.

Splitting the interface up then allows you to define input more specifically. I.e., the visual grid could allow more than one touch, so you could use two fingers to expand/collapse the lightmap brush size (using the same sliding gesture the iPod touch uses for zooming). Then with the pressure grid you could use a single finger's pressure to tweak the lighter range, or use two fingers to signal that you want to tweak the darker range.

When I say "grid" I'm just talking visually, almost like how Pane gui-controls (http://en.wikipedia.org/wiki/Paned_window) are used in today's typical GUI setup to separate concepts. Either the UI of the application has these "grids" split up on the MT-monitor, or on the external device which acts as the MT-input.

Dwood
November 13th, 2009, 03:25 PM
That's a bit more than I was saying, even though I do like where you're going with it.

My tablet has some really sensitive pressure sensors... The harder I press down, the darker the lightmapping would get. No one's made a specialized game app that does lightmaps that way... :|

The touch device I have has no screen, and the ones that do (as far as I'm aware) are honkin' expensive.

Kornman00
November 13th, 2009, 04:23 PM
Right, I was just trying to throw some bread crumbs out there for other people in case they wanted to comment or add to it.

Do you have a link to an article describing the specific device you have? Wiki, sales review, device summary, etc.

Dwood
November 13th, 2009, 04:29 PM
Yeah, I'll grab some articles and chuck 'em at you over AIM.

For those interested, it's an Intuos3 tablet.

Limited
November 13th, 2009, 07:48 PM
Here's what I'm interested in: a system that detects multiple touches from different users at the same time (fingerprint recognition to differentiate?), allowing multiple users to play a game with each other on a single "screen" or whatever it is. This opens up massive windows in terms of interaction between players; perhaps a big RTS game where they individually control squads, micro-managing them through a game.

The future of gaming is definitely about interaction between groups of people: Natal, Wii, etc.

Kornman00
November 13th, 2009, 08:58 PM
The problem with fingerprints is that they smudge pretty easily. You'd be better off using some kind of new glove controller which would be interactive not only via touch but also through gestures made in the air (using accelerometers, I think they're called? The iPod touch has them).

However, I'm more interested in how such tech could enhance the tool pipeline. Also, I think more engines should target multiple monitors and multiple input devices in general for multiplayer on a single machine. Hell, even in single player games (or when playing in a single environment), additional monitors could open up better game views and let players track things more easily. No longer would you have to limit, say, GTA to that small circle of a map view; you could have an entire screen dedicated to it.
You should even be able to map different inputs to each screen. So for example, I could have a gamepad driving my normal game input, then have all my keyboard and mouse events register on the secondary monitor, which could very well just be dedicated to lush UI, keeping the main screen very dry.
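If I were to sketch that routing idea in code, it'd be nothing engine-specific, just the rough shape of a layer that sends each device's events to a particular screen. InputSource, IScreenView and the bindings below are all names I'm making up for illustration.

// Hypothetical per-screen input routing: gamepad -> main game view,
// keyboard/mouse -> a second monitor dedicated to the map/UI.
using System.Collections.Generic;

public enum InputSource { Gamepad, Keyboard, Mouse, Touch }

public interface IScreenView
{
    void HandleInput(InputSource source, object eventData);
}

public class InputRouter
{
    private readonly Dictionary<InputSource, IScreenView> _routes =
        new Dictionary<InputSource, IScreenView>();

    public void Bind(InputSource source, IScreenView view)
    {
        _routes[source] = view;
    }

    public void Dispatch(InputSource source, object eventData)
    {
        IScreenView view;
        if (_routes.TryGetValue(source, out view))
            view.HandleInput(source, eventData);
    }
}

// Usage (hypothetical):
//   router.Bind(InputSource.Gamepad,  mainGameView);
//   router.Bind(InputSource.Keyboard, mapScreenView);
//   router.Bind(InputSource.Mouse,    mapScreenView);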

However, if these big-brand game publishers don't man up and embrace change, you won't see such innovations anytime soon in big-name games. Especially when they want to make the goddamn games so much like their console counterparts. I'm fucking sick and tired of quick time events (pressing X in sync with Y to do the super-awesome-kill-you-and-your-mum move, finishing the game) or whatever that shit is called. A lazy man's invention if you ask me (though it's been around for freaking years of course, you just haven't really seen big brands use them so much until now).

Con
November 13th, 2009, 09:11 PM
The harder I press down, the darker the lightmapping would get. No one's made a specialized game app that does lightmaps that way... :|

I don't think it would be done that way. Pressure should control opacity or diameter. There's more to lightmaps than just light/dark; pressure can't control colour at all. It's more intuitive if it works like Photoshop. Also, I don't see the need to paint on lightmaps. Lighting is very exact and mathematical, and tricky to get right, and screwing around with it by hand either makes it less realistic or is a waste of time.
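Roughly what I mean, as a throwaway sketch only: pressure is assumed to be normalized 0..1 (like WPF's StylusPoint.PressureFactor), and the mappings/names are arbitrary.

// Photoshop-style pressure mapping: pressure drives opacity and diameter,
// not the light/dark value itself.
public static class PressureBrush
{
    public static float Opacity(float pressure, float maxOpacity)
    {
        return maxOpacity * pressure;                  // light touch = faint stroke
    }

    public static float Diameter(float pressure, float minDiameter, float maxDiameter)
    {
        return minDiameter + (maxDiameter - minDiameter) * pressure;   // press harder = wider brush
    }
}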

EDIT: I'd like to see more apps take advantage of pen tilt. Photoshop CS5 is doing some amazing things with it and some fancy new dynamic brushes.

DEElekgolo
November 13th, 2009, 09:16 PM
Or you could just bake them. And support for vertex-based lighting would be nice.

neuro
November 14th, 2009, 05:12 AM
vertex lighting is one of the most, if not THE most, basic lighting setups. what you bake into them (for example lightmaps) is completely up to the user.

need

a definite need i'd want in any editor is the ability to ctrl-Z (undo) and ctrl-Y (redo) and do copy-paste actions.

for example, i build an asset made up out of several intersecting meshes. i want to be able to, say, group them and copy-paste them wherever i want, without having to reassemble the entire thing for every instance of it i'd like to place.
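rough sketch of the shape of that as a command stack (none of it is tied to a real editor, the names are made up):

// Undo/redo as a plain command history; group copy-paste would just be
// another IEditorCommand pushed through Execute().
using System.Collections.Generic;

public interface IEditorCommand
{
    void Do();
    void Undo();
}

public class CommandHistory
{
    private readonly Stack<IEditorCommand> _undo = new Stack<IEditorCommand>();
    private readonly Stack<IEditorCommand> _redo = new Stack<IEditorCommand>();

    public void Execute(IEditorCommand cmd)
    {
        cmd.Do();
        _undo.Push(cmd);
        _redo.Clear();          // a new edit invalidates the redo chain
    }

    public void Undo() { if (_undo.Count > 0) { var c = _undo.Pop(); c.Undo(); _redo.Push(c); } }
    public void Redo() { if (_redo.Count > 0) { var c = _redo.Pop(); c.Do();   _undo.Push(c); } }
}

// e.g. history.Execute(new PasteGroupCommand(selectedMeshes, targetPosition));
// (PasteGroupCommand being whatever the editor defines for grouped assets)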

another basic need would be the ability to do easy selections. whenever you're working with a more final piece, your environment tends to get cluttered and has 20 things lying on top of each other in a viewport at any time.

what i'd like to be able to do is select a TYPE of object (or select 2 different types of objects) and hit a button (or menu option) to hide everything which isn't one of those types.
i'm guessing viewport flexibility would be the main concern in my opinion. often in editors everything runs uncompressed and goes slower, so the ability to, for example, turn off realtime lighting or post processing, or force a certain LOD set in a viewport, would not only increase productivity when a scene gets full and cluttered, but could also be used for debugging.
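rough sketch of what i mean, with made-up Scene/SceneObject types and all:

// Hide everything that isn't one of the chosen types, plus simple
// per-viewport render flags for the "turn stuff off" case.
using System;
using System.Collections.Generic;

[Flags]
public enum ViewportFlags
{
    None             = 0,
    RealtimeLighting = 1 << 0,
    PostProcessing   = 1 << 1,
    ForceLowLod      = 1 << 2,
}

public class SceneObject
{
    public string TypeName;     // e.g. "scenery", "vehicle", "light"
    public bool Visible = true;
}

public static class ViewportFilter
{
    // Hide everything whose type isn't in 'keepTypes' (e.g. {"scenery", "light"}).
    public static void IsolateTypes(IEnumerable<SceneObject> scene, ISet<string> keepTypes)
    {
        foreach (var obj in scene)
            obj.Visible = keepTypes.Contains(obj.TypeName);
    }
}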

another important feature i'd like is thumbnails, thumbnails, thumbnails. i've spent a lot of time with several engines where, when you're populating a scene, you can only select things from a dropdown menu. you often end up placing everything in a scene just to see what it looks like, which is very time consuming. unreal for example has thumbnails for EVERYTHING in its asset browser: meshes, textures, shaders, etc.

Kornman00
November 14th, 2009, 05:49 AM
Neuro, I was asking for input on using multi-touch and other external input devices in a development environment beyond the basic keyboard/mouse combo, not on what an editor should do in general.

neuro
November 14th, 2009, 06:48 AM
well that's your fault for using big words and fancy coder talk and such >:U


in any case, i don't see the need for more than a mouse and keyboard imo.
'scuse the mindfuck