Programmers can talk all about their craft here
FMOD 3.63 released!
Now there's non-blocking song loading and various new bug fixes and optimizations.
IMHO FMOD's license of being free for non-commercial projects is a great way to do this sort of thing. The two main reasons I like FMOD though are: 1. tracker format support, 2. sheer simplicity.
Yay!
pixel shader support in opengl?
you know, i was under the impression the GeForce4 said it could use pixel shaders... and yes it can, but i can only seem to find pixel shader support in DX8/DX9.
in OpenGL the only pixel shader support i can find is the extension GL_ARB_FRAGMENT_SHADER or its similar NV, ATI and other precursors... how are you supposed to use PS1.x in OpenGL if the only cards that support it aren't even in wide circulation yet!?
Kezza: you need to use NV_register_combiners and NV_texture_shader1/2/3, which gives you more flexibility than DX 1.1-3 "shaders".
You need to use these vendor specific extensions to access the functionality of these chips. Same for ATI's..
ARB_fragment_program is only for Geforce FX or 9500-9800 etc.
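For what it's worth, here's roughly how you'd detect which of those paths a card exposes at runtime (a quick sketch; the glGetString approach is standard, but treat the extension-name strings as from memory):
[code]// (on Windows, include <windows.h> before GL/gl.h)
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

// True if the current GL context advertises the named extension.
// Assumes a context has already been created (via GLUT/WGL/etc).
bool hasExtension(const char *name)
{
    const char *exts = (const char *) glGetString(GL_EXTENSIONS);
    return exts != 0 && strstr(exts, name) != 0;
}

void reportFragmentPaths()
{
    if (hasExtension("GL_NV_register_combiners"))
        printf("NV register combiners available\n");
    if (hasExtension("GL_ARB_fragment_program"))
        printf("ARB fragment programs available (GeForce FX / Radeon 9500+)\n");
}[/code]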
Cheers, Burga
dammit,
not only did i write ARB_fragment_program wrong... i have to use vendor specific extensions?
[img]http://www.members.optushome.com.au/hmmklord/hazardClip.jpg[/img]
"Oh... that sucks..."
Something for programmers to laugh at
I work for a large corporate mob and we have some excellent "programmers" *cough*
One of these guys is a "programmer" with approx 7 years "experience" in C, Java and
(unsurprisingly to me) VB. Curiously his title has recently changed to "architect"
which I think must mean he's got some house building skills at a level greater than
his programming ability ;->
Anyway, after setting him straight on arrays actually being exactly the size you
declare them to be in C (he thought they automatically get an extra element added
to them), I was asked to tell him what was wrong with the following code:
[code]#include <stdio.h>
#include <stdlib.h>
#include <string.h>
int main()
{
char *tcaseline;
char *ptrthing;
tcaseline = (char*) malloc(256);
ptrthing = (char*) malloc(10);
strcpy(tcaseline,"");
ptrthing=strpbrk(tcaseline, ">");
ptrthing++;
free(tcaseline);
free(ptrthing);
}
[/code]So if anyone's looking for "creative programmers", or an architect to
remodel your outhouse, just ask and I'll put you in contact with him ;->
Make sure you read the other sections.. http://rinkworks.com/stupid/
Tech support people have it tough.. [;)]
[C++] Automatic singleton template
Being way less than gifted with artistic skills, I won't torture
people with any sketches I've done (no, I'm not being modest,
I really _do_ suck at art [;)])
Instead, here's some code I wrote that implements a reasonably
easy-to-use (and IMHO rather nifty) singleton template class...
Stuff to go in the header file "autosing.hxx":
[code]#ifndef AUTOSING_H
#define AUTOSING_H
template <class Tx>
class AutoSingleton
{
public:
AutoSingleton() {}
~AutoSingleton()
{
delete mptSingleton;
}
static Tx * instance()
{
if (mptSingleton == NULL)
mptSingleton = new Tx();
return mptSingleton;
}
private:
static Tx * mptSingleton;
};
#define SingletonInstance(Type) AutoSingleton<Type>::instance()
#define DefineSingleton(Type) AutoSingleton<Type> AS_ ## Type; Type * AutoSingleton<Type>::mptSingleton = NULL
#define DeclareSingleton(Type) extern AutoSingleton<Type> AS_ ## Type
#endif /* AUTOSING_H */[/code]To use it, add this to your class's header:
[code]class MonsterGenerator
{
public:
// ...
};
DeclareSingleton(MonsterGenerator);[/code]Then to your class's implementation, add:
[code]MonsterGenerator::MonsterGenerator()
{
// ...
}
DefineSingleton(MonsterGenerator);[/code]Then finally to use it in your code, include the header and do the following:
[code]int main()
{
Monster * ptMonster = SingletonInstance(MonsterGenerator)->generateMonster();
return 0;
}
[/code]Got some other geeky code samples? Post 'em ;)
Cheers,
CombatWombat the exhibitionist (umm, that didn't quite come out right, did it [;)])
Yeah, that's alright, but there are a few problems with it: the constructor's still public, which could lead to some trouble, and you can also call the constructor of the object that you're wrapping, so it wouldn't really be a singleton.
I've also noticed that in your case you delete the pointer upon destruction of an element, so the first time an object of that type gets destroyed they all get deleted, and you end up with some bad runtime errors. I think you'd need to implement some sort of reference counting to make that work properly.
Instead of going that way, I made my singleton class be instantiated on the first call to Get()... and this is all it is:
quote:template <typename T> class TSingleton
{
public:
    inline static T & Get( void );
};

template <typename T> inline T & TSingleton<T>::Get( void )
{
    static T singleton;
    return singleton;
}
with the application being something like:
quote:
class SingleFoo : public TSingleton<SingleFoo>
{
private:
friend class TSingleton<SingleFoo>;
SingleFoo();
public:
// blah
};
This allows you then to create a nice singleton class that cannot have multiple instantiations of it in the program, although you do have to make the parent singleton class a friend, so that it can access the private / protected constructors.
Then all you need to do to call functions is SingleFoo::Get().whatever() etc.
(P.S. Shouldn't this be moved to the programming forum?)
Here is my singleton class: you inherit from it, and the instance of the object is created on the first call of GetInstance(), and is automatically deleted when the program closes (if it hasn't been already).
The way I use it is either as-is, or by #define g_pType Type::GetInstance() (which seems so evil :( )
[code]
#ifndef SINGLETON_HPP_INCLUDED
#define SINGLETON_HPP_INCLUDED
#include "Core.hpp"
namespace thang
{
template< typename Type >
class Singleton
{
protected:
Singleton( Type* pInstance ) { ms_pGlobalInstance = pInstance; }
Singleton() { }
Singleton( const Singleton& ) { }
~Singleton() { }
private:
static Type* ms_pGlobalInstance;
public:
static Type* GetInstance()
{
if( !ms_pGlobalInstance )
{
ms_pGlobalInstance = new Type();
atexit( Delete );
}
return ms_pGlobalInstance;
}
static void T_CDECL Delete()
{
delete ms_pGlobalInstance;
ms_pGlobalInstance = 0;
}
protected:
void SetThisInstance( Type* pInstance )
{
ms_pGlobalInstance = pInstance;
}
};
template< typename Type >
Type* Singleton< Type >::ms_pGlobalInstance = 0;
} // namespace thang
#endif // SINGLETON_HPP_INCLUDED
[/code]
Singleton::Singleton( Type* pInstance ) is there as a kind of hack; it allows things to use GetInstance() while execution is still inside the Type's ctor.
quote:Originally posted by Daemin
Yeah, that's alright, but there are a few problems with it: the constructor's still public, which could lead to some trouble, and you can also call the constructor of the object that you're wrapping, so it wouldn't really be a singleton. I've also noticed that in your case you delete the pointer upon destruction of an element, so the first time an object of that type gets destroyed they all get deleted, and you end up with some bad runtime errors. I think you'd need to implement some sort of reference counting to make that work properly.
Sure, I appreciate the issues that you mention. Given appropriate
use, it works nicely for me.
quote:Instead of going that way, I made my singleton class be instantiated on the first call to Get()... and this is all it is:
But how can you get by without using a funky ## operator? [8D]
Yeah, I think your solution is a safer one (esp where inexperienced
developers are involved in a project).
quote:(P.S. Shouldn't this be moved to the programming forum?)
Yeah probably since we're now discussing it :) My bad...
I've never really bothered making a singleton class cause I've never needed to make anything a singleton before. But if the whole idea of it is just to stop people from accessing the constructors and making instances of it...
Why don't you just avoid making instances of it instead?
Also, what type of things are you making singletons of?
lava_monkey: The way you talk about using singletons is the way I was doing it: keeping a single static pointer and only instantiating the object once. While it works, there are certain evil things that can happen. I've found that if you actually have a singleton class it makes it a whole lot simpler and easier to use. Plus then it's automatically destroyed.
CombatWombat: I use the ## operator only when I need to, in the debug macros for instance. And another thing: you're not setting the pointer to NULL after you've deleted it, therefore you could be using invalid memory occasionally.
quote:Originally posted by Daemin
Instead of going that way, I made my singleton class be instantiated on the first call to Get()... and this is all it is:
The more I think about your solution the more it grows on me :)
I'd gone the way I had because I was trying to be quite deliberate
about avoiding problems with the initialisation order of globals but
still wanted to clean the singleton up properly. (ie if you have
your instances as statics of a class then you have no control over
the order they get initialised).
But statics within a function is a much more elegant solution. You
still have control over the order of initialisation.
Thanks for the info on your template idea [:)]
Cheers,
CW
Floating point framebuffer
Hey, I've been thinking about bootlegging these fancy new DirectX 9.0 cards to do some cool oldskool software rendering in a float buffer, and I've designed a next generation concept for my TinyPTC library to work with floating point r,g,b,a pixels.
I've got a brief outline up at http://www.gaffer.org/tinyptc, more specifically the design is sketched out at http://www.gaffer.org/tinyptc/tinyptc.hpp
What do you guys think? Floating point colour: the future of software rendering, or insane crazy stuff? :)
I'm going to shoot myself....now
[url="http://www.sumea.com.au/forum/topic.asp?TOPIC_ID=424"]Who cares? (poll)[/url]
(punches maitrek in the nuts) it can be useful knowing how to program, and it can be useful to do things that other people may have already done themselves, as you might be able to use it for things that other versions simply won't allow.
i occasionally use a float frame buffer, and i used it in an application i wrote and use almost every day, and i look at commercial software that does the same thing horribly (the max5 bake texture is truly horrible). Yes, someone else already did it, but they did it like _SH_T_
i used a float frame buffer to hold rgba and a 32 bit amask, and also had a temp float to hold another coverage related number to save recalculating...
i meant software rendering, i.e. for raytracers, demos etc. doing software rendering to a framebuffer - but it's a float framebuffer, made possible and practical by nice stuff like DX9 cards and SSE2 - floating point hardware rendering is nothing new, i'm just proposing to use that hardware for something different, for hobbyist purposes
btw, i do a lot of software AND hardware rendering these days, i'm by no means arguing that anybody SHOULD do their rendering with software instead of hardware, but if they want to code framebuffer stuff, why not benefit from the coolness of floating point pixel formats - it would be nice for high end renderers, raytracers etc, or for realtime software rendered demoscene type stuff etc etc... :)
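to make the idea concrete, here's a rough sketch of what a float framebuffer plus a final pack-down to a packed 32-bit display format could look like (just an illustration with made-up names, not the actual TinyPTC design):
[code]#include <vector>
#include <algorithm>
#include <cstddef>

// Hypothetical float pixel: four 32-bit floats, nominally in [0,1]
struct PixelF { float r, g, b, a; };

// Clamp a float channel and quantise it to 8 bits for display
static unsigned char toByte(float c)
{
    c = std::min(1.0f, std::max(0.0f, c));
    return (unsigned char)(c * 255.0f + 0.5f);
}

// Pack the float buffer down to packed 32-bit ARGB just before blitting;
// all the intermediate blending/lighting happens at float precision.
void packBuffer(const std::vector<PixelF> &src, std::vector<unsigned int> &dst)
{
    dst.resize(src.size());
    for (std::size_t i = 0; i < src.size(); ++i)
        dst[i] = ((unsigned int)toByte(src[i].a) << 24)
               | ((unsigned int)toByte(src[i].r) << 16)
               | ((unsigned int)toByte(src[i].g) << 8)
               |  (unsigned int)toByte(src[i].b);
}[/code]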
New free 3D Development Studio for Visual C++.
Create top level games or add 3D content to your application.
3DSTATE 3D Developer Studio is based on one of the fastest 3D engines delivering top quality graphics.
Probably the easiest 3D engine to use!
Whether games, tourism, education or anything else - no matter what your project is about, using 3DSTATE 3D Developer Studio you will get top quality results within an extremely short time.
The new SDK is based on 3DSTATE 3D engine version 6.0
Visit www.3dstate.com - what's new section.
I downloaded that product (not due to it being spammed here). It's about as user-friendly
as a cornered rat. I can't speak for the pros but what I find works is to use the Quake/Quake2
engine. It's good for artists since they can import/export quite easily, and there are file-readers
out there to convert levels into a simpler, readable file format for programmers. I suppose
some prototyping could also be done with the scripting language.
Programming AI
I have been doing a decent bit of research on AI. I am very fascinated by this area and think I would like to develop my skills here.
Just wondering if there is anyone lurking around here with some AI experience who would like to answer some questions. Or even knows someone who's working with AI that they could put me in contact with.
As I have found through my readings I have so many questions to ask! I realise there are some great AI forums around the net but I really like the idea of talking with someone locally.....
pathfinding is cool and easy to explain, and a simple chess-type AI isn't far off pathfinding; it uses the same basic idea.
You just gotta sort the paths you've found, and then look at the possible moves your enemy could make X turns ahead, and you've got yourself some cool AI.
All that advanced AI, like neural networks and stuff, don't ask me.
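To put that look-ahead idea in code form, here's a minimal minimax sketch (the Board type and its stubbed-out methods are made up for illustration; a real game supplies them, and you'd normally add alpha-beta pruning):
[code]#include <vector>
#include <algorithm>
#include <cstddef>

// Hypothetical game state; a real game replaces these stubs.
struct Board {
    int score;
    bool gameOver() const { return false; }
    int evaluate() const { return score; }   // static score: + is good for us
    std::vector<Board> nextStates(bool) const { return std::vector<Board>(); }
};

// Classic minimax: look 'depth' turns ahead, assume the enemy always
// picks the move that's worst for us, and pick our best reply to that.
int minimax(const Board &b, int depth, bool maximising)
{
    if (depth == 0 || b.gameOver())
        return b.evaluate();

    std::vector<Board> moves = b.nextStates(maximising);
    if (moves.empty())
        return b.evaluate();

    int best = maximising ? -1000000 : 1000000;
    for (std::size_t i = 0; i < moves.size(); ++i) {
        int score = minimax(moves[i], depth - 1, !maximising);
        best = maximising ? std::max(best, score) : std::min(best, score);
    }
    return best;
}[/code]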
I sent off an email to BioWare and DICE to see what advice they might be able to give, as well as ideas on what they would want to see in AI demos. I got an email back from Ray saying he'd forwarded my email to one of the senior BioWare programmers, who has a PhD in AI.
I will post what I find out here :)
Ok here's the full email I received from BioWare about what to do for an AI Demo:
An AI demo is a somewhat tricky proposition. Personally, I'd say artificial intelligence has, at its basis, a lack of artificial stupidity. Depending on what you code in the rest of the demo, it can be complicated or rather simplistic. Either the demo is complete (fun and
playable) or the demo should be able to illustrate excellence in the field of artificial intelligence.
To elaborate, if you are considering modifying a game such as Unreal Tournament, the AI would have to be demonstrably better than the packaged ones that come with the game. If you are implementing a simple X and O type pattern to illustrate a squad's tactics over a specific terrain, people would expect more from the AI than they would if you had some 2000-polygon characters running around and making decisions, complete with your own sound effects and flare effects.
What sort of design approach to AI do you want to illustrate: top-down design (i.e. with a scripting language and a ruleset/environment that leads to complicated decision making) or bottom-up design (taking fundamentally sane principles and generating complicated decision making through planning/optimization of states)? Top-down design is probably the better approach for a demo ... being able to demonstrate a framework (of your own design) that allows designers to input their rules interests more than just the programmers in a company. Take Lilac Soul's NWScript Generator, for example. It's great at what it does ... avoiding the tedium of writing scripts. However, are there better approaches for designing GOOD AIs? How much control/assistance do you expose to the designer? Give us a toolkit with an artificial character in an environment, and allow us to tweak the AI so that the character dies in elaborate and interesting ways ... eventually learning what to do to get out of the room. If you're going to use learning techniques, illustrate how you can show other people what the AI is actually learning. An AI that is using neural networks or genetic algorithms isn't that useful, unless you can extract the relevant information from the genes or network.
Don't underestimate the fun factor in the demo. Every time the person laughs out loud at what they are seeing on the screen, or is reminded about one of their favourite games, is an interview that you've already landed.
Be prepared to submit full source code of your demo along with the executables, and include a README illustrating what the demo is supposed to show (and what, if you had more time to work on it, would you do in the next two months of working on it!)
I'm an odd case, because my background is in traditional AI "games" such as chess or checkers.
Take Minesweeper as an example, and the "hard" AI level where there are 30x16 squares and 99 bombs. Can you write an AI such that, with the maze generated after the first square is chosen (to guarantee that the first square is never a bomb), it succeeds at solving the maze more than 20% of the time? It's an interesting example of bottom-up design that might impress some people if it was a "visual" test. (For beginner level (8x8, 10 bombs), you should be able to get 75% success without too much effort, and intermediate level (16x16, 40 bombs) isn't much harder with about a 70% success rate.) If you're allowed to make at most ONE mistake, how high can you push the success rates? Can you get the success rate over 50% on "expert" mode if you're allowed to land on at most one bomb? (Most of the time, you find the bomb on the second move of the game ... ;-] )
That's a fairly poorly worded challenge there kezza :P
By logic the second shortest path WOULD be the path that just goes one extra tile and otherwise follows the exact same route. It seems that you want an entirely different path that is also short, but it is doubtful that this completely alternate route would be the second best possible path. :)
CYer, Blitz
Yeah, it's something that a friend asked me... but it completely stumped me. I think one way to do it is to use a tree-based space representation for pathfinding and remove some of those upper level nodes that represent the area near the first path.
However, it's pretty hard for AI to appear intelligent if it is so predictable :)
... sorry about the wording
-- edit "just fixing some spelling :)"
I'm coding the A* algorithm for my game, and currently I'm looking at how to modify the cost function to take such factors into account. Consider that the A* algorithm works around the basic formula: Total = heuristic to goal + cost from origin.
If the heuristic cannot be modified, then we can only really modify the cost side of the function. In the simplest sense, the cost could be either 1 or 0 (passable or not) for a single node in the map. If the cost for each node were instead a float value, adjustable based on a number of parameters, the A* will favour paths which have lower traversal costs. In the case of avoiding enemy units and specific areas, the nodes within the vicinity of those areas can have their traversal costs increased. This can be done using a number of falloff functions, such as squared, logarithmic or constant (i.e. governing the decay of the cost as we move further from the zone). One should find that the A* will path away from undesired areas, and avoid walking too close to enemy units if we want to avoid them!
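As a rough illustration of that weighted-cost idea (the node and threat types here are made up, and the squared falloff is an arbitrary choice):
[code]#include <cmath>
#include <vector>
#include <cstddef>

// Hypothetical grid node and threat source - illustrative only
struct Node   { float x, y; bool passable; };
struct Threat { float x, y; float radius, weight; };

// Traversal cost for one node: base cost of 1, plus a penalty that
// falls off with the square of the distance to each nearby threat.
float nodeCost(const Node &n, const std::vector<Threat> &threats)
{
    if (!n.passable)
        return 1e9f;                    // effectively impassable

    float cost = 1.0f;
    for (std::size_t i = 0; i < threats.size(); ++i) {
        float dx = n.x - threats[i].x;
        float dy = n.y - threats[i].y;
        float d  = std::sqrt(dx * dx + dy * dy);
        if (d < threats[i].radius) {
            float t = 1.0f - d / threats[i].radius;  // 1 at the threat, 0 at the edge
            cost += threats[i].weight * t * t;
        }
    }
    return cost;  // A* then minimises f = (accumulated node costs) + heuristic
}[/code]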
I'm speaking on Cheryl's Student Panel this year at the AGDC, so if my theory is fundamentally flawed, you can throw rotten vegetables at me then! From initial experiments, this kind of method seems to work, although I need to find out the exact limits of operation for the cost function.
cheers
Matt
Hemiware
Guys, go check out http://www.hemiware.com ..
quote:Our flagship product is the Serenity Engine™ - a suite of libraries and tools that provide the complete solution for game development on PlayStation®2 and PC (NINTENDO GAMECUBE™ and Xbox® coming soon) with support for Lightwave 3D (3ds max and Maya coming soon).
..did I mention that it was locally developed? Good. [:)] Download the trial, and give the guys some feedback!
Free 3D development package for Delphi
As quoted at www.3dstate.com - what's new section.
----------
March 24, 2003
Up to now, the only free 3D development package for Delphi users was Morfit SDK V3.0, released in 1999! The new version, released today, closes a 4 year gap and enables strong 3D capabilities for Delphi developers. The new SDK is based on 3DSTATE 3D engine version 6.0.
----------
The download is available at www.3dstate.com - what's new section.
Or follow this link -
http://www.3dstate.com/comV/VC6/Delphi_3D_Developer_Studio6.exe
How do you like your coffee?
It's no secret that we programmers consume more coffee (or caffeine products) per individual in any given period than any other group (obviously with a few exceptions).
Generally each cup of mine is with sugar, a tiny amount of milk and enough bean goodness to drive a freight train for a month [:D].
The question is... how do you like your coffee?
I drink about 2 cups of coffee in the morning (well, afternoon, when I wake up) before uni, then just drink tea afterwards, and then kinda just easily manage to stay awake till 2am each night.
Generally I find that too much coffee is kinda bad for programming, as it'll get me far too finicky and fidgety, so less coding gets done.
Caffeine is good for twitch reflex gaming though :)
I'm a bit of a health nut myself so generally I have a fair bit of energy. "Eat more" is what I say (but do exercise to keep the metabolism going) - generally a better tactic than using caffeine. Drinking heaps of water is good too; caffeine dehydrates you too much, and you lose concentration span.
Another 2c...
Take one mug and add 1 or 2 heaped teaspoons of nescafe gold and 1 level teaspoon of white sugar. Add hot water until mug is almost overflowing. Place on saucer with 3 arnotts nice or mum's choc-chip biscuits for dunking.
If I don't have at least 1 coffee each day, I end up with a splitting headache. =[ Damn caffeine addiction.
Well, my coffee... hehe... it's gotta be percolated coffee, white, straight up. For a quick fix, double shot short blacks.
A hot mocha though is for relaxation coffee.
And a good sized cappuccino for a long discussion.
If coffee is unavailable I'll resort to a Red Eye, V, or Lift Plus. Any energy drink.
Hehe... coffee is important to a coder, though when I have a few scotches my code is so much better. I look at it the following day and question how it was possible, why it works, and why it is so fast. But hey, I don't question it for long, cause it does what I needed it to.
0xBaaDf00d,
--------------------------------------------------------
Too much blood in my caffeine system.
quote:Originally posted by 0xBaaDf00d
Too much blood in my caffeine system.
I always thought a drip right into the ole blood stream would be a much quicker way :)
I am fussy with my coffee so I can't stand "generic office coffee". Normally it's at least a Mocha Kenya, but when I pull out the ole brew pot it's my Folgers coffee. It's a US import, I know - shoot me. Sad part is that even as an import it's still cheaper than most of the coffee here. And it's got the best smell *drool*
I think an average coffee day for me is about 6 or 7 cups of coffee. I normally do 1 small spoon of coffee and 1 small spoon of sugar - both level. I try to keep away from moo milk so I use soy milk.
As for coffee messing up your system, I am proof that it doesn't. I drink a coffee before bed and go right to sleep! I am way past the coffee affecting me stage. I am just feeding my addiction.
It's my one bad habit. I don't smoke or do drugs. My drinking is kept to about once a month at school social functions - Canberra City Walk, be very afraid!
I am the opposite of you, Jacana. I haven't drunk coffee in about a year (it was on a social occasion, so hey.. I was pressured! ;)). If I drink coffee now (say, with just two teaspoons of it), that'll pretty much guarantee me staying awake for another 24 hours... Where I used to work, I did have coffee in the morning at times, but I usually felt so crook for the rest of the day.. Down with coffee!! *boo!!* :) 7 cups of coffee a day! Yikes! :)
There's plenty of pros and cons to coffee in terms of health; it does depend on your body's natural metabolic rate. If you are fit and on the lean side of the body mass index then coffee generally does more harm than good - you should probably have enough energy as it is and you probably don't need the extra kick.
If you are on the not so lean side and don't do much exercise, the increase in metabolic activity can be good for you, except where you over-stress your heart for a prolonged period of time. If you are at the point where your body's metabolism is really dependent on coffee, then don't go cold turkey - it really screws you up - you have to make some pretty big lifestyle changes in order to stay healthy off the caffeine and maintain a reasonable amount of activity. :)
Also it can fiddle with your blood pressure if you aren't careful, but this isn't a huge problem for most people.
If anything it 'statistically' has more effect over a prolonged period of time (like ten years of ridiculous consumption) rather than being able to notice anything in the short term.
This has been a public service announcement :)
Now I can definitely say that Coke and Jolt aren't good for you :)
They contain *too* many simple carbohydrates (there are obviously acceptable levels of everything): not good for consistent blood glucose levels, not so good for the internals.
I can't talk though, I drink way too much Farmer's Union Iced Coffee, and that has more sugar by volume than Coke.
quote:Originally posted by Maitrek
There's plenty of pros and cons to coffee in terms of health; it does depend on your body's natural metabolic rate. If you are on the not so lean side and don't do much exercise, the increase in metabolic activity can be good for you, except where you over-stress your heart for a prolonged period of time.
If you are at the point where your body's metabolism is really dependent on coffee, then don't go cold turkey - it really screws you up
Also it can fiddle with your blood pressure if you aren't careful, but this isn't a huge problem for most people.
Lol :) Thanks for all of that!
I will say that you have hit on quite a few of the right areas. When I had time (and $$$) I was going to the gym. I was told that coffee can be good because it does exactly what you said. Speeds up your metabolic rate.
The down side of the addiction (as you said) is horrid. It can be so nasty to kick! Lovely migraines for a couple of days. Long live aspirin!
As for heart rate - again you are correct. I have quite low blood pressure normally and was prone to blacking out from time to time. Since having more coffee I haven't had a blackout. Says good stuff to me :)
On the bad side of that - you will find that it triggers a fight or flight response in your body, thus the increased heart rate etc. This also causes your vessels to constrict, slowing down circulation. I find that my hands get cold quite easily. Though I had poor circulation to start with!
Coffee is really my one major "bad" in life. I have a well balanced diet otherwise!
Also, I read someone talking about simple carbs! Those are such a no-no! Your system ends up turning them into fat because it can not use them up fast enough. Simple carbs are the root of all evil. I am sure Dr Atkins would agree - if he were still alive. The only really good carbs for your body are complex ones. Slow burn!
I have been trying to watch what I eat a bit. Not a diet :) Just better general eating habits, and carbs have been quite interesting to learn about.
I realise this has totally gone off coffee so I am just going to shut up and go back to coffee now.....
*drinks coffee*
if you are in canberra, the iced coffee at valentinos in civic is the best. i used to hang out with morgan jaffit, game designer at irrational, a while back near the end of the freedom force work, playing go -- *all* afternoon
be warned tho, an iced coffee every afternoon for 2 weeks has the adverse side effect of making you gain ~5 kilos ;)
Working in the Industry
Hey guys, my name is Gav. I met Souri when he first started this project (Sumea) quite a ways back in time; along with a couple of fellow mappers we were invited to display our work here. It has been quite a while since I have been back (moving house and computer problems), but I have just caught up on most of the subject matter in here, and I was wondering how many of you actually work in the programming industry. I have studied programming for several years now. It would be fantastic to break into a role in a company where everything is all hunky dory (just any programming role), but everywhere I look people are asking for experience. For those of you who are working: how did you get started? Where did you get started? And what do I have to do to get a foot in a door somewhere?
Absolute bottom level entry doesn't worry me in the slightest; as long as I am learning and the old grey matter is ticking over, I am as happy as a pig in *&%^.
Lastly, nice to see you all here, hope all is well with everyone
Will check back soon
Gav(FrEeB@LLiN)
Look, I'm in pretty much the same situation as you... not in the industry but wanting to be (as soon as I finish my degree).
However, the advice I have been given is to maintain friendships with people you know who are in the games industry. The other thing is that most educational qualifications don't count at all compared to one of two things: 1. work on a previously released game (good+), 2. personal projects that are finished and complete.
I'm just going by what I've been told though.
I'm in the same boat... studying at uni (I would nearly consider it a waste of time if it weren't for the fact that it's a pretty good resource) and I don't quite have the ability just yet to run off into the industry.
I imagine a lot of people on these forums are at this stage... although some have recently made the transition to actual industry figures (just look for the people whose post rates have dropped) :)
Texturing terrain blocks
I was wondering how people recommend texturing terrain blocks. It's pretty standard to have elevation based textures... but what about roads and stuff that don't fit that scheme?
I would suggest forgoing the elevation based texturing method for terrains; it makes them all look really simple after a while if you are using them for a game, and you'll need to store information about what other objects are where on top of the terrain anyway.
I'd just store the surface properties (texture etc) on each block of terrain (not the vertices, the stuff between 'em), and also within that spot have a few flags or other properties that dictate what "extra features" are placed on that section of terrain, such as a road. Then after the terrain has been rendered, just render the stuff on top of it, so the road is really an object on top of the terrain.
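Something like this per-cell record is one way to picture it (the names and field widths are made up for illustration):
[code]// Hypothetical per-cell terrain record: one per quad of the heightmap
// (the "stuff between the vertices"), not one per vertex.
struct TerrainCell {
    unsigned short textureId;   // which surface texture this cell uses
    unsigned short overlayId;   // index of an overlay object (road piece etc)
    unsigned int   flags;       // bitmask of CellFlags below
};

enum CellFlags {
    CELL_HAS_ROAD = 1 << 0,     // draw a road object over this cell
    CELL_BLOCKED  = 1 << 1      // impassable for pathfinding
};[/code]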
Lua 5 is out
If you didn't know, Lua 5 was released last week.
I created a vc6 project for it http://members.optusnet.com.au/redwyre/files/lua-5.0-vc6.rar (Project files & source)
I also compiled release binaries that use the MSVCRT DLL http://members.optusnet.com.au/redwyre/files/lua-5.0-win32.rar
If you want debug binaries you will have to download the source and compile them yourself
if anyone doesn't know what Lua is: it's the best. It's a scripting language designed to be used through something like C/C++, not designed to be standalone and then hacked to be used by something else (*cough* Python)
and it's very low level and powerful; writing a good C/C++ wrapper for it isn't that hard, and there's a few out there already.
it's my scripting language of choice.
I've checked it out briefly, but currently for my projects I am developing my own scripting language and an associated set of utilities. I'll release it when it's finished to an appropriate level.
I would be interested in looking at stuff like Lua, Ruby, Tcl, and Python more, but I don't have enough free time at the moment.
Lua is designed to be embedded into a host program, as lava_monkey says. It is small, fast, and very simple and clean (is the way that you make me feeeeeel toniiight... er.. *cough*), and is gaining popularity very fast as a game scripting engine. It can also be used on Xbox or PS2, except for the garbage collection problems, which are top priority for 5.1.
As a measure of how easy it is to integrate, I had it running in less than a day after writing a simple wrapper for it.
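For anyone curious, the bare-bones embedding really is only a few lines. A minimal sketch against the Lua 5.0 C API (from memory, so double-check it against the headers in the distribution):
[code]#include <cstdio>

extern "C" {
#include "lua.h"
#include "lualib.h"
#include "lauxlib.h"
}

int main()
{
    lua_State *L = lua_open();   // create an interpreter state (Lua 5.0)
    luaopen_base(L);             // expose the basic library (print etc)

    // load and run a script file; both calls return 0 on success
    if (luaL_loadfile(L, "hello.lua") || lua_pcall(L, 0, 0, 0))
        printf("lua error: %s\n", lua_tostring(L, -1));

    lua_close(L);
    return 0;
}[/code]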
Let me see if i got this right...
step 1: build lua lib files
step 2: make project that uses them
step 3: scream in pain as it causes linking errors with msvcrt.lib... for stuff it shouldn't
step 4: failing to fix step 3, perform the strong-bad patented computer desk vault on your keyboard.
no seriously, is it meant to be this hard to get an interpreter that can write hello world to work?
my personal preference is ruby, which i use for a game i'm developing at home -- very nice c++ integration, and a very high level, elegant language.
we used python for freedom force, the next best choice imo -- but lua does seem to be a bit lower level than what i want. i'm sure it's much faster and better suited to embedding in a game than either python or ruby, but how ELEGANT is its syntax? is it OO, for example?
cheers
GameDev have a new article on lua.
[url]http://www.gamedev.net/reference/programming/features/lua/[/url]
It's targeted at people who have never used lua before...
If you don't link with msvcrt (and why the hell not??), then you will have to go to http://lua-users.org and find the win32 source there. Otherwise you will get lots of linking errors.
My few words on AuranJet
I'd like to point out a few things about AuranJet: firstly, a lot of people are misinformed about it, and secondly, there are some flaws in it (which have really made my day painful).
AuranJet isn't what i'd consider a "game engine"; it's more like DirectX on steroids. It basically provides a great many common functionalities that 99.9% of games require but that aren't part of any low level API (scenegraph, skeletal animation, archive file formats + transparent access, etc).
I have a few gripes with Jet,
1. the current JetApi can't compile under VC7
2. the documentation is awful (thank god for the super active forums)
3. some aspects of the api 'encourage' hard-coding parts of your game; the worst one i've seen yet is the default animation system, which deals directly with files
4. using stl in a program that uses jet is quite an ordeal
5. you will ALWAYS get link warnings... you cannot escape them!
aside from those 5 problems, i fully endorse jet as an awesome thing... full of goodness to a degree that meets my highest expectations.
For the time that I've worked with Auran Jet I've come up with a few gripes as well:
1) Jet deals almost exclusively with files, thus it's data-driven, which is a good idea sometimes but bad in so many others. This restricts the user's (programmer's) ability to create game data on the fly (program generated textures, for one thing).
2) The documentation is missing quite a few things, and considering it's the *official* API that's a serious issue. You need to actually look through the header files to find out fully what is supported etc.
3) Some of the lower level / common functionality is missing - removing / deleting directories in particular - not that you'd want to use it all the time, but the API doesn't feel complete without it.
4) The graphics system is tied too closely to the configuration file structure and options, as in you can't change it from inside your program, which is irritating.
Overall though it's not that bad to work with; the User Interface section is quite easy to learn, and it has most (all?) of the necessary container types (so there's little need to use any STL stuff). Even though it cannot compile with VC7, I'm sure that everybody who has VC7 also has VC6, so I don't see that much of a problem.
Can't wait to see what they do to V2.0 of Jet.
What's the gamecube like to work on?
I hear the PlayStation 2 is a headache and the Xbox is rather good; what about the Cube?
A devkit is the machine you run the compiled programs on. It comes with an accompanying SDK, and I'm pretty sure you've gotta supply your own IDE to work with. Devkits usually come in several flavours - for PS2, there's a TEST, which looks physically a lot like a normal PS2 (I haven't played with those too much - I think the main diff is that they can use burned CD/DVDs for testing), and a TOOL, which is basically a normal computer that communicates with an internal PS2 (TOOLs are your main debugging machine); Xboxes have green Debug machines and white SDK ones (the white ones are the über machines, I've only played around with a debug kit); and I've got next to no idea about GCNs, the ones I've seen are big blue boxes.
Touch typing
Does anyone here touch-type, and if so, is it helpful in coding?
I've been meaning to learn...
Touch typing is quite a good skill to have. You can look at what you're writing as it appears on the screen and consequently fix mistakes (and once you get good you don't even have to look at the screen to know you've fucked up), and it's generally quicker than key hunting. I originally taught myself my own style of touch-typing, then they made me learn the proper way at TAFE, so I combined the two and have my own style again. My WPM count could probably go even higher if I touch typed normally instead of my half-bodge way, but I'm too deep set in my ways to change now... :P
quote:Originally posted by Daemin
I don't really touch type, but when I want to type something I can do it without looking, simply because over the *years* I have taught my fingers where the keys are in relation to each other, so now I can just type without looking at the keyboard. I guess practice makes perfect :-)
yeah, that's what I do.. but I always make mistakes..
And rezn0r, if you aren't designing as you are coding you can code a lot faster :)
how so? wasn't QWERTY established as the standard keyboard because they had some big competition to see which was the fastest and the guy using QWERTY beat the pants off the other guy?
(and then he stole the pants and ran far far away).
Actually, my opinion on this is you'd be faster with whatever you learned to use best... unless the keyboard is totally useless, but even then skill could compensate.
You can find them on the net in lots of places - DVORAK layouts, that is.
The base premise is that it puts the least often used keys on the bottom row of the keyboard, the most often used keys under your fingers, and the second most often used at the top (or it could be most often used at the top etc, I can't remember exactly). And they arrange it in such a fashion that your fingers don't move too much when typing. Just looking at my hands now, my fingers are jumping all over the keyboard while typing; on DVORAK they would barely move from the "initial position".
I guess the battle between DVORAK and QWERTY is like a battle between 80x86 architectures (common as QWERTY, and just as complicated) and faster RISC machines (simpler, faster, although struggling to be seen). But that's another story altogether.
I personally want to get a fancy keyboard that can switch between QWERTY and DVORAK with the hit of a key; now I just have to get the money for it - and find the site again.
',.pyfgcrl/=
aoeuidhtns-
;qjkxbmwvz
takes about 2-3 months to switch and get productivity back, but it's a lot faster and more comfortable to type on than a qwerty
notice all the common letters are in the middle row, and you alternate left/right hands most of the time -- vowels tend to be on the left, consonants on the right
it's an awesome layout
TY Level design
Hello,
I was wondering if anybody knows how level design was done
in Ty. I would like to write a game like that, but am unsure
how to move forward.
Could you just design a level in 3ds and import it?
Would I have to develop my own level editor?
How is the landscape done? The mountains / roads you can run up and down on?
Basically, what would be needed to accomplish this type of game?
cheers
I'm not going to talk specifically about Ty, but in answer to some of your questions...
"Could you just design a level in 3ds and import it?"
Yes, but for a large level it will be inefficient to display the entire level every frame, and things will be veeerry slow.
"Would I have to develop my own level editor ?"
No, but you would likely want to write some sort of level loading function to break the level into bite sized chunks, so you can cull parts of the level that aren't visible (see the sketch below).
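As a rough illustration of that chunking idea (made-up types, with a simple distance test standing in for proper frustum culling):
[code]#include <vector>
#include <cmath>
#include <cstddef>

// Hypothetical level chunk: a bite-sized piece of the level geometry
struct Chunk {
    float cx, cz;             // centre of the chunk on the ground plane
    float radius;             // bounding radius of the chunk
    void draw() const {}      // stub: render this chunk's triangles
};

// Draw only the chunks near the camera. A real engine would test each
// chunk's bounding volume against the view frustum instead of a radius.
void drawVisible(const std::vector<Chunk> &chunks,
                 float camX, float camZ, float viewDist)
{
    for (std::size_t i = 0; i < chunks.size(); ++i) {
        float dx = chunks[i].cx - camX;
        float dz = chunks[i].cz - camZ;
        if (std::sqrt(dx * dx + dz * dz) - chunks[i].radius < viewDist)
            chunks[i].draw();
    }
}[/code]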
"Basically what would be needed to accomplise this type of game."
Oh, about 5 programmers, 5-10 artists, and about 16 months :P
CYer, Blitz
Who cares? (poll)
I'm going to start the most common type of server space wastage: I'm going to poll the populace of this thread.
Three questions, with a few minor queries - so that people might think about elaborating their posts - although simple yay/nay responses would also be appreciated given the general inactivity in this thread. Even if you don't know what the crap I'm asking of ya, at least pipe in to say that you think I'm a tool :)
1. x86-64 vs IA-64 - not which one is better, but whether you actually care or think it affects you or your programming in any way. Has assembly for all intents and purposes finally been slain by high level languages?
2. Floating point colour representation vs 32bit RGBA integer colour representation - Do you actually care, which do you prefer?
3. Visibility determination - is it soon to be a thing of the past, or still a valid part of any 3D environment parsing software? Octrees, quadtrees (+ derivatives), portals - what do people think the future of vis-det is?
Okay, obviously I haven't managed to create any controversial flame war; that's a good start. But perhaps I'll try to kick start some debate by stating my opinion on the first question.
x86-64 vs IA-64: does it affect me?
Personally, I think it does, but only to a certain extent. I have a bit of a problem myself with wanting to stay close to the machine level. A lot of coders don't like this because, admittedly, it's very hard to maintain the code in a reasonably manageable state whilst sticking close to the machine. x86-64 is a very good extension to the current x86 and IA-32 instruction set; it's not a RISC architecture, which appears to be the way everyone is going nowadays, but it's still very workable, very backwards compatible, and friendly and familiar to me. It's also very fast, and addresses all the problems I have with the current architecture. IA-64 is a pretty radical change: it supports all the old instructions, but it treats them rather separately to the new instruction set. It too solves a lot of problems with the old architecture, but the fact is it makes some of my old code very slow, because that code was made for the old style dual pipelined Pentium processor and sometimes multi-pipelined floating point unit, and the data structures were optimised for and reflected that style of architecture, even in a high level language.
As a coder, sometimes I have a tendency to use assembly simply because the compiler just doesn't compile an inner loop the way I want it, or it does some floating point instructions in an inefficient manner. I can't say I like having to do this all that much, because compilers overall do a very good job of optimising code, and it's a bit of a pain to use assembly. Changing the ISA radically means I have to change all that code; it also means changing the way I code to better suit the many registers we now have and the 3 levels of cache possible, and the fact that code size is now far more important on a RISC - this mainly applies to the IA-64 architecture.
That is (in short) how I think the changes will affect me if we radically change the architecture, and the same statements apply whether we go with x86-64 or IA-64, though maybe less so with x86-64. Basically I have to learn the best way to code for these processors, which isn't that bad, but I was just starting to get good at the old ones. And if we end up with double standards, then I'm in an extra bind, because it'll bloat the executable size with different levels of support for the two styles of processor.
My thoughts...
1. Any change in architecture will affect performance code - just look at the multitude of common, high-level tricks employed to take advantage of modern processors (explicit parallelism, padded data types, loop invariance etc.)
While the established techniques will carry through to x86-64, the IA-64 architecture will introduce new problems. Much of its primary focus is on pushing the limits of compiler technology to produce fast code - this means we will have to find the best way to write our programs to assist the compiler in taking advantage of these optimisations.
Having said this, compilers still have a long way to go before totally eclipsing the power of low-level languages.
2. Floating point is, without doubt, the way of the future. With the advent of shaders, and other hardware tech, it's becoming more difficult to justify using a discrete representation when most of our calculations require more accuracy.
3. From my limited knowledge, it depends on how powerful hardware gets. I get the feeling vis-det will soon become an unnecessary load on the CPU, and should probably be handballed off to the gfx hardware.
For the response to the second question.
Hardware support for floating point colour representation should come sooner rather than later. Although it takes up more space to store these details in memory, and permanently storing floating point colours in our image formats will have hard disk space implications, it's still a far more accurate way of representing colour.
Also it brings up the issue of colour mode independence, so that it doesn't matter what video mode you are in, we only have to handball off the one format (floating point).
The only drawback I see is the large amount of floating point operations the GPU will have to cope with. I think, however, a consistent colour format will greatly reduce conversions between integer and floating point; it will reduce alpha mapping and bilinear filtering pixel format conversions, and all sorts of other time could be saved. So from a performance point of view it'll definitely break even, at least look a lot smoother, and I would imagine decent implementations will run faster.
The amount of texture memory, hard disk space, and even removable storage space has increased a very large amount, and I don't think there is any real issue with storing floating point colour representations in image formats as well. Sure, we might in the short term be restricted in the number of textures we can fit into the graphics card's memory, but it'll only be a matter of time before we have more than enough again...
1. x86-64 vs IA-64 - These days it's not about speed, but productivity. The only way this affects me is if the compiler is immature and doesn't know what it's doing.
2. Floating point colour representation vs 32bit RGBA integer colour representation - This is great. Floating point buffers allow so much precision. You would only use FP buffers with pixel shaders and as the back buffer; you don't need any more precision in your textures. FP buffers allow you to do multi-pass rendering that doesn't lose accuracy, which will improve image quality. You won't see more colours, but the ones you do see will be smoother and calculated more correctly.
3. Visibility determination - well, since DirectX9, your hardware can do this now :) Occlusion and culling will always be a part of game dev, because the number of triangles keeps increasing.
I'd say that we only really need floating point colour representation for the intermediate stages of the pipeline, since textures would take too much space if they were RGBA 32-bit floating point (that'd be 16 bytes per pixel, far too large). And we will not need to store the final image (that outputs to the monitor) in 32 bit floating point either.
One final point to make is that 32 bit floating point is slow - 24 bit is much faster - as witnessed in the Radeon 9500+ vs GeForce FX battle.
I think it's similar to the days of 3DFX's problems with 16bit - floating point is a far more robust pixel format for rendering purposes. Another question you have to ask is: is floating point slow on the GeForce FX because it's having to convert lots of 32bit values to floating point and back again? It's part of that self-fulfilling prophecy type garbage - current software is designed with 32bit RGBA in mind; does that hurt the performance of floating point colour hardware? Converting integer to floating point is a rather slow operation, and obviously floating point math is slow(ish) compared to integer math, but there are obvious benefits in terms of the rendering pipeline (multi-pass, blending) and image quality that floating point is far better for than integer math.
Nokia Development
Hello All,
Has anyone here tried programming for the Nokia N-Gage or Nokia phones (the dev kit is the same), using VC++ 6.0? If so, did anyone have problems using the dev kit under Windows XP?
Thanks
Sam
I personally don't have any experience programming small devices (yet), and I might get into it depending on what the future holds (who knows?).
But a friend of mine has done development for Nokia phones using the Java Micro Edition SDK and some apps that emulate the phone that he has, and I think he actually got it to upload to his phone too.
I would like to get my hands on the development kit and see what it can do and is capable of, no doubt that we'll start seeing funky demos like those from the last century on these next generation phones.
I await the future.
Be warned though that the download is something like 126 meg, with no resume support on the download. I got it working by doing some real dodgy work - but now there are a few errors with the emulator that I have. The reason I ask is that there is a huge market at the moment for phone software and games, and these can be distributed via Nokia for no up front charge - so it would be a good way to make money making games for phones.
Sam
Lead Programmers
Just curious about this...
I always had this idea that the lead programmer was one of the more competent programmers in a group.
Now, after talking a bit more with some people, I am getting the impression that the lead programmer has to be competent but might not be the best programmer on the team. That it's more of a team leader type role (goes back to project management theory) where the person has a more sound general knowledge about things...
Would anyone care to shed more light on this?
-I spent my Valentines Day getting drunk with 40 guys!
All of the good programmers in a development studio are usually labelled "Senior Programmers" (even though most of the senior programmers that I spoke to at the AGDC were under 30). The "Lead Programmer" position is usually given to someone with a reasonable amount of programming experience, as you might expect for a senior programming position, but who also has a lot of management skills.
The lead programmer position is more that of a manager, and thus they fit into the lead designer, lead artist, producer group of people.
Lead programmer is definitely a personnel/project management position, and I guess it's your responsibility to report to all the other "lead" people in a team. Extensive programming knowledge (and I guess *a lot* of experience) is mandatory for that kind of position, but as for being the most "skilzed" coder in a group, I don't think it's a necessity, though it may be the case sometimes.
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
Aside from the managerial role, the lead programmer is usually one who knows enough about all areas of programming, but not necessarily the most about each area. I tend to go to the lead programmer when I have questions about implementation that may affect other areas outside my expertise. Usually the lead can say whether or not it definitely does affect other areas and what consequences it may have, and bring in the other coders who are more expert in that area of the code.
I've found the best leads are ones who know a lot about everything but aren't the best at everything. The worst leads try to do everything themselves instead of delegating, because they think they can do a better job.
Games programming courses in Melbourne
Does anyone know anywhere in Melbourne that has a programming course oriented around programming for games, or any good 1 year programming courses that teach C and C++?
hey Crazy,
I think it's RMIT that delves into it a bit. While the course itself isn't games specific, I remember reading that one of the classes in the course actually had people from Torus come and lecture.
Also keep an eye on Monash. One of my instructors at Swin TAFE went to Monash part time. She was telling me they were trying to get a games course up.
And Daemin - not everyone does well self taught. I am glad you think buying books and learning it yourself is good and all, but you also very much miss out on the social aspects of school when you do that.
I went to school to learn how to program because doing it myself didn't make sense.
"Yes I Code"
Shirts for AGDC 2003:
http://www.thinkgeek.com/tshirts/ladies/5b3d/
http://www.thinkgeek.com/tshirts/ladies/38f0/
http://www.thinkgeek.com/tshirts/ladies/38ed/
I agree with Daemin: you can learn a bit in a class (not usually much), but all your career you have to teach yourself anyway, so get used to it now.
Not sure what text books to get... I guess that depends on what point you're at. I'd recommend "Programmer's Reference C/C++, Second Edition" - great reference.
Daemin can probably tell you some great books to get; I just stick to free books and tutorials mostly.
The funny thing is I learned pretty much all I know from just "experience" - that is, reading articles (especially off the internet) and actually programming a lot, even if nothing complete comes out of it for the first few months. The compiler, the warnings that it gives, and the descriptions in the corresponding help file are a great learning aid!
I would suggest trying to find some relevant C/C++ books in your local library first if that is at all possible. Other than that, try to find some decent C beginners' articles on the net; they're literally everywhere.
And Jacana, it's true that some people learn well at school, but I think what lava_monkey said is that throughout the rest of your career you won't be learning things from a school. I would however like to add that it's beneficial to do a CS degree anyway, like I am, so that you can learn the other things like maths and physics that are also required as part of a career in computer science (games programming, engineering etc).
I think regardless of how you learn, you are still going to have to relearn almost everything to fit in with the style of whoever you end up working for. Also, I find a lot of the basic C/C++ books I have looked at tend to have some very bad coding practices.
Just to add - if you do everything on your own (alone) you will miss out on a fair amount of social time. By that I mean being able to get recommendations from teachers, learning different ways to do things from peers, etc etc.
The social side of things was brought back up again recently. One of the tutors at school made a comment about how they can't really teach people to be social. I think the hope is that by mixing personalities in a classroom, people might pick up on some of the social aspects :)
-I spent my Valentines Day getting drunk with 40 guys!
Would have thought it would be better to go for a 'vanilla' Comp Sci degree, and find out for yourself how
to apply the material to games (like many do, myself included). It'd be a better regarded qualification IMO
than a diploma from some Sonic the Hedgehog school. There is that Atari school.. do you really want Atari
certification? Have you played Temple of Elemental Evil lately? *weg*
I wish I went to a Sonic the Hedgehog school; then I would be able to run really fast and curl up into a ball and run over monsters and save the day!
Learning a programming language is eeeeeasy. It is just a matter of remembering simple grammar and structure really, and a very small vocabulary.
Learning to... engineer software (or program, or code) is a much more difficult task. It is not something you are likely to learn well simply by going to a 1 year C/C++ class. It is not something that can be taught very easily or quickly IMO; as stated previously by Daemin and lava_monkey, it is something best learned through experience.
If you want to learn how to program (games or anything else) I would suggest doing something like a CS degree, which runs over a long period of time. This gives you plenty of time to "experience" programming, while also giving you the opportunity to study maths, graphics, software design etc.
Anyway, if you're just after a class that will teach you the language, there should be plenty of TAFE courses that do CERTs in C/C++ lasting about a year (they may be called info tech or software engineering certificates...). If you want to learn to program, see above for my NSHO :)
CYer, Blitz
Engine Experience?
Do any other coders on Sumea have experience with the available commercial game engines - Auran Jet, Torque, Quake3, Unreal?
And if you have experience with it, how would you rate the engine / package?
Hey there,
Earlier this year I was hoping to do some work with a company in the US. I haven't heard much from them lately, but they were going to use Renderware as their base engine. The guy at this company told me to drop his name to Criterion, who then sent me the graphics engine for the PC version of Renderware.
I found it quite useful - there is a choice of DirectX or OpenGL as the base graphics platform, there are plenty of examples to look at, and a great amount of tutorial material and documentation to read through - there are something like 3 volumes to the user manual.
Sam
Anyone wanting to play around with a commercial engine, without having to pay or sign NDAs or such, should look into The Nebula Device, by Radon Labs in Germany.
It's a full game engine written for a game that these guys made, Project Nomads. The interesting thing about it is that it's totally open source.
I think the way it works is that you can use it for whatever you like, just as long as your project is open source too, while they can still sell it with their games.
It's a good way to start looking into a game engine.
Auran Jet is mostly good. There are some really well thought out parts. I suggest looking through their docs, I based a bit of my engine on theirs, and I keep finding solutions in their code :)
GarageGames Torque - I got an educational licence for this to use while I was at Qantm, but never did. What I saw of the source wasn't as nice and clean as most of Jet's, but it is a very nice engine and has a lot of good features. And there is a HUGE community writing addons and features and fixes.
Unreal is probably the best FPS engine there is. Tim Sweeney is a GOD. I found some old Unreal source and the work that Sweeney has done is amazing. I've yet to look at UT2003, but I'm sure it's only better :)
x86-64 vs. IA-64
...for the sake of fostering more interaction between fellow code monkeys, as this forum seems almost eerily quiet...
What does everyone think of the upcoming 64-bit processors from both Intel & AMD? Apart from the fact that they're good.
I've only had the opportunity to check out AMD's x86-64, which seems pretty sweet - flat memory addressing & SSE2-style extensions are a great step forward, and crucial if AMD is to catch up to Intel.
Or is assembly so dead that, as far as programmers are concerned, they could release the next generation of PCs as 4-bit beasts, laugh in the face of L1 & L2 cache, and we wouldn't bat an eyelid?
More inane topics to follow every week until somebody talks to me! Anybody! Please!
-Soul
I'd have to read up on this topic a bit more, perhaps someone could post up the links to the relevant sections for the AMD and Intel 64 bit chips.
Personally it doesn't really matter to me when they bring out 64 bit chips. I'd be happy with only a 32 bit processor, but one that was massively SIMDised, so that instead of operating on just one piece of data at a time it'd have multiple data pipelines chewing through data in parallel. Or maybe I'm thinking of a multiprocessor system on one chip, with multiple pipes that could run multiple threads at once.
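Actually, the SSE intrinsics in VC++ already give a taste of that on 32 bit chips - here's a rough sketch (untested, from memory, so treat it as an illustration only):
[code]#include <xmmintrin.h> /* SSE intrinsics */

/* Add four floats in one go instead of looping one at a time. */
void add4(const float *a, const float *b, float *out)
{
    __m128 va = _mm_loadu_ps(a);            /* load 4 floats (unaligned) */
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb)); /* 4 additions in parallel */
}
[/code]Same idea as the multiple data pipelines I'm on about, just limited to 4 floats at a time.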
Although one thing I should add is that we should get away from the 70's chipset architecture and strive for something more in keeping with the times. So far I think only the P4 has gone this way, interpreting the 80x86 commands and running them on its own instruction set.
One nice thing about the 64 bit processors is the amount of RAM that could be addressed, mmm, 2^64 bytes (16 exabytes) of RAM... nice.
Blabbing now so I'll stop.
Actually the Athlon, AMD K6 and the Pentium Pro onwards all had internal RISC machines. The Athlon's was based off DEC's Alpha, same as the Pentium Pro's... although the Athlon also copied the Alpha's EV6 bus and implemented a more superscalar FPU than the Pentium Pro (and derivatives).
The K6 was a NexGen CPU in disguise, which makes it the first x86 compatible CPU that used a RISC core.
I played poker with a Tarot deck the other night. I got a full house and four people died.
--Steven Wright.
Internal RISC - aren't you just talking about micro-ops there? Essentially it's still a CISC architecture for all those chips, except that they decode the more complex instructions into a series of small fetch-compute-store micro-ops (or micro-instructions), hence the RISCness of it.
I believe that's not the same as a pure RISC processor that interprets CISC 80x86 instructions, like the P4.
The P4 uses the same technique as the NexGen K5; in fact there were earlier attempts made by IBM. They wanted the PowerPC to succeed so badly that they made a botched attempt to get it to handle x86 code.
Remember the Transmeta CPU does the same thing, except in software, and it's a VLIW processor akin to the Itanium.
I played poker with a Tarot deck the other night. I got a full house and four people died.
--Steven Wright.
I think if you read Intel's IA-32 documentation it states that the P4 still uses a CISC decoder (producing what they call micro-ops), seeing as it uses the Intel NetBurst micro-architecture, which is basically just a multi-threaded pipeline specification with some catch phrases, plus that out-of-order execution which I like the sound of.
As for the IA-64 v x86-64, I've got no definite idea, but it definitely looks like AMD have had to play a bit of catch up lately, but look to have the 64 bit processor of the future.
SSE2 is important, and 64 bit addressing is good to hear too, as we start to breach that 4 gig memory limit that we used to think was oh so much.
As far as assembly concerns go, I imagine it's going to screw a lot of old code up in terms of efficiency, and it's going to give compiler and OS developers a headache and a half :)
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
Anyone here got large amounts of assembly language code in their programs? I mean - can someone clear up this misconception if it is one - how on earth is the IA-64 going to cover all the old packed instructions like SSE/MMX if it's a RISC architecture? Seeing as it's going to have (virtually) 127 integer registers, surely we are going to have to kill off any previous MMX/SSE/3DNow assembly code we wrote and rewrite a bunch of "implicitly parallel processed" C/C++ code, or whatever language you use? This is obviously a pretty big pain in the arse if you've written a very large backend that has some machine dependent code in it for the purposes of speed.
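For anyone who hasn't written that sort of thing, this is roughly what I mean by machine dependent packed-instruction code - a sketch in MSVC-style inline assembly (from memory, so don't quote me on it), and exactly the kind of thing that would need rewriting:
[code]/* Add 8 bytes at once with MMX. */
void add8(unsigned char *a, unsigned char *b, unsigned char *out)
{
    __asm {
        mov   eax, a     ; pointer to first operand
        mov   ebx, b     ; pointer to second operand
        mov   ecx, out
        movq  mm0, [eax] ; load 8 packed bytes
        paddb mm0, [ebx] ; 8 byte-adds in one instruction
        movq  [ecx], mm0 ; store the result
        emms             ; clear MMX state before any FPU code
    }
}
[/code]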
How good are the compilers going to be at taking advantage of these things? Sure, it's a tired question to say "the compiler isn't good enough", but surely we were getting pretty good at compilers for the current range of chipsets; we'd have to completely change their structure now to keep up the same level of code efficiency on IA-64.
And what about the bloating of the compiled code too? Obviously there's the overhead of 64+ bit instructions instead of 32 bit ones, but the compiled code will also be longer, since a RISC architecture has to issue more of its simpler instructions to do the same work.
Anyone got a quote on how much L1 and L2 cache we'll get on these chipsets to make up for that?
Personally I'm leaning towards a preference for the x86-64 architecture due to backwards compatibility.
(cue Spartacus style responses plz? Or don't people attend this forum?)
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
Being Intel, I'd imagine they'd include the SSE instructions at least, since technically the SSE and SSE2 instructions are made to operate on a vector of data in parallel. I'd say they'd dump MMX, since that's what the RISC instructions would be doing anyway, just much better and much faster.
Also, even though it's a 64 bit machine, an individual instruction might only take up about 16 bits in total, including the encoding for the different operands used etc. So most likely you'd also pack several operands within the full 64 bits.
(Oh well, I guess we'll learn more of this in Computer Architecture eh?)
It seems that the IA-32 instruction emulation on the IA-64 chipsets is Pentium III based, meaning MMX is supported, as well as SSE1 and SSE2, when operating in IA-32 instruction set modes.
There seems to be a huge amount of mess when it comes to running in this IA-32 mode, in terms of the level of protection and things like that. It's got protected, real and VM86 memory addressing modes, and all sorts of register mapping going on. To me it seems like they've really sorta copped out on their plans to create a real RISC architecture - except, of course, for new programs that run purely in IA-64 mode.
Intel are trying to dig themselves out of a hole they made fifteen years ago and I think it might be a wee bit late.
To me it seems the x86-64 architecture doesn't need the old instruction set emulation at all - i.e. it's fully backwards compatible - don't know for sure though. Although I'm probably AMD biased, so don't take my advice.
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
DirectX 9 ?
I'm wondering how everyone is finding DirectX9 so far - is it easy to work with, how are the shaders, and is it worth upgrading if you have an older (or ancient) video card? And has anyone used it yet with any language apart from C++ and VB? (Like C# etc?)
I personally haven't used it, and am hesitant about downloading and installing it.
I ported all my code to DirectX 9. It seems to run a little bit faster. I haven't looked at the shaders at all, but the code changes to get your program working under DirectX 9 are minimal. Take a look at this article I wrote to see what I encountered moving from DX8.1 to DX9: http://groups.msn.com/BrisbaneGameDev/migratingfromdirectx8.msnw
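To give an idea of how minimal the changes are - most of it is just the 8s becoming 9s. A from-memory sketch (check it against the SDK headers):
[code]// DX8:
//   #include <d3d8.h>
//   IDirect3D8 *d3d = Direct3DCreate8(D3D_SDK_VERSION);

// DX9 - same call, new digit:
#include <d3d9.h>

IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);

// The device, texture and buffer interfaces all rename the same way
// (IDirect3DDevice8 -> IDirect3DDevice9 etc), and a handful of
// functions gain or change parameters - the article covers the ones
// I ran into.
[/code]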
Ethan Watson, teh brand spankin' new Krome employee.
Daemin, you should already know what happens when you take an ancient video card and run DX9 on it - I lost 3000+ 3DMarks when I 'upgraded' to DX9 from DX8.
Of course, there are other factors. Old operating system, new card etc etc....anyone else had this happen to 'em?
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
Maitrek: Yeah, well I thought this would be a nice topic to get the programming thread up and running actively :-P
So far it looks like I'll still be using DirectX8.1 for all my things until I get a better computer.
From what I have read it seems that DirectX7 has made a comeback with its DirectDraw component, for all the people still wishing to make 2D games easily. I doubt anyone here's used this feature so far.
The way i hear it, using DirectDraw for 2D games is a bit easier, but it's much slower on accelerated hardware than using 3D quads, because gfx cards these days are optimised for drawing polygons rather than blitting blocks of screen space... So unless there was something really fancy you wanted to do, using DirectDraw would be fine.
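Drawing a sprite as a pretransformed quad is only a handful of lines anyway - something like this (a D3D9-style sketch, untested, names from memory):
[code]// A 2D sprite drawn as a textured quad. XYZRHW = screen-space
// coordinates, so no matrices are involved.
struct SpriteVertex
{
    float x, y, z, rhw;
    float u, v;
};
#define SPRITE_FVF (D3DFVF_XYZRHW | D3DFVF_TEX1)

void DrawSprite(IDirect3DDevice9 *dev, IDirect3DTexture9 *tex,
                float x, float y, float w, float h)
{
    SpriteVertex quad[4] =
    {
        { x,     y,     0.0f, 1.0f, 0.0f, 0.0f },
        { x + w, y,     0.0f, 1.0f, 1.0f, 0.0f },
        { x,     y + h, 0.0f, 1.0f, 0.0f, 1.0f },
        { x + w, y + h, 0.0f, 1.0f, 1.0f, 1.0f },
    };
    dev->SetTexture(0, tex);
    dev->SetFVF(SPRITE_FVF);
    dev->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(SpriteVertex));
}
[/code]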
CYer, Blitz
I've noticed some good changes already going from 8.1 to 9.
I'm not sure about performance, but code wise there's a lot of better support.
They've added some great support to DirectShow with VMR filters, where you can use pixel shaders on videos, and improved the speed of playback + loading for both audio and video.
I'm writing a few tutorials for DirectShow + music atm, if anyone wants to play around with these new features.
I don't see how you could possibly make fast enough software pixel shaders anyway, so I doubt they spent hundreds of man hours trying to figure out a decent way of implementing them. Plus even if they did have software shaders, it's unlikely they'd be able to combine what your old hardware can do with software shaders anyway.
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
quote:Originally posted by Blitz
Something i've wondered, and been too lazy to really look into, is whether DX9 implements vertex and pixel shaders in software if they are not available on the hardware? Anyone know? It's difficult to program shaders if you can't afford a gfx card that can run them hehe.
CYer, Blitz
Vertex shaders can be emulated on the CPU quite efficiently (sometimes better than the GPU), although pixel shaders cannot. The reference rasteriser emulates *everything*, even pixel shaders - but will run slow as hell if you try. So you can forget pixel shaders until you get a card that supports them. Vertex shaders are still quite powerful and fun, and if you are running them in software you can use any version, including vs_*_sw (no limits!).
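Getting software vertex processing is just a flag at device creation time - something like this (a sketch from memory, assuming you've already got your IDirect3D9 pointer and window handle; check the exact names against the SDK docs):
[code]// Ask for a HAL device but run vertex processing on the CPU, so
// vertex shaders work even on cards with no hardware VS support.
D3DPRESENT_PARAMETERS pp;
ZeroMemory(&pp, sizeof(pp));
pp.Windowed = TRUE;
pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

IDirect3DDevice9 *dev = NULL;
d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                  D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                  &pp, &dev);
[/code]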
Oh, and learn HLSL too. Although it's still rather young, the compiler can produce some very nice code.
--redwyre
I've been playing around with Vertex Shaders a bit lately. Pretty damn kewl. I could probably quite easily rewrite my Worms In Tanks' heightfield to use vertex shaders instead of locking/unlocking vertex buffers, and even then that's really only using them in a simple way. Simple = taking control of transforming the vertices using the normal matrices; complex = taking a lot of the strain off the CPU and memory transfer by doing as much as possible in the VS.
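(The win for the heightfield being that the per-frame CPU work shrinks to uploading a few shader constants rather than locking the whole buffer - a hypothetical sketch, with the matrices and counts standing in for whatever your engine keeps around:)
[code]// Per frame: no Lock()/Unlock() on the vertex buffer any more -
// just hand the vertex shader the new combined matrix and draw.
D3DXMATRIX wvp = world * view * proj;
dev->SetVertexShaderConstantF(0, (const float *)&wvp, 4);
dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                          numVerts, 0, numTris);
[/code]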
Haven't really looked at HLSL, will probably look at it sometime soon though.
Ethan Watson, teh brand spankin' new Krome employee.
quote:Originally posted by GooberMan
I've been playing around with Vertex Shaders a bit lately. Pretty damn kewl. I could probably quite easily rewrite my Worms In Tanks' heightfield to use vertex shaders instead of locking/unlocking vertex buffers, and even then that's really only using them in a simple way. Simple = taking control of transforming the vertices using the normal matrices; complex = taking a lot of the strain off the CPU and memory transfer by doing as much as possible in the VS.
Yeah, but then you lose the ability to deform it, since you can't re-generate the normals.
--redwyre
I'm going to make a revision to my statement above: DirectX 9.0 sucks unless you have a GeForce FX or a Radeon 9500/9600/9700/9800 (ie pixel shader 2.0). That's speaking performance-wise as a gamer/consumer; as a programmer I've yet to really find out whether I think DirectX 9 is any good.
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
quote:Originally posted by Maitrek
I'm going to make a revision to my statement above: DirectX 9.0 sucks unless you have a GeForce FX or a Radeon 9500/9600/9700/9800 (ie pixel shader 2.0). That's speaking performance-wise as a gamer/consumer; as a programmer I've yet to really find out whether I think DirectX 9 is any good.
I totally disagree. I've only seen performance increases or no change (GF4MX). The API hasn't changed much, mainly just a few additions and refinements.
--redwyre
I lost about 2.5K 3DMarks in 3DMark2001 when I "upgraded" from 8.1 to 9.0 on my Athlon XP 2100+ with a GF4 Ti4600...now that's a performance loss on a benchmark, so who knows what it means gaming wise, I'll find out when I actually get to play some games someday soon :)
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
Okay, for the record, 3DMark2001 is fairly old... for starters 3DMark2002 doesn't exist, and 3DMark2003 only runs on DX9 - so you can't compare the loss of performance between DX9 and DX8.1 with 3DMark2003...
And also, yes, I realise they always make the latest 3DMark software very harsh in terms of marks and framerates, because GPUs and CPUs are getting faster at an extreme rate, so you'll probably see scores of like 9000-10000 by the end of the year anyway.
My opinion: only get DX9 if you have a card which supports pixel shader 2.0 or above. If you have pixel shader 1.x, DirectX8.1 will run faster; and if you don't have any pixel shading capabilities on your graphics card, buy a new video card if you need to play the latest games.
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
quote:Originally posted by Maitrek
And also, yes, I realise they always make the latest 3DMark software very harsh in terms of marks and framerates, because GPUs and CPUs are getting faster at an extreme rate, so you'll probably see scores of like 9000-10000 by the end of the year anyway.
Perhaps they should have called it 3DMark 2004? :P
Ethan Watson
Current job: Programmer, Krome Studios
Funny story
I had a clean install of my system with only DirectX 9 ever installed on it - no previous versions etc etc - and I got a result of 8500 or so 3DMarks in 3DMark2001.
Scrapped that due to crappy performance levels, then reinstalled the whole system again exactly the same, except instead of DX9 I installed DirectX8.1 - and I got 11500 again. All good.
Then I installed DirectX9 again over the top of DX8.1 because I wanted to run 3DMark2003, and now in 3DMark2001 I got scores of approximately 11500 once more with that setup.
Weird as...
My answer to this problem - NFI.
Snootchie bootchies!
Any off-topic issues send to maitrek@austarmetro.com.au
Hey Daemin,
Thanks for that, it's great to hear that there's more than just one game coder here in Adelaide - it's a pity that there aren't more developers here, hey. I have almost finished the DirectX download; I thought that I should get off my lazy ass and learn some DirectX, as it has been a while and I feel that my OpenGL skills aren't too bad at the moment. Wondering how your ???? (was it the Auran engine) engine work is going - it would be quite interesting to hear. Perhaps we could chat over ICQ some time.
Working Wisdoms
Zaph,
Since you're a working member of the community, care to share any wisdoms about the real world? [:D]
Always great to find out where people started, how they got where they are, what duties your current position involves....
"Yes I Code"
As found on AGDC name tag 2002
quote:Originally posted by Jacana
Zaph,
Since you're a working member of the community, care to share any wisdoms about the real world? [:D]
Actually I'm a Holidaying member of the community at the moment!
I'll do a proper reply to this tomorrow.
If anyone really needs to know what I've worked on in a hurry [:D] then wander over to my website: http://www.torps.com and click on 'My Games' to see the list.
The industry is rapidly changing. The only recommendation is learn as much about everything - hardware, optimization, 3D, AI, etc. as possible. There are a ton of brilliant people in the industry, but many old farts who haven't kept up with the times and are becoming less in demand.
Oh yes, if you do next-gen work, you have to know your maths and physics....
I know Zaph is on holidays after working 1.8 years on Grand Prix Challenge, but I'm also eager to hear how he got started, about his work, etc!.. (I'd also like to hear what he has to say about the 'old farts' remark too, hehe. I would've thought it'd be the other way around - having that much experience surely counts for a lot)
quote:Originally posted by Souri
I know Zaph is on holidays after working 1.8 years on Grand Prix Challenge, but I'm also eager to hear how he got started, about his work, etc!.. (I'd also like to hear what he has to say about the 'old farts' remark too, hehe. I would've thought it'd be the other way around - having that much experience surely counts for a lot)
I'm back :-) after a lazy 10-weeks of holidays :-)
What was the question again ?!
How I got started:
I wrote my first games back in high-school on the TI-99/4A and got some published in a book in 1982/83. I then went to Uni, got a real job, saved the world, and *then* discovered that you could actually get a job programming games!
In 1995 I started at Beam Software (aka Melbourne House, aka Infogrames Melbourne House) and have been there ever since.
I started as a senior programmer, then progressed to Lead Programmer, then to TDG (Technology Development Group) Lead, and now I'm a Producer (yes, I went over to the dark side!). For those that are interested, here is the amount of time spent programming in each position (I'm making the numbers up):
Programmer - 70-80%
Senior Programmer - 60-70%
Lead Programmer - 10-30%
Producer - 0%
quote:The industry is rapidly changing. The only recommendation is learn as much about everything - hardware, optimization, 3D, AI, etc. as possible. There are a ton of brilliant people in the industry, but many old farts who haven't kept up with the times and are becoming less in demand.
ok, I think I disagree with this.
If you want to be a lead programmer then this is probably true - you need a good grounding in every field. However we are becoming more and more specialised these days; you couldn't create Grand Prix Challenge with a team of general-knowledge programmers, you need specialists who know their area inside-out - and we are moving further in that direction as teams get larger and specialisation becomes even more important. The same is becoming true of artists too.
Old farts are always in demand... the technology behind games is important, but the hardest thing to find in this industry is experience in game-making - not the coders who know what a quaternion is and how to program the different units on a PS2 (although we need them too; any decent programmer can have a shot at learning that), but people with 5-10 years of experience making games, who know much more than just programming.
Having said that, old farts need to stay up with the times... many techniques have gone the way of the dodo as hardware changes and new features in games have new requirements (an obvious example is the 2d-3d switch over the last 7-8 years).
Zaph
quote:Originally posted by Daemin
Hrm, I would've figured being the lead programmer would result in about 30-50% coding? Also, what other qualifications (or experiences) do you need to become one of those "Lead" guys?
Nope - it's definitely nowhere near 50% for a true lead.
We're talking about leading a team of 10-20 programmers, so much of your time is taken up just organising those guys, helping them out, talking with management, etc.
A Lead Programmer needs many things - a few are:
- Great people skills: they need to be able to organise and rally a team to do the job. This is non-negotiable as far as I am concerned. I consider a Lead Programmer a leader above a programmer.
- Great programming skills (even if they don't get to use them as much)
- Great understanding of games and their game in particular, as well as all the things that go wrong in making a game.
- Knowledge of all areas of a game (but only enough to converse with management, spot problems, and to be able to ask the right questions of other programmers)
There's much, much more too - that's just a sample of the kinds of things we look for.
I agree! (yay for mods!)
Hurrah!