Progress vs Patience

Forum

Is everyone here happy with the speed at which technology is progressing? I read in forums about people wanting faster, better, more, but are we just advancing for advancement's sake? Are we giving software developers enough time to really harness the capabilities of systems before launching new ones? I think there's a lot of untapped potential and a lot of fundamental lessons in game design that simply aren't being learnt because people are always looking towards the next best thing. If I ever fall off the gaming wagon, it will be because I financially and intellectually won't be able to keep up with the new technology... and then I guess I will be my father.
I'm reasonably happy with getting four or five years out of my gaming system, but I have a feeling that this will soon turn into a new system every year or two. Who does that really benefit?

Submitted by palantir on Thu, 28/04/05 - 1:10 AM Permalink

Good point. I think software development needs to come a long way to catch up with the level of the hardware. Back in the early days of games, software had to be incredibly efficient to function under such tight technical limitations, and the result was beautifully designed software (so I believe, anyway). These days there is so much power in the hardware that highly efficient software design isn't considered necessary - and is too expensive - so instead we accept relatively poor software design in order to get a product to market ASAP.

So I guess I'm saying that I think modern software design leaves much room for improvement, and I think said improvement won't happen if we keep aiming for the biggest and best hardware. If hardware improvements hit a wall with what can be done (which seems inevitable at this point - there are physical limits), then I guess software will finally have a chance to catch up.

I really think we could be doing so much more just with better software development.

I feel that games systems should be on the market much longer before the next generation is developed.

Submitted by Daemin on Thu, 28/04/05 - 8:50 PM Permalink

I personally think that for tasks such as word processing, most business applications, and most database work, the desktop computing power we currently have is far more than sufficient. Therefore, to get the best out of the current technology, developers have to learn to use it more efficiently (read: using "bloated" libraries and languages such as Java, C#, and specifically .Net doesn't count).

However, I think the main area that needs improvement is the actual user interface to the computing resources that we have. We're stuck in an aging paradigm and I see no real way out. All of the recent interface advancements are really just incremental improvements. (Did you notice how the preview images of Longhorn look exactly like XP with a sidebar and a differently coloured skin?) The applications built on top of the OS don't really have better interfaces either. Applications that are meant to save time on certain tasks actually make them take longer than they should (friends have made comments about this with the software they have to use).

With that said, there will always be a market for faster processors and more processing power, especially in areas of high computation such as scientific physics simulations, and military systems. In these areas the software is pretty much completely optimised and therefore they need more hardware grunt.

We have to keep in mind, though, that the processing environment has to be tailored to the area it will be used in - which is why the space shuttle has bugger all computing power compared to modern systems, why a microwave doesn't cook your food by using its processor, etc. It kind of disappoints me that the major x86 chip manufacturers aren't making the slower chips anymore, since you don't need 2GHz for word processing.

(I would highly recommend the podcasts from www.itconversations.com, especially the one from Clayton Christensen - they're very interesting and relevant to these sorts of topics.)

Submitted by Maitrek on Sat, 30/04/05 - 1:27 AM Permalink

Advancing 'technology' and hardware usually gets so much focus because you can see the improvements straight away (a faster, better-designed processor always runs things better than the previous, slower one), whereas the fruits of refining software development techniques and the programming paradigm we operate under aren't immediately apparent.

quote: "especially in areas of high computation such as scientific physics simulations, and military systems. In these areas the software is pretty much completely optimised and therefore they need more hardware grunt."

I doubt that those systems are as highly optimised as you say - having spoken to a few people who work in the defence industry (for example), it seems writing non-buggy code is challenging enough, let alone getting the simulation accurate, never mind making it fast! As for scientific simulations (increasingly fluid and aero mechanics), those systems are all coded for reliable, repeatable results - they aren't 'optimised' for speed as such either... so they need all the hardware resources they can get.

Submitted by Leto on Mon, 02/05/05 - 7:54 PM Permalink

From my own experience, I've found that the more realistic you want your simulation to be, the more computing power you need. I would love to be able to run a proper weather simulator so I could model cloud formation accurately, but the sheer weight of computation needed makes that impossible. Instead, I'll have to come up with an approximation (not to be confused with optimisation) that looks convincing but is really nothing like the real thing. I guess it depends on the application in the end - do you want interactivity, or scientific accuracy?
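
Just to illustrate what I mean by an approximation, here's a throwaway sketch (not from any real project - the constants and function names are made up): a few octaves of value noise give you a patchy cloud-cover field in a handful of milliseconds, and there is no physics in it whatsoever - no advection, no condensation, no energy budget.

```python
# Toy "cloud cover" via fractal (fBm) value noise - a visual approximation
# only; nothing here resembles an actual weather simulation.
import math

def _hash_noise(x, y, seed=1337):
    """Deterministic pseudo-random value in [0, 1) for an integer lattice point."""
    n = x * 374761393 + y * 668265263 + seed * 2246822519
    n = (n ^ (n >> 13)) * 1274126177
    return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 0x100000000

def _smooth_noise(x, y):
    """Bilinearly interpolated lattice noise with smoothstep weighting."""
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    fx, fy = x - x0, y - y0
    fx, fy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    n00, n10 = _hash_noise(x0, y0), _hash_noise(x0 + 1, y0)
    n01, n11 = _hash_noise(x0, y0 + 1), _hash_noise(x0 + 1, y0 + 1)
    top = n00 + fx * (n10 - n00)
    bottom = n01 + fx * (n11 - n01)
    return top + fy * (bottom - top)

def cloud_density(x, y, octaves=5):
    """Sum octaves of noise and reshape into a rough cloud-cover value in [0, 1]."""
    total, amplitude, frequency, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amplitude * _smooth_noise(x * frequency, y * frequency)
        norm += amplitude
        amplitude *= 0.5
        frequency *= 2.0
    value = total / norm
    # Bias towards "clear sky with patchy cloud" rather than uniform grey.
    return max(0.0, value - 0.4) / 0.6

if __name__ == "__main__":
    # Sample a small patch of "sky" - cheap enough to do every frame.
    for j in range(8):
        print(" ".join(f"{cloud_density(i * 0.37, j * 0.37):.2f}" for i in range(8)))
```

A real weather model, by contrast, would be integrating fluid flow and moisture transport over an enormous 3D grid just to decide whether that patch of sky is cloudy at all - hence the impossible computational weight.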

More to the point though, the progress of technology might be a bit bewildering because of its speed, but I think that's half the reason I enjoy working in this field. It's exciting!

Submitted by mcdrewski on Tue, 03/05/05 - 1:53 AM Permalink

In engineering, the word "model" typically refers to any approximation/algorithm that gives a result representing the real system for specific observations. Just like you say, Leto - how good a "model" is comes down to what you want out of it.

A "cloth simulation" as used by animators will normally need to act in a similar way to real cloth on the human scale, but the same term could refer to a weaving model used to detect manufacturing flaws. (I worked on one!)

Your approximation is just that: a model which focusses on the visual appearance of clouds. Sure, you might get more accurate results using a better model, but if it gives you what you need, it's still a valid model.
