Mass Effect 2 PC Requirements Released


modeps
11-25-2009, 05:49 AM
http://evavhost.com/i/news/masseffect2.jpg

Over on Bioware's official forums (http://meforums.bioware.com/viewtopic.html?topic=710074&forum=144), they've confirmed what you'll need under the hood to play Mass Effect 2 on your PC in a few short months.

Digital Rights Management (DRM): The boxed/retail PC version of Mass Effect 2 will use only a basic disk check and will not require online authentication. This is the same method as Dragon Age: Origins. Digital versions will use the retailer's protection system.

PC MINIMUM System Requirements

OS = Windows XP SP3 / Windows Vista SP1 / Windows 7
Processor = 1.8GHz Intel Core 2 Duo or equivalent AMD CPU
Memory = 1 GB RAM for Windows XP / 2 GB RAM for Windows Vista and Windows 7
Hard Drive = 15 GB
DVD ROM = 1x Speed
Sound Card = DirectX 9.0c compatible
Direct X = DirectX 9.0c August 2008 (included)
Input = Keyboard / Mouse
Video Card = 256 MB (with Pixel Shader 3.0 support). Supported Chipsets: NVIDIA GeForce 6800 or greater; ATI Radeon X1600 Pro or greater. Please note that NVIDIA GeForce 7300, 8100, 8200, 8300, 8400, and 9300; ATI Radeon HD3200, and HD4350 are below minimum system requirements. Updates to your video and sound card drivers may be required. Intel and S3 video cards are not officially supported in Mass Effect 2.

PC RECOMMENDED System Requirements

Windows XP SP3 / Windows Vista SP1 / Windows 7
2.6+ GHz Intel Core 2 Duo or equivalent AMD CPU
2 GB RAM
ATI Radeon HD 2900 XT, NVIDIA GeForce 8800 GT, or better recommended
100% DirectX compatible sound card and drivers
DirectX August 2008
NOTES: For the best results, make sure you have the latest drivers for your video and audio cards. Laptop or mobile versions of the above supported video cards have not had extensive testing and may have driver or other performance issues. As such, they are not officially supported in Mass Effect 2. Intel and S3 video cards are not officially supported in Mass Effect 2.




Thanks VG247 (http://www.vg247.com/2009/11/25/mass-effect-2-pc-gets-system-specs-basic-drm-confirmed/).

I like how they are being open about what kind of DRM will be included. Good show as always, Bioware.

Demo_Boy
11-25-2009, 06:01 AM
I don't really feel that game requirements are news.

Nothing topical about them.

Paranoia
11-25-2009, 07:10 AM
Normal DVD checks? Fine by me.

pwnophobia
11-25-2009, 07:15 AM
I don't really feel that game requirements are news.

Nothing topical about them.

You are SO RIGHT! No one with a PC needs to know if their rig can handle the latest release of a game, right?

brandonjclark
11-25-2009, 08:56 AM
Man, it sure makes me happy to see specs like that! Those actually look quite modest, thanks to the age of Unreal Engine 3, I suppose.

Flatulus
11-25-2009, 11:13 AM
Glad to see that the 8800GT I purchased two years ago is still going strong.

I guess I'll wait maybe another year before upgrading my vid card to something like an ATI 5870.

alienchild
11-25-2009, 12:16 PM
You know, in a way they announced a Steam release with that press release... which is f'ing awesome since I just finished the first one (couple of weeks ago) and can't wait for the sequel!

Kweli
11-25-2009, 12:49 PM
Anyone willing to 'gift' me the first one on Steam? Willing to give 10 bucks (PayPal, I guess?)

Anenome
11-25-2009, 12:59 PM
Anyone willing to 'gift' me the first one on Steam? Willing to give 10 bucks (PayPal, I guess?)
Sure, but first you have to do the Truffle Shuffle!

http://www.youtube.com/watch?v=t5whaRkuipU

Kweli
11-25-2009, 01:10 PM
Damn, it will take me a few weeks to gain enough fat to make a dance like that funny.....

Syl
11-25-2009, 01:31 PM
Is it bad that I'm kind of waiting for the day that a game comes out where the specs are actually ABOVE mine, so I can have an urge to upgrade?

Jotoco
11-25-2009, 01:40 PM
Is it bad that I'm kind of waiting for the day that a game comes out where the specs are actually ABOVE mine, so I can have an urge to upgrade?

Yes it IS bad.

And if I had the money I would always keep my rig top of the line. I would buy one (maybe two) 5970s and a serious Core i7 to overclock the hell out of.

But for the time being, my 4870 can handle everything I throw at it.

Syl
11-25-2009, 04:22 PM
Yes it IS bad.

And if I had the money I would always keep my rig top of the line. I would buy one (maybe two) 5970s and a serious Core i7 to overclock the hell out of.

But for the time being, my 4870 can handle everything I throw at it.
Exactly, my 4870 can easily handle anything that currently exists, and what worries me is that it seems to be able to handle anything that will exist in the foreseeable future.

I guess another "Crysis" is unlikely, but the fact that the game existed is part of why I ended up upgrading to a 4870 in the first place; Crysis was a game that my 8800GT struggled with. The only games where my 4870 struggles are poorly coded ones, such as STALKER.

I really, really like some of the DX11 stuff available with the 5870s, but I honestly just don't see any reason to even think about getting one, as my 4870 is more than enough power at my 1680x1050 resolution.

Anenome
11-25-2009, 08:39 PM
Exactly, my 4870 can easily handle anything that currently exists, and what worries me is that it seems to be able to handle anything that will exist in the foreseeable future.

I guess another "Crysis" is unlikely, but the fact that the game existed is part of why I ended up upgrading to a 4870 in the first place; Crysis was a game that my 8800GT struggled with. The only games where my 4870 struggles are poorly coded ones, such as STALKER.

I really, really like some of the DX11 stuff available with the 5870s, but I honestly just don't see any reason to even think about getting one, as my 4870 is more than enough power at my 1680x1050 resolution.

- This is because the graphics card in general is getting to be 'good enough.' The industry and the product have matured. Progress is leveling out, and the industry will find new ways to compete, ultimately fleshing out other aspects and areas, just as the CPU industry no longer competes on megahertz or gigahertz, but on performance per watt and core counts. Similarly, car companies now compete in ways unimaginable in decades past.

Personally, I'm waiting for the next paradigm to emerge, if it ever does. We're in a polygon world right now, but increasing realism may become increasingly difficult using polygons. If we instead begin modeling everything with micro-spheres, using a voxel representation, things that are difficult with polygons become very easy with voxel simulation. For example: destructible environments? Simple.
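
Just to make that concrete (a deliberately toy Python sketch, nothing like what a real engine would ship): with a voxel grid, "blow a crater in the ground" is literally just clearing every cell inside a sphere.

import numpy as np

# Toy voxel world: 1 = solid ground, 0 = empty air.
world = np.ones((128, 128, 128), dtype=np.uint8)

def blast_crater(world, center, radius):
    """Destroy terrain by emptying every voxel within `radius` of `center`."""
    ci, cj, ck = center
    ii, jj, kk = np.indices(world.shape)
    dist_sq = (ii - ci) ** 2 + (jj - cj) ** 2 + (kk - ck) ** 2
    world[dist_sq <= radius ** 2] = 0  # carve out a spherical hole

blast_crater(world, center=(64, 64, 64), radius=10)

Doing the same thing to a polygon mesh means re-cutting and re-triangulating geometry; here it's a single mask assignment.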

Thus, PhysX integration may become important for the future. We might even look into simulating voxels -with- polygons: micro-polygons. Generally, the math is a lot harder; and that's just the beginning of the difficulties for voxels.

In any case, I have a hard time believing that we're going to be using polygons for all time, from now on.

brandonjclark
11-25-2009, 08:51 PM
Yeah, Anenome, that reminds me of an article (http://www.firingsquad.com/features/carmack/page13.asp) I read with John Carmack on FiringSquad.com (who posted that, was it you?), and it does seem that polygon rendering is an old technique pushed nearly to its limits. Whether it be calculated voxels or maybe a hologram-type representation model, I'm thinking it's time to bring out something new, something with more headroom.

I know a lot of people who say that "videogames today are sooo realistic they're almost 1-1 with reality", but all I see are aliased lines and poor anisotropic filtering. Videogames look NOTHING like reality to me. Now, they don't need to, but to say they're getting close is a joke.

Poly is dead!

Anenome
11-25-2009, 10:32 PM
Hey, no I didn't post that article, but it looks great, seems Carmack and I are on the same page :P What I love about reading Carmack articles/interviews is that he's not a cocky bastard, but he's clearly one of the best tech minds in the industry, and it shows in his interviews. He'll toss this in casually:
((speaking of NURBS and Bezier curves)) And you don't recognize all of these "in practice" problems, like with invalid normal but degenerate edges. How you can't do an arbitrary cut of a patch without raising it to like the square of the order of the side, you can't stitch two junctions by fixing the other things

As a tech junkie, I love reading that stuff. It makes sense. I'm sure some people's eyes glaze over.

I try to keep my finger on the pulse of technology and forecast trends. As an author planning a scifi series set in the near future, I find tech trends very important ;) I've been a fan of the concept of voxels for a very long time now. Basically, there's going to come a tipping point where our computers become powerful enough that voxels become practical, and a given advanced effect becomes easier to achieve with voxels than with polygons. But if it happens it will happen naturally; market forces will force it to happen. I haven't seen the first signs of any drastic changes yet, except for the growing feeling that graphics are now 'good enough.' They were good enough when the 360 and PS3 were released. Honestly, they were sooo good that both Sony and MS decided to go hi-res, a previously unthinkable decision. This entire generation wouldn't have had the slightest difficulty making the most gorgeous games ever produced if the PS3 and 360 had not been hi-res.

And this is another reason why this console generation is being extended. Graphical changes did actually drive the upgrade fever of the last several decades, with major graphical jumps occurring every 5 years like clockwork. But notice what happened with the Wii: with graphics being 'good enough,' what took control of the market was not graphical innovation but controller innovation. The same trend will affect the graphics market, but in a different fashion. We've solved the problem of rasterization. It's gorgeous, it's beautiful; we have achieved near-unlimited variation in polygons and textures. And now that GPUs are integrating with PhysX, you put those two together and you could build a rather powerful voxel engine in hardware. That could occur as a function of market differentiation.

The result would be perhaps a market war on who has the best hardware-based voxel engine, which would start the graphical arms race all over again. It's that arms race that the companies need to differentiate themselves--that consumers need in order to be moved to upgrade--that developers need to expand their conception of what a game could be and create new forms of entertainment--that technology itself needs for voxel-technology to become focused on and expanded theoretically and practically.

And you're right, today's graphics are still a far cry from perfectly realistic. All we really know is just how much closer it is now than what it used to be :P To achieve perfect realism you'd have to model real-world objects on a cellular level, on an atomic level. Only voxels can do that accurately, and even then only within a physical simulation (a la PhysX). You want realistic skin? Well, you don't have to worry about some silly texture-based sub-surface scattering effect when your voxel model is using 'real' skin. It's real down to the protein structures on an atomic level :P With voxels, there's no such thing as a texture; you can zoom in on any object to any level of magnification you want--all the way down to viewing individual simulated atoms flying around--and never lose clarity. Beat that, mip-mapping.

Like Carmack says, it's a technique whose time hasn't yet come. But, actually, this transition is the key breakthrough in the novel I'm working on. By making the voxel leap, the world is saved ;) It's set in 2018 or so, so I need to get working on it :P

Namielus
11-26-2009, 01:02 AM
I thought that voxels just had the extra dimensionality, so they wouldn't use standard texture mapping techniques. The resolution of the voxel-based model would then be the contributing factor to the image quality. I don't understand how that translates to 'world simulation' down to the atomic level?

Anenome
11-30-2009, 02:11 AM
I thought that voxels just had the extra dimensionality, so they wouldn't use standard texture mapping techniques. The resolution of the voxel-based model would then be the contributing factor to the image quality. I don't understand how that translates to 'world simulation' down to the atomic level?

The idea of a voxel is to capture the world using 3D pixels, or VOlumetric piXELs. Currently, it's the best way to capture finely resolved 3D data, so its biggest use right now is in medical imaging: CAT scans, MRI, etc.
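
In code terms (a simplified Python sketch, not tied to any real imaging library), a voxel volume is just the 3D version of a pixel image: a grid where every cell stores a sample.

import numpy as np

# A 2D image: each pixel holds one value (e.g. brightness).
image = np.zeros((512, 512), dtype=np.float32)

# A voxel volume: the same idea extended to three dimensions.
# In a CT-style scan, each cell would hold a density sample.
volume = np.zeros((256, 256, 256), dtype=np.float32)

# Reading "the density at slice 30, row 100, column 200":
sample = volume[30, 100, 200]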

You're right about what you're saying, about how it would translate to image quality.

Where I'm taking it in a new direction is to say that eventually you could get the size of a voxel so small that each one is able to function as, or take the place of, an atom in a simulation. Instead of cubes, make them spheres. Then, using a physics simulation, which is easily parallelizable, you can create a physical simulation tracked at the atomic level. What you have then is a simulation of the real world itself at the atomic level. Program each voxel to mimic the chemical properties of particular atoms and follow the rules of chemistry, etc., and there you go: a real-world simulation achieved with voxels.
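
Very roughly, this is the kind of thing I mean (a deliberately naive Python sketch; real molecular dynamics codes are far more involved, and the force law here is a made-up placeholder): each "atomic voxel" is just a little sphere with a species, position, velocity and mass, and you integrate forces over all of them.

import numpy as np

# Each "atomic voxel": a sphere with a species, position, velocity and mass.
n = 1000
species    = np.random.choice(["H", "O", "C"], size=n)
positions  = np.random.rand(n, 3) * 10.0   # toy simulation box
velocities = np.zeros((n, 3))
masses     = np.where(species == "H", 1.0,
             np.where(species == "O", 16.0, 12.0))

def step(positions, velocities, masses, dt=1e-3):
    # Placeholder force: pull everything weakly toward the box centre.
    # A real simulation would use chemistry-aware pair potentials here.
    forces = (5.0 - positions) * 0.01
    velocities += (forces / masses[:, None]) * dt
    positions  += velocities * dt
    return positions, velocities

for _ in range(100):
    positions, velocities = step(positions, velocities, masses)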

In a world quickly coming upon us where we will have computers with 100 million CPU cores on a single chip, we need to create problems that are parallelizable and demand that much power. Where we are now, we've nearly maxed out texture and polygon capabilities. Processing power is taking off, and challenging programs are becoming more difficult to find. In fact, what's really holding everything back is the bandwidth problem between CPU and memory, etc. If there's a breakthrough in bandwidth which completely eliminates the bottlenecks between processor and memory, as I'm quite certain there soon will be, that alone will represent a giant leap in performance.
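
To show why this kind of simulation is such a natural fit for lots of cores (another toy Python sketch; a real engine would do this on the GPU rather than with multiprocessing): each particle's update depends only on its own state, so the work splits into independent chunks with no coordination needed.

import numpy as np
from multiprocessing import Pool

positions  = np.random.rand(100_000, 3)
velocities = np.random.rand(100_000, 3)

def advance_chunk(chunk):
    pos, vel = chunk
    # Each particle only reads and writes its own row, so chunks
    # can run on separate cores without talking to each other.
    return pos + vel * 1e-3

if __name__ == "__main__":
    chunks = list(zip(np.array_split(positions, 8),
                      np.array_split(velocities, 8)))
    with Pool(8) as pool:
        new_positions = np.concatenate(pool.map(advance_chunk, chunks))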

And... well, if you want to read more of the implications of this idea, you'll have to wait for my novel to be released :P