MemoryLeak

Members
  • Content count

    298
  • Joined

  • Last visited

Recent Profile Visitors

1,487 profile views
  1. MemoryLeak

    SOLO: A Star Wars Story

    Avoid the Guardian. Peter Bradshaw predictably recites the entire plot. He liked it though.
  2. MemoryLeak

    Stone Roses

Sounds like a Seahorses cover band
  3. Yeah I don't think Layte was just being master race, he had a good point: you can't emulate a PS3 because the CPUs aren't powerful enough. In fact I'm pretty sure that no current PC chip is powerful enough to emulate a Cell either. Over the years we did some crazy stuff on SPU which I'm sure wouldn't run on any current CPU hardware. It's interesting because I don't think we're going to see the PS3 emulated for a long time yet.
  4. I play Destiny whilst drinking wine. This explains why I'm simultaneously skint and dreadful at Destiny.
  5. I don't think this makes any sense. Neither platform had a "native" resolution for rendering. Unless I'm misunderstanding you.
  6. Nah, sorry, the whole thing is nonsense. Literally every statement and assumption in that post is rubbish. Also, GPUs deal with reads and writes appropriately for the memory subsystem with caches and queues, ESRAM makes no difference here.
  7. Don't be silly. One plays on keyboard, the other plugs a joystick into the soundcard port.
  8. Sadly a lot of that article is wrong, I'm not sure the guy really understands what DX12 is, or even how game engines have worked for a while now. The stuff about cores speaking to cores, and lighting, is just completely wrong. The big change in DX12 is to how state is represented, i.e. how the CPU tells the GPU what to do. They've separated things out such that most of the state is static, and there is a sensible interface for changing only the minimum amount of data necessary. Previously, as state changed (e.g. changing parameters, textures, shaders etc. between different draw calls) a lot of data would be invalidated at once, and would need rebuilding. Now it doesn't, so it's much faster. They've also designed it in such a way that these command buffers can be built entirely on separate threads (they sort of could be on DX11, but the driver still had to do a lot of messing around which cost CPU time on the main thread), so that's nice too. The end result of this is a much lower CPU overhead for normal game engines. This in turn means you can do more draw calls, so you can split things up for better culling etc., which in turn could possibly improve GPU perf too. But the main win is CPU side.
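    To make the two ideas concrete, here's a toy Python model (nothing to do with the real D3D12 API; all the names are invented for illustration) of the pattern described above: state is validated once when an immutable pipeline-state bundle is created, and command lists are then recorded on worker threads with no shared mutable state, so only submission touches the main thread.

    ```python
    # Toy sketch of DX12-style command recording, NOT real D3D12.
    # Illustrates: (1) state baked into an immutable object, validated once;
    # (2) command lists recorded in parallel on worker threads.
    from concurrent.futures import ThreadPoolExecutor

    class PipelineState:
        """Immutable bundle of shader/blend/raster state.
        The validation cost is paid here, once, at creation time,
        instead of on every draw call as state changes."""
        def __init__(self, shader, blend_mode):
            self.shader = shader
            self.blend_mode = blend_mode

    def record_commands(pso, draws):
        """Record one command list for a slice of the scene.
        Touches no shared mutable state, so many of these can
        safely run on separate threads."""
        commands = [("set_pipeline", pso.shader)]
        commands += [("draw", d) for d in draws]
        return commands

    # Split 1000 draws across 4 worker threads, then submit in order.
    psos = [PipelineState(f"shader{i}", "opaque") for i in range(4)]
    batches = [list(range(i * 250, (i + 1) * 250)) for i in range(4)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        command_lists = list(pool.map(record_commands, psos, batches))

    # Main thread only submits; it never rebuilt or revalidated state.
    total_draws = sum(1 for cl in command_lists for op, _ in cl if op == "draw")
    print(total_draws)  # 1000
    ```

    The point of the model is where the work lands: validation happens once per pipeline object, recording happens off the main thread, and the main thread's per-frame cost shrinks to submission.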
  9. MemoryLeak

    Discuss the future of game graphics

    Has anyone looked at Shadertoy? It's an interesting WebGL-based site that allows people to write shaders for fun without needing a whole game engine to mess around with. The cool thing is some people are really pushing what you can do in just a single shader, with demoscene-style graphical showcases. A friend of mine did this: https://www.shadertoy.com/view/lsSXzD It's the opening area of Doom E1M1, in a shader. There are no polygons (other than the two triangles to form the screen), no textures, no file loading, no anything. It's just a shader, everything you see is completely procedural. Except it's Doom. Mental. Oh yeah, it's also got audio. Procedurally, from the shader. WTF.
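    For anyone wondering how a single shader can draw a whole scene with no geometry: a lot of these Shadertoy pieces use signed-distance-field raymarching. The scene is just a function returning the distance to the nearest surface, and each pixel marches a ray through it. A minimal Python sketch of the technique (Shadertoy itself uses GLSL; this is only the core idea):

    ```python
    # Sketch of SDF raymarching: the entire "scene" is a distance function,
    # and each ray steps forward by that distance until it lands on a surface.
    # No polygons, no textures -- pure procedure.
    import math

    def scene_sdf(x, y, z):
        # Distance to a unit sphere centred at the origin.
        # A real scene composes many such primitives with min()/max().
        return math.sqrt(x * x + y * y + z * z) - 1.0

    def raymarch(ox, oy, oz, dx, dy, dz, max_steps=64, eps=1e-4):
        t = 0.0
        for _ in range(max_steps):
            d = scene_sdf(ox + dx * t, oy + dy * t, oz + dz * t)
            if d < eps:
                return t      # hit: distance along the ray to the surface
            t += d            # safe step: we can't overshoot the nearest surface
        return None           # miss: ray escaped the scene

    # A ray starting at z = -3 aimed straight at the sphere hits at t = 2.
    hit = raymarch(0, 0, -3, 0, 0, 1)
    print(hit)
    ```

    In a fragment shader you'd run this per pixel, derive the ray direction from the pixel coordinate, and shade the hit point; the Doom demo above builds its whole level out of distance functions like this.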
  10. MemoryLeak

    Glastonbury 2014

    Lars said this was a greatest hits set. Puppets is safer than a bubble wrapped Volvo.
  11. MemoryLeak

    Red wine reduction for idiots (aka me)

    If you're trying to recreate a reduced red wine sauce which you've had in a restaurant, the chances are what you're actually after is more like a demi-glace. The richness of flavour and the slightly syrupy texture come not only from the red wine but also from beef or veal stock. Make a couple of litres of beef stock first, add a bottle of wine, then gently simmer down until you have a beautiful glossy reduction. I then store this in ice cream trays in the freezer. For a steak sauce, once your steak is cooked, deglaze the pan with a bit of wine to take the heat out, then chuck in a couple of cubes of demi and warm through. Then add a good knob of butter and whisk to incorporate. It's a lot of work to make the demi, but you can make quite a lot at once, and the results are amazing.
  12. MemoryLeak

    Why relatively weak CPUs for next gen?

    The "game" itself doesn't need a fast CPU, but yeah, DirectX and driver overhead kills most PC games. Hence Pulsemyne's point about Mantle.
  13. MemoryLeak

    Why relatively weak CPUs for next gen?

    Games don't actually need particularly fast CPUs, by far the bulk of the processing power is needed in the GPU, so that's where it's focussed on the new consoles. The CPU needs to be fast enough to feed the GPU with information, and since consoles don't have much overhead in the graphics library this is pretty easy.
  14. MemoryLeak

    PlayStation 4 Console Thread

    Um, almost definitely due to the CPU overhead of the graphics API and driver. The last game I ported to PC spent about 10% of the time in gameplay code, and 70% in the Nvidia driver. The PS4 and Xbox CPUs will be plenty fast enough for game code, it's really not an issue.
  15. MemoryLeak

    gamesTM issue 138

    I worked with Ed Clay for ten years and really miss him. Smartest person I've ever known. PGR was loved for its car handling and graphics, both of which Ed was responsible for since he wrote the vehicle physics and most of the graphics engine. A genuine genius, and terribly sad loss.