Powercolor or Sapphire?

Discussion in 'Discussions' started by iwanttodownloadfiles, Feb 1, 2013.

  1. Nicholas

    Nicholas Technology Director Staff Member

    Why are we still talking about graphics cards when there's an adorable kitten there trying to kill us?

    FWIW, a 6450 *should* be fine for CE? So should your 5850, Omni. (Famous last words.)
     
    OmniaNigrum and Kazeto like this.
  2. OmniaNigrum

    OmniaNigrum Member

    A 5850 is more than enough to handle Crysis on maximum settings at a reasonable framerate. Not to brag, but ten 6450s would still be less GPU horsepower than my old 5850. :)

    Still, I concur that the 6450 alone should be able to do CE with some reduction in settings. (Kill antialiasing. See how it works. If it's still too choppy, vsync is next to axe. Then try reducing the detail level and draw distance. But it most certainly can work.)
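    If you want that spelled out, the order I'd go in looks something like the sketch below. It's Python and the option names are invented, so map them onto whatever CE's menu actually calls them:

    # Rough sketch: the order to cut settings in, cheapest visual loss first.
    # Option names are hypothetical; CE's menu will label these differently.
    REDUCTION_ORDER = [
        ("antialiasing", "off"),     # biggest GPU saving, smallest visual hit
        ("vsync", "off"),            # waiting on the monitor can cost frames
        ("detail_level", "low"),
        ("draw_distance", "short"),
    ]

    def next_tweak(settings, fps, target_fps=30):
        """Suggest the next setting to axe while the framerate is below target."""
        if fps >= target_fps:
            return None
        for option, reduced in REDUCTION_ORDER:
            if settings.get(option) != reduced:
                return (option, reduced)
        return None  # nothing left to cut; time for new hardware

    settings = {"antialiasing": "4x", "vsync": "on",
                "detail_level": "high", "draw_distance": "far"}
    print(next_tweak(settings, fps=22))  # -> ('antialiasing', 'off')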
     
  3. iwanttodownloadfiles

    iwanttodownloadfiles Member

    Dude, I can't buy from Newegg, they only ship to the US, but thanks :)

    Machine seems to be stable, though I haven't tried stress testing it.
    I might buy a new VGA cable next, as after fiddling around with it I noticed that one of the pins is completely bent, though it seems to work fine.

    Does this mean that any video card *should* be fine for CE (excluding grandpa's video card found in the attic)?
     
    Kazeto and OmniaNigrum like this.
  4. OmniaNigrum

    OmniaNigrum Member

    They also ship to Canada, China and Puerto Rico. But yeah. I did not even think of that. Sorry. :(

    I think Nicholas means any video card with Shader Model 3 support and a gigabyte of RAM will likely work, with some working better than others. (Your card supports Shader Model 5, like mine.)
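    If you are not sure what a card supports, on Linux you can get a rough answer out of glxinfo; on Windows, dxdiag shows similar information. A sketch, with the caveat that the mapping is approximate (Shader Model 3-class hardware roughly lines up with OpenGL 2.1):

    import re
    import subprocess

    # Rough check for Shader Model 3-class hardware on Linux. SM3 roughly
    # corresponds to OpenGL 2.1 support, so treat this as an approximation.
    # Requires the glxinfo utility (the mesa-utils package on most distros).
    def opengl_version():
        out = subprocess.check_output(["glxinfo"]).decode("utf-8", "replace")
        match = re.search(r"OpenGL version string:\s*(\d+)\.(\d+)", out)
        if not match:
            raise RuntimeError("could not parse glxinfo output")
        return int(match.group(1)), int(match.group(2))

    major, minor = opengl_version()
    print("Shader Model 3-class GPU:", (major, minor) >= (2, 1))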
     
  5. Kazeto

    Kazeto Member

    I think it can be summed up with "if you could run stuff made in 2010, you'll be able to run it."
    Which might happen to be true, considering it is an indie game and thus the GLG guys aren't likely to just make the game prettier [to the point of requiring a GPU from the future to draw it] to compensate for a lack of coherent gameplay like some other companies do.

    That being said, I'm not afraid of it not running on my laptop. Sure, my integrated graphics card is an old one, but I managed to run TES4: Oblivion on it with a high-resolution texture pack, so I'll manage to become the ruler of a clockwork empire of my own with that very same graphics card (and if not, I'll use it as an excuse to buy a new laptop; I've been putting off an upgrade for the whole last year because I've been too lazy [read: sleep-deprived] to drive to a store and get a new one).
     
    OmniaNigrum likes this.
  6. iwanttodownloadfiles

    iwanttodownloadfiles Member

    Bah :dmg_necromatic:
    Linux + Fermi core => can't underclock. The best I can get is for it to run at low speed or full speed depending on load.
    nvidia-settings.png
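    You can at least poke at PowerMizer from a script, for what that is worth. A sketch, assuming the attribute names my driver's nvidia-settings exposes; they vary between releases, and on Fermi there is still no forcing the card below its lowest performance level:

    import subprocess

    # Query/assign nvidia-settings attributes from a script. Attribute
    # availability varies by driver version; on Fermi there is no way to
    # force the card below its lowest performance level.
    def query(attr, gpu=0):
        out = subprocess.check_output(
            ["nvidia-settings", "-q", "[gpu:%d]/%s" % (gpu, attr), "-t"])
        return out.decode().strip()

    def assign(attr, value, gpu=0):
        subprocess.check_call(
            ["nvidia-settings", "-a", "[gpu:%d]/%s=%s" % (gpu, attr, value)])

    print("current clocks:", query("GPUCurrentClockFreqs"))
    # PowerMizer modes: 0 = adaptive (the low-or-full-speed behaviour you
    # are seeing), 1 = prefer maximum performance. No "prefer minimum".
    assign("GPUPowerMizerMode", "0")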

    Tell them, Torvalds. (starts at about 1m14s)
     
    OmniaNigrum and Kazeto like this.
  7. OmniaNigrum

    OmniaNigrum Member

    When a hardware maker is flipped off and told those two words by Torvalds of all people, they are at Hitler level in my book.

    And it is so very true. AMD has not been so good to Linux either, but they do support it somewhat.

    On the Windows side, both AMD and Nvidia are well supported. The driver packages are massive, but they always seem to work and have everything you will need. Linux, though, is something they do not really want to support, since it requires them to be honest and disclose things they could otherwise keep mysterious and hope the competitors never pick up on. (Like optimizations, and where they cut corners with regard to rendering.)
     
    Kazeto likes this.
  8. iwanttodownloadfiles

    iwanttodownloadfiles Member

    Actually, Torvalds does like to talk trash; whenever there's a major GUI change in the OS he's using, he goes raging on the internet :D

    Yeah, AMD actually does support the open-source drivers, unlike Nvidia, but I've heard their closed-source ones are not that great.

    Thing is, I'm using the closed-source drivers, so it's just that they simply don't care about those that much. With the negligible market share and all that, I guess I shouldn't be surprised, but it caught me off-guard.
     
    Kazeto and OmniaNigrum like this.
  9. Haldurson

    Haldurson Member

    In any case, calling someone a Nazi for supporting one OS over another is kinda unreasonable. If they are doing it, it's because it doesn't make economic sense for them to act otherwise. You have to expect highly successful businesses to act like businesses and not like a public service, or you are bound to be disappointed. The fault is not in the business, but in your expectations of it.
     
    OmniaNigrum and Kazeto like this.
  10. Daynab

    Daynab Community Moderator Staff Member

    Yeah, we really could do without the random Godwins.
     
    OmniaNigrum likes this.
  11. OmniaNigrum

    OmniaNigrum Member

    Hey, call me a Godwin if you like; I speak my mind. Nvidia is not as admirable as they want people to think.

    The problem, as I already said, is that they want to keep their secrets. They will not make drivers that even work on half the GPUs out there, much less make them open source. I speak from experience. If they do not want to support open-source projects, that is very much their business. But even the closed-source precompiled drivers are full of holes and often defective.
     
    Kazeto likes this.
  12. iwanttodownloadfiles

    iwanttodownloadfiles Member

    That's not necessarily calling someone a Nazi; it's more like saying that you regard that individual as a very, very, very, very loathed individual :p

    Economic sense is pretty vague, man. Having better PR also makes economic sense.


    Sort of back to topic.
    My peak temps while gaming for a while were:
    52 °C for the CPU and 59 °C for the video card. Are those okay?
    Idle they are 44 and 34 respectively.
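    (If anyone wants to watch these without a GUI, something like this works on Linux. Just a sketch: the nvidia-smi flags need a reasonably recent driver, and the /sys thermal zone path differs between machines.)

    import subprocess

    # Quick-and-dirty temperature poll on Linux.
    def gpu_temp():
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"])
        return int(out.decode().strip())

    def cpu_temp(zone=0):
        # Values under /sys are in millidegrees Celsius.
        with open("/sys/class/thermal/thermal_zone%d/temp" % zone) as f:
            return int(f.read().strip()) / 1000.0

    print("CPU: %.0f C, GPU: %d C" % (cpu_temp(), gpu_temp()))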
     
    Kazeto likes this.
  13. Kazeto

    Kazeto Member

    Those are very okay. As long as your video card stays under 60~65 degrees when you slave it for all it's worth, it's fine.
    And you can raise that limit by about 5 degrees when using a laptop, by the way (because, unless reconfigured, laptop fans tend to only start working for real when the temperature reaches 60 or so degrees).

    Either way, while keeping them at those higher temperatures would not be optimal, nowadays video cards are capable of surviving 85~90 degrees without any damage, so with what you have there you are good to go.
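    If you want that as a rule instead of prose, my rough thresholds from above boil down to something like this sketch (personal guidelines, not manufacturer specs):

    # Kazeto's rule of thumb, written out. Thresholds are rough personal
    # guidelines from the post above, not manufacturer specifications.
    def judge_gpu_temp(celsius, laptop=False):
        comfy = 65 + (5 if laptop else 0)  # fine even under full load
        survivable = 90                    # modern cards take 85~90 unharmed
        if celsius <= comfy:
            return "fine"
        if celsius <= survivable:
            return "hot but survivable; not optimal to stay here"
        return "too hot, back off"

    print(judge_gpu_temp(59))        # -> fine
    print(judge_gpu_temp(68, True))  # laptop slack -> fine
    print(judge_gpu_temp(88))        # -> hot but survivable; ...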
     
    iwanttodownloadfiles likes this.