Friday, 30 May 2014

AMD vs Intel Benchmarks.. the last of my OCN blogs..



Originally posted (Christmas 2011) on my former OCN blog. Different time, different place, different thoughts.




......




After years of trying to like AMD's plucky 'do it on the cheap' attitude

I finally came to a conclusion today... I don't think they can cut it. I have my own long story of slowly coming to understand this, which I won't bore you with, nor will I do the cliché of starting a thread to whine about AMD vs Intel.




To put it simply, I like Intel now. Intel chips seem to do what I want, when I want, not be singing a wee song while doing the washing up. The little sparkler (candle) in my heart wants to defend AMD, but y'know, just because I can afford it on my budget doesn't make it good. I can admit that to myself now, seeing how an AMD six core was beaten by an Intel quad. Also, y'know..




3 is the magic number.. ohh yes it is.. it's the magic number!




Why didn't AMD bring out a range at 3GHz back in the day? Why? Because I think they couldn't.. they don't know how to count to 3 (jk). Their Hz rating must be off, or everyone in the whole I.T. community must be an Intel fanboy lolololol. Six cores, really?? You couldn't beat him with six cores? ..what do you need, an uzi*???




There are people in the Eastern Bloc who, given enough incentive, could beat AMD with two years, a tin can, and some copper wire.




Intel are like the extra stick of RAM in the slot. AMD won't ruin the warranty on your PSU.




Just a wee rant to get this off my chest, sorry. I feel better now.







If you like AMD I didn't mean to offend you, I was just sick of pretending. Not trying to be heavy.




peace




Dava




*@AMD was it because your brother had broken out of jail and your missus was pregnant, so you had a lot on your mind? W.O.E were you thinking?










(Updated spelling and grammar 05/Aug/12. Back 'into' AMD just now.. very embarrassing though, 8 cores still being beaten by 4 cores.. even their own.)




And again on 11/12/2012.










Got the new Vishera.. erm.. I'm in lurvvv! Wow.. some stupid thoughts I always had on the AMD vs Intel thing.. I had always thought AMD did capacity over Intel's speed. VBoxing is easier with an AMD proc. Not by much, but man, those cores really help with both stability and staying organised. This pixie magic, erm, I mean Vishera, will flatten an i7 if we work out how to clock 6GHz. In other words, kik az! I'm not going back to the 'yahyak nahhgg i hate amd/intel fanboiz' rubbish.. I like that people care a bit less about that these days.. it's nice to have peace.. but also some, and I stress the word, 'friendly' rivalry. :)







-----------------------------------------------------


First time I spoke to Rob .. :)


-----------------------------------------------------







WeRNothiNg 12/16/12 at 2:28pm Very accurate analogy. AMD is not a bad CPU, don't get me wrong, but it is like driving a Saturn. No one wants to, it's just all they can afford......




dava4444 12/16/12 at 10:39pm Thank you for your comment, I feel *FINALLY* with the Vishera AMD have come back on form. Phenom was a nice series for gaming, the first gen FX was no upgrade, and Intel are still ahead in many respects, but MAANN, I have a Vishera and let me tell you, this is quite a chip! and it's clipping at Intel's heels.




Capacity and price vs speed and operability. I VBoxed for a hobby for many years, not any more, but man, I would take AMD's stability over Intel's sluggish VBox performance, especially with these new 8 core chips.




Sorry, maybe that was OTT.. it's just cos AMD guys ain't gettin' respect back over the Vishera; most Intel guys prolly think it's like the old FX, I guess. But man, it's not.




peace




Dava




WeRNothiNg 12/17/12 at 3:51am What's the product number on this cpu you are speaking of? As they have many Vishera models. Unfortunately I am one of those who cannot afford a Core i7 3960X Extreme Edition, which is $999 plus shipping of course, but am going to have to upgrade soon. I am currently running a Core2Quad Q8300, 6 GB PC2 6400, Nvidia GeForce 8800 GTX OC, on an ASUS P5Q Pro Turbo motherboard. It has primarily been used for media, but I do play quite a bit of games as well. It will probably serve my movie and music purpose perfectly for some time now, but I believe by the next generation of games I will be forced to upgrade whether I have the money or not. I have always been an Intel guy, but I think if it came down to it, I would buy an AMD before I went without online gaming.




dava4444 12/17/12 at 4:34pm I have the FX8350, which is about $210 stateside. It's AMD's top of the range AM3+ processor. Also, the current gen GTXs are pretty amazing, the GT640 is like a GTX560ti for less than half the price. There are a few models, but one of those is very very similar to the 560ti, and the only differences are the 128-bit bus and DDR3 memory. For $100 bucks that's not bad for 384:32:16, that's 384 cores. What's your budget?




WeRNothiNg 12/17/12 at 4:57pm I can't seem to find an AM3+ motherboard with PCIe 3.0. That can't be right.




dava4444 12/17/12 at 5:11pm AMD doesn't have PCIe 3.0 yet, but it doesn't really impact you until you hit four cards anyway.. the ROG CHV (Crosshair V Formula) has better lanes for the AM3+ socket.. really thinking about getting one myself so I can 3-way SLI. It's the only AM3+ board that supports 3-way SLI.




WeRNothiNg 12/17/12 at 5:12pm Ok, this is something I put together rather quickly, but a base price rig with upgradable CPU options.

ASUS Sabertooth 990FX motherboard, ASUS GTX 560 Ti graphics, 16GB PC3-12800, and AMD FX-6100 3.3GHz 6 core ($104). This way I could upgrade to the Vishera when the price drops a bit, but I should be happy with this setup for quite some time, shouldn't I? I just don't want to buy an AM3 board and be stuck with no upgradability options other than video. The grand total for this would only be $550 plus shipping.




WeRNothiNg 12/17/12 at 5:18pm How do I fix the PMs? Says I am limited to 2 per day!




dava4444 12/17/12 at 7:55pm Yes WeR man, that sounds like good stuff! You should be able to game on that np. And yes, save for the upgrade, smart. And @panchoeltroll, yes I had the ASRock version of that board (I think), mATX, nice board.




WeRNothiNg 12/18/12 at 3:00pm Well, still won't let me send a PM. I do have Steam, but it won't let me add friends as I have not bought any games through them. All the retail games I own are through Origin. Do you have Origin?




dava4444 12/18/12 at 7:38pm yes, same username as before




Thursday, 29 May 2014

How to Stack a Deck of Cards.. or why-oh-why can't a Titan-Z fall out of the sky for me?

Hi Guys

I'm glad to see you guys becoming regulars, your support is appreciated.

What is a GPU core?

A GPU core is made up of shader cores (USC), texture mapping cores (TMC) and rendering cores. The one to *really* pay attention to is the shader cores.. let's take, for instance, a fav of myself and Rob's.. an Nvidia GTX 660Ti.. it has 1344 shader cores.. it also has 112 texture mapping cores and 24 rendering cores.. depending what you do with these parts, they can go from being essential to being window dressing, except the shaders; they always play a big part in what you are doing.

For example, if you were playing a 2D side scrolling game that has many many effects, realistic fire, realistic waves/water, the texture mapping cores wouldn't be doing much, but I *think* the shaders and rendering cores would. But I find this all hard to translate to you guys as I don't know the inner workings of SMXs that well.
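If you like numbers, here's a rough back-of-an-envelope sketch in Python (a sketch only: the ~915 MHz reference core clock for the 660Ti is my assumption) showing how those unit counts turn into the 'fill rate' figures you see on spec sheets:

# Rough fill-rate sums for a reference GTX 660Ti (clock is assumed, not gospel)
CORE_CLOCK_MHZ = 915

shader_cores = 1344    # do the shading/effects maths
texture_units = 112    # sample and filter textures
render_units = 24      # write the finished pixels out

texture_fill = texture_units * CORE_CLOCK_MHZ / 1000   # GTexels per second
pixel_fill = render_units * CORE_CLOCK_MHZ / 1000      # GPixels per second

print(f"Texture fill: ~{texture_fill:.0f} GTexel/s")    # ~102
print(f"Pixel fill:   ~{pixel_fill:.0f} GPixel/s")      # ~22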

What is a 'good' graphics card?

Well, as with most things computer, that depends on what you will use it for. There is no use buying a Titan-Z just to play Flappy Bird and Pokémon on emulators. Any card *just* below mid-range can do those things comfortably, and sometimes passively cooled, so silently.

But you want to game, right?? With cutting edge visuals and cutting edge engines! .. so yes, that takes a lot of Mister Eds under the bonnet!

You have many options, as in ~ if you are reading this you already know,
but.. 'biggest and best, or divide and conquer'?

Rob's two 660Tis will give a 780Ti a run for its money, they are (basically) equal, and Rob has got to enjoy that for the last year or so.. ahead of when the 780Ti came out. He got extra time enjoying his system, and he got to choose how he would like to fund his gaming, because he bought one then later bought another.

Downsides are that this is a noisier way to game unless you go water.
If you use an *excellent* headset, primo noise cancelling etc., then this problem goes away. SLI (and CrossFire) has been known in the past to give people problems.. but personally, after many years of SLI'ing, I have never run into a problem that wasn't caused by me.

Going SOLO
1 card to rule them all, etc.. This is probably the most expensive way to do things, instead of putting together a few high mid-range cards, but what you get is something simple and elegant in both its power and its beauty.

The prestige factor of even owning a top of the range card from either AMD or Nvidia is HUGE, and others will come round your house to see it like you own a Ferrari.

..and yes it can totally game. ..but that brings me to the CPU aspect of this adventure.

It's no use buying a high end graphics card if your CPU stops the GPU at the front door every time it just wanted to be nice and go and make you a coffee, the CPU standing there thinking 'what is a coffee?'.
Sorry.. I mean, the data needs to be able to 'flow', and if it can't, people call that a bottleneck.. imagine a river; the river is data. If there is a narrow point along the river the water finds it harder to pass, and so the water slows.

With single cards and many CPUs this is no problem; as long as you have 4 cores, you *should* be good to go.
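A toy way to picture the bottleneck idea (Python, with made-up numbers, not measurements): the frame rate you actually see is capped by whichever side finishes its part of the frame slowest.

# Toy bottleneck model - the numbers here are invented for illustration only
def effective_fps(cpu_fps_limit, gpu_fps_limit):
    # the slower of the two sets the pace for the whole frame
    return min(cpu_fps_limit, gpu_fps_limit)

print(effective_fps(cpu_fps_limit=60, gpu_fps_limit=140))    # 60 -> CPU-bound
print(effective_fps(cpu_fps_limit=200, gpu_fps_limit=90))    # 90 -> GPU-bound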

RAM and textures
Simply put.. the more video RAM you have, the larger the texture = the quicker the load. The bit-bus plays a part in this too, and so does AA.
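To see why textures eat VRAM so fast, here's a wee sketch (Python; uncompressed RGBA is assumed, and real games compress and add mipmaps, so treat these as ballpark figures):

# Rough VRAM cost of an uncompressed RGBA texture: width * height * 4 bytes
def texture_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, size):.0f} MiB")
# 1024x1024 -> 4 MiB, 2048x2048 -> 16 MiB, 4096x4096 -> 64 MiB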


To wrap up

IF I could.. I would watercool 4 GTX 670s (4GB) or 680s (4GB) and stomp all over a Titan for less than half the cost.. my 3 660Tis will give a Titan Black a run for its money; I lose out on high textures with 2GB of VRAM, but I have more shader cores.

I have 21 Kepler SMXs; the Titan Black has 15, the Titan-Z has 30.
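If you want the shader-core totals behind those SMX counts, a quick sketch (Python, assuming Kepler's 192 shader cores per SMX):

# Shader cores = SMX count * 192 on Kepler
CORES_PER_SMX = 192

setups = {
    "3x GTX 660Ti (21 SMX)": 21,
    "Titan Black  (15 SMX)": 15,
    "Titan-Z      (30 SMX)": 30,
}
for name, smx in setups.items():
    print(f"{name}: {smx * CORES_PER_SMX} shader cores")
# 4032 vs 2880 vs 5760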

The only downsides are the extra noise and set up time.. but if I have a card failure, I can continue with the remaining cards.

Sorry if any of that was boastful, I'm just so pleased with the 660Tis, and I think many would find a great advantage in both cost and usability, so I can't help gushing to tell you guys.

Hope some of this made sense..

Dava :)

Edit:

The 660Ti has a 3 card limit for SLI. The 670/680 is 4. The 690 has a 2 card limit, because it already IS two 680s combined on one card.





Monday, 26 May 2014

I'd rather have a bigger byte than a 'bit'..or ISP lingo and bingo

Hi guys

A few years ago now, I blogged on the topic of ISP terminology. I thought I would re-hash this for anyone who was interested. Hash browns, bigger bytes, the queue at McDonald's, the queue in servers for data loads from the ISP.. mmm.

When you buy a lovely Big Mac, one of the nice fresh ones, in a place you feel confident has had no tampering, upmarket, the place might be ten years old, but as you sit it is so clean you feel like you are the first and only one ever to sit in that booth.

You were told in a TV advert: Big Macs for sale, '300 Guineas'. You supposed it was some kind of dismissible joke; no one uses 'Guineas' nowadays, they were only used in the American colonies and in the UK in the 1700s. You go along with your £/$3 thinking 'what a great deal!'. The restaurant is packed with slightly annoyed, but happy nevertheless, customers. You approach the counter, ask for a Big Mac and ask about the deal. The assistant says 'yes sir, that'll be 300 guineas'. You reply 'how much is that in pounds (or dollars)?' He replies 'oh, that's 5 pounds'. You say 'erm, that's not a great deal, it's the same as your regular rates.. >:( ..:(' and he replies 'well, since you are here and hungry, do you still want it?' Sadly you are hungry and reply ':( ... :) okay'. .... This is just my imagination, but done to illustrate a point.

ISPs have an old way of counting speeds, used as a marketing tool to increase shopping confidence. They count in megabits per second, kilobits per second.. not *bytes*, which is the more common counting method for everyone else. To explain, there are 8 bits in 1 byte, so a 20 megabit per second connection is actually a 2.5 megabyte per second connection.. but there are many more factors that determine your true speed.
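If you want to do the sums on your own package, a tiny sketch (Python) of that divide-by-8 conversion:

# Advertised speed is in megaBITS per second; download windows show megaBYTES
def mbps_to_megabytes(megabits_per_second):
    return megabits_per_second / 8   # 8 bits = 1 byte

for advertised in (8, 20, 50, 100):
    print(f"{advertised} Mb/s advertised -> {mbps_to_megabytes(advertised):.1f} MB/s at best")
# 20 Mb/s -> 2.5 MB/s, as above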

1. How far away you are from the ISP server, and whether they use traffic shaping, something I feel should be tolerated but monitored by the government, so that ISPs are not just cutting us for little to no reason. It has to be justified, and if not, why not? ..Why not increase infrastructure to meet the demand you said you could provide to the people you have now locked to a contract?? That costs money, and if you were willing to enter a contract you should be providing the agreed service, or have a just reason why you occasionally (1 hour a day) limit their internet.

2. Loads on servers, much like traffic shaping but not controlled by the ISP, like the oft-told tale of the World Cup, when everyone* went to put the kettle on for a cup of tea and there was a blackout, too much demand all at the one time.

3. Home networking ability. Simply, your ethernet card/wifi or its drivers are not good enough to soak up all that data quickly enough.. the hard drive/SSD plays a big part in this too; wherever the data is being stored has to be able to write the data quickly enough from the download.. a slow drive can = a slow download.

4. Finally, the ISPs themselves and their server policies. Some, but not all, reserve bandwidth in case things get 'choppy/hectic'.. this bandwidth pretty much just sits there doing nothing, and unless Elvis comes back and Yahoo News or a Facebook viral video are first to report it, I guess it's only used to bounce YouTube around a little to bolster your bandwidth.


Buuuut.. it's this thing where.. they shouldn't be using an old measurement to impress people.

Story
I recently switched to Sky after years and years with Virgin Media.. they did a *very* bad thing to me 2 years ago, and I think they must have some of the worst staff policies of any company out there. In 2011, they sold me a V+ box. I had talked it out with a VM salesman over the phone.. he had called me, it was £100 over three months (two and a half). I said yeah, sounds good.. but that's not what happened, it was £200 over 2 months, and I had my internet cut off at a bad time in my life. I had broken up with my ex and was quite lonely, and I was relying on the internet to keep in touch with friends, as I don't have a car and they live far enough away that it's not cool to ask them to come for me. I phoned VM up, and being a long standing customer I thought: 'I have put loads of money in their bank over the years, surely they will give me an extension until I get paid two weeks later'. They didn't. And I was alone at an already very isolated point in my life... I decided not to pay next time either, I was furious at them after all the loyalty I had shown them, for them to not even care when I needed them.. I rung them up and said I wanted to clear my bill and leave.. they told me that by getting the V+ box I had renewed my contract and could not leave for another 2 years.. I had already told them I did not want it and wanted to return the box.. which I did 2 weeks later.

I never forgot that.. I wanted out of the painful reminder of what they had helped do to me at a bad time in my life..

So in October.. I phoned them up and said my contract is up and I want to leave. They danced around this and it took 3 separate phone calls to end my contract with them.. at one point a customer manager came on and tried her hardest to talk me out of it.. I even told her the story above and she didn't want to let me go.. what a B*!! Trying to convince me by saying 'standard policy'.. hen, you were doomed from before you lifted the phone, but c'mon, not even trying here, except when it came to scare tactics.. oh she was good at that.. fine this! and fine that! But I held my ground. Heartless B*.
I gave her 'laldy'; she heard exactly what I thought of VM for doing this to a 6 year (in 2011) customer.


When all was said and done I spoke to another gent, a nice guy who wrapped it all up for me/us. He even sent a £13 refund for the end of a bill cycle.. I also felt he felt sorry about the way I had been treated, but that might have been my imagination.

I had good experiences with VM too, like when I wrote them some feedback in 2009 about a separate thing I don't remember. I mentioned I had got one of their London call centres and the Londoner had been abusive to me.. they phoned me up and asked how I thought this could be avoided, and I asked for Scottish call centres.. I think they did this, as a year later while calling them, I got through to a, wow.. Scottish voice. And the Londoners' 'I can't understand you.. speak English! do.you.know.english.?' seemed to fade away.

Wow, this is all just pouring out.. should I have just continued with the article?.. I mean, you get it, right?


Dava





*(this was supposedly in London in the 60's or 70's IIRC)


Friday, 23 May 2014

I am not a number!! .. or Apple discovers a new kind of isolation.

Hi guys

Just wanted to share a few thoughts on the new iCloud locking feature..

Firstly, yes I recently bought one of these iCloud locked iPhones on eBay, and yes I unlocked it after a week of trying..but I had many thoughts during that week and since.

In case you didn't know, with the release of iOS 7, Apple now tries to 'police' their phones, with the assumption that whoever has the phone in their possession should *only* be the first person that activated the phone with their iCloud account. ..Is it just me, or have you also instantly seen the huge fallibility in this idea??

In case you didn't, let me share what I instantly realized: people who activated the phone could sell the phone legally, then illegally claim on the insurance (home or phone or work). Not to mention the old caveat that many users would not *really* understand what a device is, never mind know to remove it from *some* account 'Apple made them get' to use their phone.

For these reasons, and not only these, the morality of iCloud locking is *VERY* fallible.

Now to ulterior motives by Apple themselves: by doing this they push the price of (non-locked) second hand units very high and force people to consider buying a new iPhone... however, they may have failed to see how high even some of the cheapest Android phone specs are. I can buy a Chinese *high quality* phone for around £130/£170.. top specs, higher than even the iPhones in many regards, aside from the fingerprint reader, and have a great experience.

Have they cut the stalk to spite the goose? .. they could disappear from commercial view.. as the grassroots of ALL the traffic of commerce comes from resale of a brand product, as an introduction to a first time user.

So first time users, according to this logic, are also criminals if they unknowingly buy an iCloud locked iPhone/iDevice. I am not a first timer, as I have owned iPhones before, and it didn't occur to me that a phone might be stolen.. but am I a criminal?? Noooo.

There are many scenarios I could use to defend myself, but the two above are the most likely, as by the law of averages, they have already happened. I find it hard to believe I should lack such faith in high reputation sellers on eBay..so.. WHO is APPLE to make me question the eBay community??

Apple are not very smart in this ham-fisted approach to security. Once a phone was attempted to be used, they should have asked for the Apple ID of whoever presently has the phone and talked out with them what happened.. but no, they do this clumsy, blind, everyone-is-locked approach to security.

They use the IMEI number as a locking tool.. this number is now registered to a person, pretty much forever: if they die, if they forget they have an account (since they only used it once to sign up), or if this was a work phone in a company that went bust and the boss needed the money (with the phone reg'd to a former employee), again not really understanding what iCloud was and not wanting to contact the former employee.



In the UK, I can drop this phone off at the police station and if no one claims it or the police cannot find the original owner, I can 'legally' own the phone I recently purchased on eBay within 3 months.
What does Apple do then? It is no longer the property of the former owner, and the IMEI becomes mine, with all related features also becoming mine... including iCloud activation.

I am not a number.. I resent being locked to an IMEI.. I don't think of it as an intellectual security device, as a way to catch criminals.. when added to iCloud, it's a marketing tool.. 'your phone will never be lost or stolen, or we will disable it' .. they track you, they collect information on your whereabouts, they resent re-sale of iDevices.

if not then why?...

..They didn't contact me, but if they had, and been an intermediary between myself and the first owner, this all would have been easy: a few emails back and forth, so I know the first owner is genuine and wants the phone back, and so I get the price I paid on eBay back.. call it a 'finder's fee'. Then this would have been amicable and not embarrassing. If they have replaced the phone and see I am a genuine person, then they may be willing to release the IMEI to me, which Apple could walk them through if need be.

A sense of fairness in the second hand market is what Apple is lacking.. I hope they see sense in this.

My debit card has stopped working in iTunes/Apple.. coincidence?

I am not a criminal, I am not a number.

peace

Dava


Thursday, 22 May 2014

The Future of PC Gaming ..

(First posted on my blog on OCN , 30th May 2013)

Hi!

Do you ever wonder how much power the community has within big corporations? I am glad the old attitude of those grey, staid buildings, the attitude of 'we hate you, go away, release the hounds!' from the 70's has disappeared, but now in the twenty-tens, it seems like a place of confusion for big business.

WHY?
because who do they listen to? someone who is paid to tell them which is the most profitable way to go, or people on the internet, dedicated loyal fans..

I am really thinking about MS when speaking about this, because they have come on LEAPS and bounds with the community since the *shock* they got with Vista.. it actually did them a lot of good, coz after that they had to come out of their grey office and see other people in the sunlight. ..Life beyond the screen and numbers.

I mentioned they became friendlier.. it was only here and there.. but even extending to the FOSS community.. to which they are still hostile, as per the Novell patents, but there was improvement. They are, I hear, working on some cross compatibility with Windows and improving the Windows kernel by learning from Linux*. Over time they have been friendlier but have also withdrawn things: take Moonlight (Mono), an MS/Novell Silverlight for Linux; they worked on that for 2 years but have since ceased.

What I have written has made even me feel 'grey', but this was just a salad*.. now for the main course!

The Xbox One was announced and shown a week and a half ago. I had read and seen many things about this console while waiting for that aforementioned day. Microsoft plans to have this as their mainstay and have no hardware refresh for ten years. That means.. WE, the PC gaming community, will be using DX11.. for the next ten years. I heard they also plan 'lock ins', which is a system where if you buy something it is yours and no one else's, ever. No resale value on any media, zero. Nothing new with Steam, and also I think the cloud is an amazing way to go, ..but they mean discs too. So sad.

Ask yourself this: how many DX9 based games do you own?

DirectX 9 first released: December 19, 2002. Last release: April 21, 2008.
Consoles didn't use DX10, so we didn't for the most part either. DX11 will see its full potential, which I am glad of, but think on this.. how much barrel scraping did we do to DX9?? DX9 died in 2008.. and all devs could do was dig it up and shake the bones at us! ..for 4 years.

What if this happens with DX11 in 3 or 4 years' time? Microsoft like repeating ideas that worked, and sometimes that's pretty smart, but the nature of what they perceive to be the 'community' has changed and I do not think they know it. WHY? Coz they waited faaar too long to release the new Xbox.. the barrel scraping was done by 2010/11; in THAT year they should have released a new console.. not 2 years later, in which those two years.. people everywhere noticed that the dev in town was Skeletor!

I actually find the graphics of DX9 are quite macabre now.

Steam Linux and OpenGL.

Unless devs and publishers have decent distribution platforms, they won't move, and rightly so; they don't want anyone to lose their jobs, which is even a bit noble. I agree. But WE are now as locked in as console gamers.. locked in to DX11.. we cannot progress either in hardware (what's the point??) or software, past DX11. Unless devs give us a second option.. OpenGL and Steam and Linux.

If not, then: Windows it is! Again! For ten years! No way out! Metro 2033 claustrophobia, get me out of here.. I'm a doctor, Jim, not a chairlift.

peace

Dava





*While this doesn't seem like a win for FOSS, we will benefit inside Linux with interoperability: accessing Windows from Linux and vice versa. Hey, maybe they will even throw in a read/write EXT filesystem extension/driver for Windows.
*You needed to know a bit of background for this.

UPDATE: 20/06/13. For the sake of accurate information.. ie. the truth. Today MS announced that they are NO LONGER doing the 'lock in' disc system, and I would like to praise them for taking it even further and getting rid of the anachronistic 'regional restrictions'.. a leftover from the 1980's and 1990's Japanese games that is no longer needed. What does that mean for this article.. who knows?

A Layman's Guide To What Overclocking Is


(first posted on my former Blog on OCN)

Posted 15/4/12

Hi Guys..

Sometimes it's embarrassing to ask questions about things that you think everyone else knows..
So a wee guide of things I had to learn and kinda wish I'd known.. but y'know, it's all part of the journey!

*Nay fancie pants boastin' here! terms I would *actually* use in my head!

Okay, so, your CPU has a 'wall', a top boundary of how hard you can push it, based on your cooling and motherboard and how well you adjust your settings.. complicated already?? No sweat, we'll take it slow..

The idea is *not* to cobble parts together, but to have an idea what the comp will be used for from the start.. and so a [Purpose]. I imagine most on here built their rigs as gaming computers, as gaming rigs are a kind of 'do it all' affair; they can do everything from media encoding and streaming to image manipulation to audio production, although the RAM for audio production might change.

Air is the lowest form of cooling, but even air can be more effective than water cooling, if the water cooling is done badly and the air cooling is done well; but air is super noisy regardless. Note my rig.. I was trained as a muso, I love sound, this is why my rig is the way it is. Silent.

But we're starting with the basics, say you've just built your first rig, and want to know what all this 'Overclocking' fuss is all about..

So you made a rig for around £$300 or so, cheapo PSU hoping it won't blow up*, cheapo Biostar £$20 motherboard off eBay or such, AMD dual core, RAM and bits and bobs to fill the MIDI case (how do I know this?.. I did it too!).. but you want a faster computer for free! Free is good! And lots of guys on here know how to do that!

Well, whether you can overclock at all depends on your motherboard's ability to adjust the Front Side Bus (FSB), which is the speed at which the motherboard 'talks' to the CPU. If your motherboard doesn't have the option to adjust this setting, then you cannot do it at all!*

There are lots of scary settings in your BIOS* that can harm your computer and make you lose the work you have already done, but getting into overclocking, also known as 'OC'ing', is the kind of thing where, while there will be losses, you cut your losses and try again. If not, then think hard about what you value more.. data and work, or growth and personal experience.

Firstly, hit whatever key takes you into the BIOS and just have a look around. You don't need to do anything except look and absorb the information that you're seeing. The terms you are seeing are very ''coke and pepsi'', meaning they vary from one manufacturer to the next. One company calls something this, and another calls it that, but they are all the same thing. Don't worry though, when you ask around using those terms, people on here will know what you mean.

The ''FSB'' of the motherboard is like the speed of a motorway. Imagine you have a transit system, and on the transit system there are buses; the buses go around picking people up and dropping them off where they need to be. The people are data, and the Front Side Bus is the top speed of the motorway. If you tell the motherboard the speed can go up, then it will send the buses faster around the system, and so the people/data will get there faster, but too fast and all the buses will crash and all the people/data will be lost.

So.. we don't get silly with the FSB. The guys on here have prolly been OC'ing for years, they know their stuff, but.. they didn't start out that way from day one, and you shouldn't feel intimidated if you don't know something. Remember, they did the silly things too, made all the mistakes you're about to make.. but hopefully this article will speed up your understanding of what's happening inside your computer and your understanding of overclocking.


The BIOS has more than one place where it can tell the buses to speed up on the transit system; it can tell the buses to speed up at certain places. Think of a posh village called 'NorthBridge': if you tell the buses to move fast there because the people/data are needed quickly, then this is what will happen.

The RAM is like the bus depot where people/data waits until needed.

The CPU is like a factory where all the people work, and the CPU multiplier is like a work-shift in the factory: you can set how many hours they all work.
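To put the FSB and multiplier together, a wee sketch (Python; the numbers are just examples, not settings for any particular board): the clock speed quoted for your CPU is roughly the FSB times the multiplier, which is why nudging the FSB up nudges the whole chip up.

# CPU clock is (roughly) FSB x multiplier - example numbers only
def cpu_clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(cpu_clock_mhz(200, 16))    # 3200 MHz - a stock-ish setting
print(cpu_clock_mhz(220, 16))    # 3520 MHz - a modest FSB bump
print(cpu_clock_mhz(250, 16))    # 4000 MHz - needs good cooling and a bit of luck

Remember the FSB also drags the RAM and northbridge speeds up with it, which is why small steps, with testing in between, are the way to go.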

Lastly for this: a thing that chokes many a PC unnoticed is 'drive bandwidth'. The more hard drives you have, the more your bandwidth can ease up and deliver data quicker and in a more orderly way.



A few *RULES OF THUMB* for PC building in general..

1. If something goes wrong, it's *usually* the RAM.

2. The more you pay for your parts, the more reliable the computer *usually* is as a whole. Reliability is essential for overclocking!

3. When Windows fails, and it will*, make sure you have your docs and pics and music backed up on another drive. Drivers wouldn't hurt either.

4. Linux is HARD. It's like a whole separate hobby. Not always, but sometimes, and often. Ubuntu is your best bet for a Mac-like experience on PC.. go Intel and you should be fine. Like Hackintoshing without the Hacking or the Toshing.

5. Finally, PC gaming is an experience, but more than that: because you picked the parts yourself, waited for them to come, did your build, installed a huge variety of things, now you get what other people were talking about.. performance. Your computer, that YOU built, is prolly faster than the computer of anyone you know, because you built it. Well done!



peace

Dava


*am Scottish n'sometimes it jist comes oot.

*NEVER cheap out on a PSU!

*I know about 'software overclocking', but since canard took SetFSB off the radar, I think we're stuck. Just IMO.

*not used an EFI yet personally.

*got a lot better over the years to be fair, but y'know.. it will happen :/

Realworld Performance and What It Means to You...

(First posted on my former Blog on OCN. December 30th, 2012)



Posted 12/30/12


For years many enthusiasts have debated hardware, I am one of them, you probably are one too, seeing you are here, on OCN..

But in these times the gaps between competitors become shorter and shorter, I feel, and we get much more 'bang for buck' than we ever could have imagined ten years ago..

it's only when a company with a large user base makes a mistake that the community really takes notice these days..

I am going to do some barrel scraping now, so cover your ears if you have sensitive hearing!

AMD vs Intel

Many many moons in the past AMD trumped Intel, for one shining moment for many. And then it was gone, not to be repeated. But now with the new line from AMD, which I have myself, why should I buy a Core i7? It performs better than my CPU, its desktop performance (meaning Windows response) is better, but it also costs 7 times the price of my CPU. These things now seem small to me, because I feel I realized something: that nowadays we have powerful enough hardware to run everything, not like the bad old days of processing video or photoshopping. Now we can comfortably run a whole OS at a good speed and play the latest PC games for a reasonable price. 'Is this an advert for AMD?'.. no, it was my little way of saying I am happy with my rig, and I feel the Vishera 8350 is a fitting processor to be AMD's flagship.

In the 'realworld' I gain nothing moving to Intel now.. sure, you could say 'but you will get more frames in games' or 'your single thread apps will start faster', but y'know.. I'm not that fussed.. even worse.. I'm actually happy with the Vishera.. oh. my!

Now, to every story there are two sides.. I know this.. let's take Intel's..

The reason Intel has done so well is because they worked harder and smarter than AMD. Over 70% of the world's CPUs are Intel; they hand pick geniuses in all fields to fit the job that the person was born to do, and it shows in the results. I am not dis'ing AMD staff, I only say that Intel were/are richer and smarter for many a year out of the years on top. Core 2 was a masterpiece, a breakthrough of vision, and at the time AMD had the original Phenom series, which, I feel, was not up to par. Had it been golf.. they were Tiger Woods and we.. were.. the ball boy.

Again Intel comes to us with a masterpiece! The Core i series, a thing of beauty in a world of architectural silicon. Why? Because we know how well and efficiently it performs, and this adds beauty.

But will I buy it? no. would I buy it if i were rich? possibly. but it wouldn't be something that I would ever notice.

I mentioned Photoshop and video encoding before..

If you max out your board with very reasonably priced RAM (around 100 to 150 pounds) and a sensible CPU (AMD or Intel), you can work faster and be more versatile than has ever been possible in the history of computing.. flip! Shove on YouTube and Facebook while you're waiting! Your system won't even notice.* I drop entire games to memory and forget them for hours on end, only to come back to a still working game.

More Bang for Buck!

I feel we have come to a point where the difference is slight.. it is slighter than it has ever been.

I could go on about how much I like Nvidia.. but y'know what? Right now AMD has the edge in both price and power.
I'm still with Nvidia though.. they were my first gaming card, by accident. Fondly.


So I'll stop dithering and come to the point: what does paying double the price get me? Not much, only a few frames more.. that is, provided:

1. you have an SSD,

2. a modern OS.

3. quite a lot of RAM.

4. a current gen mid to high range card.

5. a current gen CPU with 6 cores or more.

What do we need all this power for? rendering? folding?
What happens in the next two gens.. what do we need all that new power for??

Soon we will outgrow operating systems, the hardware is too powerful.. no. the OS is too weak.

I have Windows 8 now.. it's a bit of a pain and flaky when OC'ing, but okay I guess.. but then again, I have used Win98.

These are the things of museums.. the little window that goes side to side, minimizes and maximizes.

We have the power in our PCs now that MS had pushed us to have with Vista, but I find 'the same IS the same'.. bring on holograms and give us a challenge, involve us like every sci-fi movie has promised us since we were children!


Dava

*IF you're a PRO reading this and are thinking about the office computer, no. .. Please buy either a professional dual socket setup or suffer the consequences.

NOTE: this has kind of turned out as a little ad for AMD, which it wasn't supposed to. It was meant to be: what is all this power for? And why do we ever need more? ALL OS's are getting bland in contrast to this powerful hardware that we all have now.

Wednesday, 21 May 2014

384-bit ways to squint your eyes.. or AA vs Better Textures

19 April


Hi guys.. wanted to share some info about 'AA' for those who are new to gaming..

Sometimes menu options are a little like trying a new restaurant; you open the menu and don't know what's good. And if it's ethnic food.. you don't know what all those strange words are..

AA is 'anti-aliasing'. It's designed to trick your mind into thinking far away objects are blurry but still have definition.. in a small way, it's something your eyes do in real life. Not every game is good at AA implementation and some make mistakes (this is on the software level), but many games are.


Using AA can make close things blurry too, but on a much smaller scale. It rounds the hard edges of pixels.. imagine an 8-bit retro ball made from pixels, and one using AA.

Memory bit-bus bandwidth also plays a role in how well the card can effectively use AA.. the GTX 660Ti and the GTX 670 only have one feature change, but it's an important one: the 660Ti has a 192-bit bus and the 670 has a 256-bit bus; the cards are otherwise identical.

Why did Nvidia do this? To mark a price point, because the bit bus is designed to enable certain AA levels.. which I'll explain.

depending on the game..

64-bit bus = no AA.
128-bit bus = no AA to x2 AA
192-bit bus = x2 to x4
256-bit bus = x4 to x8
384-bit bus = x8 to x16
512-bit bus = x16 to ?


The smaller the bit bus, the more it 'chokes' the effectiveness of the VRAM and the AA. But if you SLI or XFire.. you get double the bit bus, coz each card counts as a single bus... If you have enough shader cores to handle high AA settings and a decent bit bus (256-bit), then you can use AA comfortably without a huge impact on your frame rate.. however if you don't.. even heavily OC'ing your card.. I mean, you'd have to be able to 'toast bread on it' hot to match a card with a bigger bus. The higher the AA, the more heat inside the GPU if the card can't shoot out those images fast enough, and the same goes for overclocking the card.. ooffft! ..So if you're wondering why your card is stuttering.. try without AA on?
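For a rough feel of what that bus difference means, a wee sketch (Python; I'm assuming the usual 6008 MHz effective memory clock both cards ship with) of the classic bandwidth sum, effective memory clock times bus width, divided by 8 to get bytes:

# Memory bandwidth ~= effective memory clock (MHz) * bus width (bits) / 8 / 1000, in GB/s
def bandwidth_gb_s(effective_mem_clock_mhz, bus_width_bits):
    return effective_mem_clock_mhz * bus_width_bits / 8 / 1000

print(f"GTX 660Ti (192-bit): ~{bandwidth_gb_s(6008, 192):.0f} GB/s")
print(f"GTX 670   (256-bit): ~{bandwidth_gb_s(6008, 256):.0f} GB/s")
# More bandwidth = more headroom for the extra reads and writes that AA causes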

When I open up the menu.. I think 'yeah, I want it all!'
like I just hit an all-you-can-eat buffet.. but no. I can play Far Cry 3 in SuperHD with x2 AA, or I can watch it stutter for 4 hours.

hope this fills your mind instead of fills your eyes.. yumm! haha

peace

Dava

addition:
I think it's very true that how an engine implements AA can make like a 20 frame difference, and that's the difference between being able to play a game and not. The Source Engine is the lightest engine I know of.. and it's also feature complete.. controller compat, AA, scaling, liquid, particle lighting etc.

NVCP.. contains more settings for AA all the way up to-----  

x2 AA
x4 AA
x8 CSAA
x8 AA
x16 CSAA 
x16 Q CSAA
x32 CSAA
--------
SLI
x12 
x24 CSAA
x24
x48 CSAA
x48Q CSAA
x96 CSAA

Your settings might show up differently, as I have 3 way SLI, or it could be different in the AMD Catalyst Control Center. It's under [Anti-Aliasing Mode], 'Enhance the application'.


But ultimately I feel.. higher quality textures are always better than.. a bit of trickery.

Monday, 12 May 2014

Dava's Hitchhiker's Guide to Operating Systems ..BYOBF (bring your own babelfish!)

[I hope you enjoy this lighthearted look at operating systems. Now please imagine the Hitchhiker's Guide voice in your head for the remainder of the article*]

Far, far away, long, long ago, in the awakening land that is the 1950's, people got bored, and without the internet and iPhones and interstellar travel, it's easy to see why.. they wondered 'wouldn't it be nifty if this calculator thingy did something fun too?' and they decided tennis was fun.. 'let's do that!'.* This had been thought of before by others, but they had only succeeded in making a strange christmas tree machine* that got angry when it lost, and often thought to itself that it would have rather been a politician. ... or tennis player.

It wasn't until the Planetial Conglomeration Interstellarfractuation, who are known to the locals as 'I B M', introduced new tools, GUIs and new ways to interact with said calculator, that the idea that an Operating System could be more than just words and numbers started to form in the minds of those who saw.

The machine was used to make money for them, as well as count money, and was called a 'Till'; they used the same code used within this machine as the basis for their UNIX architecture. So yes, by association.. Android is based on a cash register. :D .. Others had operating systems, but frankly they weren't very good at much, except at annoying people playing text based adventures due to the inability to save game progress, and giving them thoughts of AI world domination.

Sometime later, in the land of the 70's, people took many mind altering drugs that led them to believe this had already happened.

Three companies emerged to be the leader in the 'operating system with calculator function!' new technology. Microsoft, whose time travel technology enables millions today to travel back and laugh at their ancestors, had this slogan: 'operating system with calculator function.. and clock!' .. rival company Apple was the second; the Apple CEO had seen the advert and started his own campaign,
'operating system with calculator function ..and *attractive* clock!', while IBM thought this was all too silly and used the slogan 'Calculator.. with bundled operating system'. Due to IBM's campaign and Apple's pricing structure.. Microsoft won by a landslide.


It wasn't until the 1990's that a man claiming to be Linus Torvalds invented the 'Linux Kernel'. He was in fact a bored Microsoft employee from the year 2821, who was tired of the humdrum attitudes of his colleagues always saying to him 'WHY do you want to do THAT??' and 'oh no!! you can't go back to change history!!' ..how wrong they were.

Once Microsoft had dominated the market, they found it sad in their hearts to let the consumer go.. they tried many things to keep the consumer happy, none of which worked, except by chance: they produced an operating system called XP. XP was cheap, fast, and worked.. well, none of those, but people seemed to like it anyway because you could change the desktop background to *sexy girls*.. and write 'Elvis Lives!!!' on the screensaver.

and the same is true today.. this is what most people use their calculators for, putting sexy girls on the desktop background and trying to find Elvis in GTA.


I hope you enjoyed: whatever on earth I just said.

and always remember cattleprods should be cleaned after AND before use!

Dava


*






*





*
Nimrod

Thursday, 8 May 2014

CaseZilla vs OOKaDApus or Please Release Me Let Me Go..

Hi guys.

If any of you have seen the pics of my case*, you know my case is fighting for its life with my water setup. For me it's just much easier and less costly to let this continue..but actually.. I should have chosen a better (bigger) case.

I have a Xigmatek White Knight 7 slot case, which is quite pretty but I have struggled.. and if you will indulge me some opinions, I would like to share my thoughts on case sizes and form factors.

The case itself is a nice case and I would buy Xigmatek again, and have done since buying the White Knight.
But ideally, I would have a 9 or 10-12 slot case. I know I have strange tubing, I admit that, and I have an order on OverclockersUK to correct that (not in stock but on order).. but even with that, with three 2 slot cards (I plan to get my third 660Ti back).. I am going to be filled to the brim! ..But I am going to miss my pretty white case. I don't want to let it go.

The definitions for cases by case manufacturers are changing for a more modern age. Presumptuously perhaps, I have some opinions on what's what, as far as case size, form factor and definition go.

I would say :

1, 2 and 3 slots: mITX to ITX. Micro builds or SFF.
4 to 6 slots: ITX to mATX and Small Form Factor (sometimes also ATX for SFF).
6 to 7 slots: mATX to ATX, a MIDI Tower size case. General computing, small gaming build.
8 and 9 slots: ATX to E-ATX, a Full Tower size case. 8 being small, 9 being large. Gaming, 3 cards.
10 to 12 slots: E-ATX to XL-ATX, a Super Tower. Great ventilation for gaming, can be used in a workstation build. Premium for gamers. 4 cards.
12+ slots: for a workstation/server build. .. AFAIK, there are no cases like this on the market; perhaps there are workstation servers that I don't know about.

The cost of Full Towers is coming down at the time of writing; you can get a small Full Tower for around £60 to £80 and a large one for around £100 to £140. 8 slots means four 2 slot graphics cards.. facilitating easy upgrades. 9 slots means you have clearance too.. which, depending on your power supply, can be very needed.

While you can fit 3 cards in a 7 slot case.. trust me, it can get messy pretty quickly and cable tidying can be a nightmare.

We have had multi-card protocols for a while now.. and perhaps gamers want more from their cases. Now, if I am buying for my main rig.. I know better than to go below 8 slots.

peace

Dava


*https://www.facebook.com/photo.php?fbid=839032056114281&set=pcb.290615997754654&type=1&theater


7 slot case + 2 cards + water cooling

Can I count the many ways 'I love you'? or ARM and Intel.



Hi www .

I just couldn't resist; while thinking about the subject of my last article.. I thought about ARM and Intel, and I was laughing.

Disclaimer: all the following is hearsay, guesses, conjecture and just plain 'made up', but make of it what you will*.

So around 2004 ARM applied for an x86 licence from.. you guessed it, Intel. Now.. Intel weren't exactly kind, and they *supposedly* asked for a sum of money they knew ARM didn't have or couldn't get.. ARM was quite Chuck Norris'ed about the whole thing. Around 2009 ARM was starting to be a little influential, and yes, because there were so many ARM chips out there.. powerful.

Intel applied for an ARM licence.. well.. HAHAHA.. I guess Intel drove a truckload of cash up to the offices in Cambridge, England, and they replied.. 'y'know AMD is like my Uncle (RISC).. so erm.. you have to wait up to 4 years after they bring out theirs before you can go to manufacture', and Intel replied 'Can I count the many ways I love you?'.. but an unimpressed smiley face was the reply.

well all this was two years ago so who knows now? but yeah I was laughing :))



hope you liked my little novel. haha.







* I remember watching an episode of BBC Click that mentioned ARM and Intel a few years ago.

Can AMD Pull Off SoC? or..Schrodinger's Rabbit



Hi.

Let's chat about the oft-sung potential of SoC.

Premise one: there is no rabbit in AMD's hat.

AMD have often let their fanbase down. I say often, but I really mean three times, but that hurt a bit, didn't it? Phenom 1, Zambezi FX (on release) and their gaffe on the server market, which means we don't get Opteron side-tech the following year for desktop.

The Phenom 1 is the best example: they had *every* opportunity to make an x64 4 core that would make Intel regret the 'nap' that was the Pentium series.. but they just couldn't do it.. it wasn't that they weren't trying, they just couldn't do it. ..If I were to guess, the money they made on the first x64 dual cores went into buying ATI.. which *WAS* smart! Now they have the same opportunity, with no other 'Lions' in sight!

Intel can't compete *if* AMD manages x86/x64 parallel computing. It is estimated* to be 100 to 1000 times faster than standard CPUs. For this reason alone.. I think there is no rabbit in AMD's hat. AMD suffer from 'bad luck/poor execution'* and NO company is *that* lucky.

I think there might be a 10% increase in computing power, possibly. And AMD will not unlock the potential of the GPU until Intel have managed to move to/past graphene and past the nanometre (2019-2025).

GREAT HOUDINI'S GHOST :O . Premise two: they manage it.

I'm actually an AMD fan. I like the compatibility and not having to change my whole system when I upgrade; I have a mental block about new Intel gear because of this fact.

The stuff they are doing with APUs is something special to behold. IGPs have traditionally been the equivalent of eating out of the bins at McDonald's; now you can game.. proper game, with options-and-resolutions-and-3D-and-everything!.. all the gamers' jazz we like to hear. 720p is easy now. 1080p is okay.. read that again: '1080p'.

AMD have been planning SoC for *many many* years.. in fact, they are the only major player to have read books on blackjack. *

They have money pumping at them from Xbox One and PS4 sales, not to mention their own graphics card sales. They have every opportunity now.. the 'darlings' I think should now be courting AMD financially, because they have so much weight within the industry.. I don't mean the plain 'desktop' market or the shrinking AMD server space.. but the general consumer is now getting to know AMD, oh wow!.. and investors and potential engineers like that. Imagine you're a genius engineer and you work for AMD; you have to explain to your family who AMD are.. you don't have to with Intel. That might, and I think prolly will, change soon.

So to conclude..

This, in the next 3 to 4 years, is AMD's 'big moment'.. rise and shine, or fall and red face*.

I root for them.. go AMD!

peace



Dava



Notes ------------

*I heard they patched it now, but I'm not sure as I didn't own one. But I hear it's only 10% less in perf. than the Visheras now.

*I watched a YouTube video about GPUs; the guy was pretty convincing.

*accidental pun

*(you could say a couple of others but money)

* rise and shine 'here comes the sun', or fall and red face 'Tom Green Movie'.

The Seasons of a System Builder ..

(Originally posted on OCN)

dava4444
Posted 7/9/13
[This is intended to be a light-hearted fun article, please enjoy ...and try not to be too serious, as it would make the ponies cry!]

Hi guys!


Well, it's summer.. which I call 'building season'. I like staying up late at night working on a build with the window fully open, hearing the sounds of my town during the night.. I pretend I am in some sort of Alfred Hitchcock-ian New York, like the movie 'Rear Window', except without all the murder business. Recently I thought this should be dubbed 'Building and Case Tidying season'.. and if you have no money for parts.. then a case tidy is not the worst thing you could do instead.

Summer is too hot to bench.. as you may know, it is not until winter that the usual main benching happens; this is called by many 'Benching Season'. I live in Scotland and during winter it gets pretty cold.. not too bad.. but sometimes pretty cold. Ambients from 0C to 8C are the norm. I have my PC on my windowsill, so the heat escapes straight away; in winter I enclose my curtains around my rig and leave both windows open a crack. Warm inside the house and cold outside. Nice!

Summer has many mixed blessings for builders. Gaming gets a little harder because of the ambient temps, and so the noise must increase, but on the other hand.. you get to build! Those long.. loooong sunny lazy evenings.. leading quietly into the night ahead.. as dark falls.. sometimes it's hard.. pouring with sweat, sometimes peaceful..

The '1am break' for me is not really a 'break', but I always seem to stop for a little bit and think about life. Then around 2:30am.. it suddenly gets cold here.. a good thing, because.. otherwise I would forget to eat.. I go into my kitchen, close the rear window and get something to eat.

Autumn (for me, YMMV) is 'Gaming Season'.. the new releases come out.. like a débutante meeting her male fancies at a ball. This happens in Spring also, but if you indulge me a thought: Spring is for new friends, Autumn is for old friends. Spring gaming is harder with the temps vs noise, but all that doesn't seem to matter.. as 'we' are just coming out of the darkness to see the sunshine.

peace

Dava

I realize this is not my usual canter; I write about the clutter that fills our lives that should be gotten rid of.. your right to not have old junk annoying you when you are doing something technical. But y'know, gotta have a heart too..

Wednesday, 7 May 2014

An Underdog Has Its Day..


Posted 7/2/13 

Hi guys
So.. I wanted to talk a little about.. that ol'stick, the one that sits in the corner unnoticed.. but looming ever present..

AMD vs Intel. It's not really 'vs', as the old fanboi-ism of yesteryear is dead like 'zombie + shotgun'.. the community has matured somewhat, and at least much less often do we hear the heated words of blind stupidity & brand loyalty without an actual 'reason'.

I use and like.. am even a fan of AMD. I also like Nvidia's cards.. I like Intel Core2Quad & 2Duo too.. but what I really wanted was to share some background for the newcomers to building: what on earth happened that AMD could even challenge Intel at all. For a long time.. Intel was king with the Pentium range.. no one could touch or even think of touching them..

But in 2005 (era roughly) AMD released the first 64-bit CPUs.. then a year or so later, dual cores.. they really caught Intel napping.. fast asleep like a little child after a bedtime story and a warm glass of milk. AMD stormed the market after years of having so-so CPUs; they gave Intel a shock that ran to the 'cores'. Intel had only the Pentium 4 and quickly whipped up the Pentium D, which was a 'nightmare in drag'.. an awful horror movie where the hero is also the villain.

Well.. as you can guess, Intel was supremely embarrassed, and when they heard AMD had four cores coming... this is when they pulled their act together, like a lonely guy with a hot GF 5 days before an ugly break up. They created the Core2 series.. it is still, to my mind, a thing of great beauty.. like walking around a beautiful building made of silicon.. a masterpiece.

AMD *did* bring out their 4 cores.. but it was much like the aforementioned zombie.. and still to this day should be shot on sight. AMD brought out the Phenom 1 (one) series; as contrasts go, it was the opposite of Intel's structural wonder. So much so that Intel gave AMD a slap in the face with the release of the Q8xxx range.. just to show everyone that even a poorly made Core2Quad could beat the Phenom 1s. And it did.

This hurt AMD in ways that would last until the Vishera came out, only last year. The Phenom 1 had set AMD back 4/5 years.. a boring three-hour movie where you could hear the other movie-goers breathing, by industry standards. What happened?.. The Phenom II series kept fans alive, but didn't even touch any of Intel's stock offerings.

Intel had built on the Q9xxx.. and refined it to create the Core i series.. refining a masterpiece in efficiency. Wow. Mensa, start taking notes! And Intel still do to this day. The original Zambezi FX series was a disappointment, for the fact that Phenom II could outperform it.. it wasn't until the Vishera FX came out, only last year, that we had a new king in the AMD camp. The 8350 (I feel) sits nicely between the i5s and the i7s of Intel's range, with the 8320 on level pegging with the top i5. All of the i7s beat the 8350 without question, but the 8350 'clips at the heels' of the lower i7s, not reaching but.. almost.

It's my old adage: 'AMD for price & capacity, Intel for performance & efficiency'. Truck vs race car.. and this last truck, Vishera, was full of soldiers ready to shoot zombies.

peace.
Dava

Terminality of Terminal ..or .. 'can it play Spy Hunter?'

David Shaw


15 February


Hi guys.

I recently installed Ubuntu.. my OH and I were in Limketkai mall, we went into the Apple-like store and she got to see, for the first time, OS X.. and a MacBook Pro. We talked about operating systems, to my delight. I told her about Ubuntu, using the analogy that OS X and Ubuntu are like twins. Both very secure, both very fast.. polar opposites in the approach to funding.. one heavily commercial, the other FOSS. But certain things are bothering me.. and I'd like to share.


I blogged about this in 2009 on ZDNet UK.. the attitude of Linux devs that the end user should HAVE to use the terminal. no Windows dev would dare assume that a Windows end user should have to use CMD to install or use their program. .. while it has improved slightly in the 5 years since I blogged, only 1 hour ago I still had to use the terminal to do things. yes, there is the Software Center. it's good, needs a little style improvement, but I like its functionality regardless... wait wait wait.. did I just contradict myself, for the sake of the excuse that comes up over and over.. that Linux users should just be GRATEFUL for getting work for free?? no, no, that's not good enough! that attitude holds back the whole community, through lack of true critical response. which is the very center of what improvement is..


where inspiration meets truth and correction.

.. to coin a phrase.

Being honest, Ubuntu is versatile, perennially incomplete, has great expectations of a simple end user regardless of their actual ability, and could be OS X tomorrow, in terms of user operability, with a very minimal amount of work. with three simple changes. first: Canonical hires design staff. oh, I love the Unity desktop, but things like the Software Center still have strange blank style pages.. not just the Software Center, other places too, like in Settings. second: an initiative to let Linux devs know it's not okay, just because YOU'RE a geek, to assume that everyone who will use your program will be fine with the terminal ('oh, they should just learn terminal commands.. yadda yadda.. I find it much easier'.. WHAT?? YOU are a developer and you find CODE easy? big flipping shock!! .. no one should ever have to go into the terminal. ever. .. how much time do you think the average Windows user spends in CMD? I showed CMD to a couple of people over the years.. they were frightened of it). true story!


finally, to the third correction I'd like to suggest.. don't just accept mediocre because you got it for free. complain, tell the truth, and show you love Linux and that you want it to improve, not languish in the past.. like some 80's ZX Spectrum game. LOAD "SPY HUNTER" ..beeeeep beup! beee..

:)

Dava

Refresh and Display Resolutions in a Modern Age

David Shaw


17 January


Hi guys.. I happen to know we have a couple of members with fancy monitors.. myself included.. so I thought I'd throw out some info for those thinking about upgrading.


REFRESH RATE

Your monitor or TV has a limit to how many pictures per second it can show.. very many are 60Hz, which is a common refresh rate.. it means your monitor can show 60 images (frames) per second.


Any frames above 60 are wasted. So if you buy an amazing i7 + Titan rig, play Minecraft and it says you are getting 300fps.. good.. but you will only see 60 of those frames if your monitor is 60Hz. Higher refresh = better.
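
to put a number on it, here's a tiny Python sketch (the function name is just mine, for illustration) of how many of your rendered frames actually make it to your eyes:

    def frames_shown(fps_rendered, refresh_hz):
        # the monitor can only show one new image per refresh cycle,
        # so anything the GPU renders beyond that is simply thrown away
        return min(fps_rendered, refresh_hz)

    print(frames_shown(300, 60))    # 60  -> 240 rendered frames per second wasted
    print(frames_shown(300, 144))   # 144 -> far less waste on a 144Hz panel
    print(frames_shown(45, 60))     # 45  -> below the refresh, you see every frame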


Common refreshes are (progressive) 50/60/72/85/100/120/144/200/240

Interlaced 23/24/29/30.


DISPLAY RESOLUTIONS

I could go on about texture fill rates, but really it's about how your system handles resolutions in their most basic form.. the GPU will handle most things, but when dealing with higher resolutions the whole system gets involved.


Common resolutions are 1280x720, 1920x1080, 2560x1440, 2880x1800, 3200x1800, 3840x2160, 4096x2160, 5120x1800 and many more..


Think of what's best for your playing experience.. I *can* play bioinf at 5120x1800, but I get 10 frames per second and the textures run out at 2880/3200, so why would I? The right mix can give you a lot, both visually and in playing experience.
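
to show why the whole system feels it, here's a rough Python sketch comparing how many pixels each resolution asks your rig to push every single frame (just raw pixel counts, nothing fancy):

    resolutions = [(1280, 720), (1920, 1080), (2560, 1440),
                   (3840, 2160), (5120, 1800)]
    base = 1280 * 720   # use 720p as the yardstick

    for w, h in resolutions:
        pixels = w * h
        # every extra pixel is extra work for the GPU (and, at the high end,
        # the rest of the system too), every single frame
        print(f"{w}x{h}: {pixels / 1e6:.1f} megapixels, "
              f"{pixels / base:.1f}x the work of 720p")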


Look, I'm being eaten by flies, g2g guys.


Hope this helped..did this on my mobile


update:


Resolution Definitions


The first significant improvement came in the early eighties with PYE TV; before then it was very, very difficult to get a sharp picture in any sense. yes, this is only about TVs, but when PYE (the concept, not the company) went stateside 10 years later, it led to the first HDTVs. contrary to popular belief, it wasn't the Japanese who invented HD, it was the Americans, inspired by British PYE tech.


Standard Definition (SD)


320

480

576


there is also an upscaler path from SD to HD: 640 lines translated to 720p, then magnified for 1080.


High Definition


720

1080


the next ones I mention are newer tech; make up your own mind if these monikers fit.


SuperHD (2K and 3K)


1440

1600

1800


UltraHD (4K and 8K)

2160

2300

4320

8640


as the tech grows, perhaps you can see, as I do, that we are becoming less rigid in the choice of screen resolutions, and can choose more and more for ourselves.
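
if you want those groupings as something you can poke at, here's a small Python sketch (the cut-off points just follow my lists above, they're not an official standard):

    def tier(vertical_lines):
        # rough buckets based on the lists above
        if vertical_lines <= 576:
            return "SD"
        elif vertical_lines <= 1080:
            return "HD"
        elif vertical_lines <= 1800:
            return "SuperHD (2K/3K)"
        else:
            return "UltraHD (4K/8K)"

    for lines in (480, 720, 1080, 1440, 2160, 4320):
        print(lines, "->", tier(lines))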

:)

Dava

MS Windows Licensing Structure

David Shaw


9 January


Hi guys


been in the Philippines for a while now, busy with my girl. but thought I'd do a quick article on Windows licence types, just in case anyone was wondering.


Consumer Windows.


OEMAct


OEMs are companies like Dell and HP, Lenovo and Acer; they buy bulk licences from MS (Microsoft), and as long as your laptop or desktop has the respective manufacturer's OEM BIOS, their OEMAct key will work to activate Windows (albeit hit and miss). these can be picked up for as little as £10-£20 on eBay, even for Ultimate (Win7 and Vista). they are normally preinstalled by the manufacturer.


OEM


These licences are for us system builders, and also for computers that we build to sell. Neither OEM nor OEMAct officially has transfer rights, but MS is normally fairly relaxed in their attitude to transferring the licence to a new computer. this is not a guarantee.. the mood may strike MS to *not* activate your copy.. but as long as they can see you bought it, they are *normally* fine with you.


Retail


These are the more expensive kind. note that with the advent of Windows 8 some of the licence types have changed, but the info is basically the same. A Retail licence has guaranteed transfer rights, you are entitled to a month of phone support from MS, and they will also support you through service pack releases. these are usually expected to be bought by small businesses, but builders buy them too for their own rigs. I have a couple of these licences, and it feels good to know that I won't get nonsense from the activation system when I have to re-install. the OEM ones usually refuse and ask me to use the phone system to activate, or even worse, tell me to buy a new licence key.


A retail key should never give you nonsense, unless you are playing musical chairs with your rigs and licence keys.


and the final consumer licence,


Upgrade Licence


When you have a previous version of Windows you can upgrade to a later version: XP to Vista, Vista to Win7, Win7 to Windows 8 (you can also skip versions). if you have a retail licence for the older version, this 'downgrades' you to an OEM licence of the new version of Windows, as all upgrade licences are in reality OEM licences.


Upgrade licences usually force you to do a clean install; only a few times have I personally had an upgrade work without a clean install, and that was for Vista Home Premium to Vista Ultimate only.


Business/ Server Licence


not an expert in this at all but.. here goes


a Server licence costs anywhere around £1000 to £2000, and each user/workstation needs a licence called a CAL. CALs cost around £20 each, but these do not have to be personal licences, just per user. if you expect 10 users to use a workstation, you buy 10 CALs along with your server licence, and even if you replace an employee, the new one can use it too.
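
as a quick back-of-the-envelope in Python (using the rough prices above, which vary a lot in the real world):

    server_licence = 1500   # somewhere in the £1000-£2000 range
    cal_price = 20          # per user, roughly
    users = 10

    total = server_licence + users * cal_price
    print(f"about £{total} for the server licence plus {users} CALs")   # about £1700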


Business licences are not just for the aforementioned OEMs who buy in bulk, but also for large corporations and governments.


ok ok, I can't concentrate because I have flies constantly landing on me. tried my best guys. maybe I'll come back and add some more info later.


thanks for reading.


Dava

Dava's Hitchhiker's Guide to Computers. ..bring a towel..

David Shaw


29 November 2013


Hi guys


we've had a little influx of members new to building so.. as you do.. I thought I'd share some info.


My speciality is operating systems. (I hope Rob does one of these on networking; he was a network admin a few years back, as you might already know). I VBoxed for many years as a hobby.. haven't done it since 2011, but I keep a hand in, reading up about OS's from time to time. in that time I learnt a lot.. I think.. about what on earth is going on inside your computer.

My info might not be perfect but it'll at least give you an idea of the concepts involved. .. I hope!






[now please imagine Hitchhiker's Guide voice in your head for the remainder of the article. thank you!]


1. Hardware


far far away, in a sleepy land known as 'Asia', there is a whole huge team of 'Asians' working away.. they work in cooperation with the States and sometimes Japan (yes, Asians too, but not the main place, and kind of owned by America).. and sometimes Germany, of 'Europe'. The idea is to take sand and make it into silicon. that silicon then becomes the chips we use. they use a form of engraving called 'lithography'.. they do it at nanometre scale.. this isn't custom engraving for wedding rings, and they won't even do it if you phone them up and offer £20. all chips start this way, blank, then have the ability engraved into them for a programmable purpose.


2. Bios/UEFI

not really a Sex Pistols song*, but a way to manage all those chips in a more orderly fashion. stop, go, look, listen. all that jazz.


Think of the BIOS as a teeny tiny operating system, like Windows but with a few set commands. .. down here it rules all the chips. it has its own chip where the data for its execution is stored, but it will never tell the other chips where, in case they stage a mutiny. which only happened once, in 1999, in a small house in Skegness; the owner was even more surprised when the SAS broke in and torched the offending computer involved. He was heard to remark 'that's the last time I buy a Gateway computer'. These days however there is a newer system, UEFI, which is even more like a small operating system. these sub-systems continue to work even after Windows boots.. controlling your drives, your CPU power options and a few more things besides. ..teasmade.


3. HAL layer


ok so im no expert here but.. here goes.


Hardware Abstraction Layer.. the 'HAL' YES JUST LIKE IN THE MOVIE WITH THE CRAZY SENTIENT AI


This is a level up from here.. inside the main part of Windows is something called a 'kernel'; it's a set of around 80 instructions* to make things 'work'. it chitchats and has coffee with the BIOS and the HAL layer, and hopefully all have a good time, no fights break out, and nobody shoots anyone's dog over a confusion about who was meant to pay.. 'should we just have gone Dutch?'.


The HAL is connected to both the BIOS and the registry.. which looks a little like a tree in Windows. it tells the registry what on earth is going on below.. it's so everything is organised and the kernel doesn't have a heart attack if it ever sees what the desktop wallpaper is. who would have guessed Windows was so ...organised!


Let's hope Windows does not become sentient, and just go on the internet to look at naughty pictures, be slow, and complain when we ask it 'did you do that task yet??' ..and then we notice it has sat on the couch drinking beer all day and not loading the drivers correctly!


4. Registry.


It thinks it's a tree, like some hippies did in the 60's, but it's actually a complicated-yet-simplified way of storing data in a manageable way, for the purposes of supposedly quick access. this myth has existed for millennia.. ever since the code was first written on the inside of a cave c2500BC by Homsfelt Grabaoffeltz, the very first MS employee on record. .. note: if the kernel is LARGE, that hurts the registry access time. this is why Linux is faster, and why Windows needs a virus checker at all.


But also, Linux people like jazz and coffee bars too much, so we won't talk to them. shhh. be quiet, so they don't hear us.


5. GUI


Graphical User Interface. it's what you see. there is a window manager too, but basically it's this.. your browser and what you are seeing now.


if some day you find yourself on a spaceship and they tell you to 'use the hologram!'..and you answer 'oh! you have a holographic GUI' they will be so impressed they will give you a kudos star on the MS forum website.


no cattle prod for 3 dayz reward for you!


I hope you enjoyed this light hearted look at: what on earth I just said.


and.. maybe we can do this again sometime.. no cattle prods this time .. honest


Dava.


* (bye bye EMI)

*(Last I heard Linux had around 28 instructions, so it's much cleaner/faster, but it still breaks almost all the time.. an exaggeration, though. but I am sick of using the terminal!)

* and yes I know I am saying Hardware Abstraction Layer Layer. but habit.

Power Supplies Quick Knowledge... or Dougal and the spiderbaby..




David Shaw


27 November 2013


okay sooo..


Thought I'd share some info on PSUs (power supplies).. Rob knows a fair bit too.. but I don't think we have any engineering experts in the group (afaik.. but if you are, please chime in!), so I'll do what I can to share some knowledge and opinions.


There are many 'fly-by-night' PSU (power supply) manufacturers.. mostly Chinese companies looking to make a few fast quid. They sell PSUs incorrectly branded.. some might say 'deliberately', shock, 'no!'. they'll sell you a 100w PSU and brand it as 1200w.. but 750w is more believable, so they do that too hahaha. they are lucky to output 100w. if you expect a 1200w power supply for £15.. erm.. that won't happen. .. it's a bit like Dougal and the spider-baby. not really a baby. not as advertised.


'An aul fella weh a saddle oan him'.. apt words.. like a 100w PSU trying to feed a system designed for much more power.

There are rules of thumb.. like the 80Plus certification. This certifies that your PSU turns at least a set percentage of the power it pulls from the wall into usable power, instead of wasting it as heat. .. there are 5 common tiers: 80Plus Standard/White, 80Plus Bronze, 80Plus Silver, 80Plus Gold, 80Plus Platinum.


80Plus Standard/White


this certifies your PSU is at least 80% efficient under load.


80Plus Bronze.


this certifies your PSU is at least 85% efficient under load.


80Plus Silver


this certifies your PSU is at least 88% efficient under load.


80Plus Gold


this certifies your PSU is at least 90% efficient under load.


80Plus Platinum


this certifies your PSU is at least 92% efficient under load*.
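
and a quick Python sketch of what those percentages mean at the wall socket (the 400w load is just an example figure):

    def wall_draw(dc_load_watts, efficiency):
        # efficiency = usable power out / power pulled from the wall,
        # so the wall draw is always higher than what the PC actually uses
        return dc_load_watts / efficiency

    for name, eff in [("Standard", 0.80), ("Bronze", 0.85), ("Silver", 0.88),
                      ("Gold", 0.90), ("Platinum", 0.92)]:
        wall = wall_draw(400, eff)   # a 400w gaming load, for example
        print(f"80Plus {name}: pulls {wall:.0f}w from the wall, "
              f"{wall - 400:.0f}w lost as heat")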


What is 'LOAD'??


I could go into much detail about this.. but to keep it simple: basically, it's when you are gaming. that's it.. or that's how I'll put it for 'now'.


your CPU might be very thirsty! and in need of constant, stable rails.. 'rails' sounds complicated, and.. well, if you're reading about this kind of thing for the first time, it is. but a PSU that carries one of the above ratings AND is from a modern enough era to match your CPU*.. should be fine.


Peace. Dava.

* they grant Platinum at 92% efficiency, as 95% is so hard to reach, although 95% was the target. I read the 80Plus cheatsheet PDF from their website.
* made in the same era as your CPU socket.

Dream a little dream of me.. (1090FX speculation and thoughts)









David Shaw


25 November 2013




(this is an article I wrote on OCN a while ago.. I don't go there anymore, but thought I'd share it with you guys too; with the announcement of the new AMD A10-7850K APU coming out in January, I thought it pertinent... hope you like it)


By: dava4444

Posted 7/17/13


Hi guys


been thinking about this off and on for around a year now. AMD 1090FX


AMD produced the 990FX two years ago now.. and at the time their heads were up in the clouds with APU innovation, and quite rightly they should be proud! the AMD APUs today boast the best IGPs you can buy for cash money! they even let you crossfire these with low-end cards, and they have also released a couple of chipsets for the FM sockets. already you can see they have put a lot of work into this new innovation.. which is perhaps why.. as yet, there is no 1090FX chipset.


The 990FX, while good in its day.. is getting on a bit.. I recently joked with a friend 'there is code in that chip that even the TARDIS can't translate'. not true.. but I thought it funny at the time. The question for me was 'do AMD expect me to give up and change to APUs, just to get a new chipset??' .. you might say 'oh, the 990FX is fine!' and yes, you'd be right.. it's fine.. okay.. acceptable. does the job. .. but if I could dream..


The 1090FX would have.. on-chip drivers: an expanded UEFI flash chip, much greater in size, which includes every driver for the whole motherboard.


And built-in wifi.. why? surely no gamer ever uses wifi.. erm.. correct.. I wouldn't trust it either... but when I do a fresh install it is so valuable and hassle-free to have wifi handy for internet access; handy for a second network path (streaming from/to other comps); handy for sharing internet access to other computers.. not essential.. but..


I would like to see more RAM slots on the motherboard. 4 is okay.. mediocre.. but 6 or 8 appeal to me, meaning I would not have to lose usable RAM modules to gain overall capacity; also, if I chose, I could put smaller-capacity modules in and perhaps use the channels more effectively*...


I would like to see DDR3.. but at DDR4-range speeds, supported on the CPU memory controller, and the next FX chipset/CPU range to have an out-of-the-box 3GHz northbridge. at least. .. yes, at least, AMD! .. I feel totally narky about AMD's pants northbridges. I know the plan is system-on-chip, but c'mon! if the northbridge is slow it chokes both the RAM and the PCIe bus.. flip! and we are meant to be happy with 2-something gigahertz and a tiny *token* heatsink? the Vishera is a fine chip, but you put her in old clothes.. how are we meant to show her off when her dress is so tatty.. and all her friends (fans) know it! She should eat cake of a less 'Great Expectations' variety. or to put it another way: you can't kung fu kick Intel wearing clogs.


PCIe 3 with up to 6 full-length x16 slots.. why? because we know this chipset has to last a couple of years, and by then AMD and Nvidia might have opened up XCF/SLI to more than 4 cards per respective multi-GPU protocol. I would like to see x16-only lanes.. why, when the bandwidth affects so little? because when gaming in SuperHD you DO take a hit on x8 lanes.. this is only from my experience in SLI.. but check it out for yourself.
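
for a feel of the numbers, here's a small Python sketch of rough per-direction PCIe bandwidth (approximate usable figures after encoding overhead, not exact spec values):

    # approximate usable bandwidth per lane, per direction, in GB/s
    per_lane = {"PCIe 2.0": 0.5, "PCIe 3.0": 0.985}

    for gen, bw in per_lane.items():
        for lanes in (8, 16):
            print(f"{gen} x{lanes}: ~{bw * lanes:.1f} GB/s per direction")

so a Gen3 x8 slot is roughly on par with a Gen2 x16 one, and the gap only really starts to bite at very high resolutions.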


A unified northbridge, southbridge and VRM/MOSFET motherboard location, with a big heatsink and fan slapped on. Something like Intel does now. Yes, this is copying them a bit.. quite a bit.. but really, they got this right! it was staring everyone in the face for years and they got there first.


I want to talk a little about AMD's plan for 'system on chip'.. to quote Futurama, '..Welcome to the world of tomorrow!' .. why can't we have performance APUs beside the consumer ones? It makes total sense to me.. I really like Nvidia, they are familiar and very smooth, but if AMD started dropping their high-end GPUs into a performance APU.. I think it would be, to pun, 'game over' for Nvidia. Crossfire with a high-performance IGP.. ready to go! a high-end CPU and a high-end GPU combined with a high-end card.. oh flip! .. but back to reality: AMD don't have a chipset good enough to handle all of this. being honest, I don't know the FM chipsets first-hand at all, but I don't think they are ready yet.. why? because if they were, AMD is smart enough to be doing this already. I had a look at the FM chipsets.. while they seem efficient enough for now, in a year or two's time I don't think this design is good enough to support the idea of a performance APU.


8 CPU cores, 2 GPU cores, cache, memory controller, northbridge, southbridge. I often think.. who cares how big the end chip is? the Vishera is sizable, but even if it were twice the size.. yes, that might sound stupid because of heat issues, but bear with me.. the performance we would get is unlike anything any other company could touch, or think of touching. Heat issues would be very bad, yes, correct, but you're dealing with enthusiast chips here..


'go water, or go home'


I heard an acquaintance say that on the subject of current AMD CPUs. I think it is a witty and not altogether untrue quip.


System on chip solves the chipset problem; only copper and plastic remain.. I guess, writing this, I talked myself out of wanting the 1090FX chipset, and my true desires came out.


Do I want AMD to 'win'? flip no! I think an open market is a healthy market.. but do I want them to do well? yeah, I kinda do..


peace


Dava


*but I am unsure of this, as I have never had the opportunity to see many channels used effectively.

Core Parking

David Shaw


17 November 2013


Hi guys.. just wanted to share something with youz about the FX 8-core Visheras (unsure if this applies to the 4 and 6 core, or to Zambezi FX).. okay, it's about two updates; perhaps you already know about them and have applied them. here they are (thanks to fossilfern at the TekSyndicate forum for the links):



http://support.microsoft.com/kb/2645594


http://support.microsoft.com/kb/2646060


Okay, so now that's done, I want to share why I think these are important updates/patches.. I hear it varies from CPU to CPU.. just your luck.. but once they were applied on mine.. wow. I had a little go at underclocking before, just to see what happened really.. and.. nothing.. I was SO surprised to see my games run 'as normal' even though I had underclocked to 2.2GHz. frame rates were the exact same as at 4GHz (I only tried 1080 and 1440). without the mentioned updates.. I can 'feel' the CPU having problems.. that might be baloney.. a system builder with aspirations of being Patricia Arquette in 'Medium'.. but I do notice I get more screen tearing in games without the patches.


from what I read it varies from CPU to CPU.. but I prefer to have them rather than not.. unless you're on Win8, where they gave me lots of problems.


peace


Dava

Why are we tied to Triangles?

David Shaw


28 June 2013 · Dumbarton


Hi guys, been thinking about graphics over the last few weeks.. did you know in the early days there were two graphics systems for drawing pixels?

let me explain... back, ..waay-back-when, 70's/80's, the graphics for games were very rudimentary. at some point around the 80's a choice was made by the industry.. about how a 3D game could be drawn. the industry chose 'polygons', a system of triangles used to draw 3D pictures, but there was another system called 'Vauxhallite'.. it was much like Asteroids, a technology based on drawing 'pixel for pixel' on screen. very basic, and going with polygons was very much a smart choice on the part of the industry. .. but today.. things are different. why? because now we have high pixel density TVs and monitors.


The technology itself is much like a very tiny search engine: it goes from graphics card to screen and back, checking each pixel is where it's supposed to be, and this improves performance dramatically (as seen in a YouTube video I can no longer find.. flip!). but anyway.. polygons were a very, very smart way of doing things. they use triangulation to create an environment; we might not see it but it's there, ever so tiny, Pythagorean law. the industry had to make this choice or we would have had Asteroids for 20 years, and a lack of gusto in the financing of games.

but here's the crux I'm trying to share with you guys today: performance-wise it makes no sense to use polygons now. inside the cards there is much, much wasted performance, as the card draws the whole triangle, not just the parts we see on screen.. all of that energy wasted on parts we can never see. but we are stuck with this system now, as everything is based on polygons, and I DO mean everything: every API, DX, OpenGL, every tool, game engines, even the people involved in the creation of games.. programmers, texture artists.. all know and use polygons. as PFP (pixel-for-pixel) has NO infrastructure, it can never be.. which is.. what can I say?? it makes soo much sense for us right now.. 1 screen pixel = 1 internal process.. but much like the train moment in BTTF, we have passed the point of 'no return'.. no Doc Brown can save us now.. unless the OpenGL team come up with a Flux Capacitor.

but that said.. I would like to say to all of the heavy hitters, should you wander in and read this.. AMD, Intel, Nvidia (Via and SiS to a much lesser extent).. the first one to crack this will win the whole pot. plain and simple. you increase your performance 300% overnight, without loss. really, I mean that. without loss. There is a modified Vauxhallite demo online out there somewhere.. I've seen it.. they used a very low-end GPU/IGP (I think it was an Intel one) to go SuperHD with no frame loss or stuttering.. stuttering is something Vauxhallite can kill stone dead.. it adds smoothness of the likes we can only dream of.. and that's maybe what this is: a dream...
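
to put a rough number on that 'wasted work' idea, here's a back-of-the-envelope Python sketch (the triangle areas are invented figures, purely to illustrate overdraw):

    screen_pixels = 1920 * 1080

    # pretend these are the on-screen areas (in pixels) of the triangles a
    # scene submits; plenty of them end up hidden behind the others
    triangle_areas = [400_000, 900_000, 1_200_000, 2_000_000, 650_000]

    shaded = sum(triangle_areas)        # pixels the card actually works on
    overdraw = shaded / screen_pixels   # how many times each screen pixel gets
                                        # paid for, on average
    print(f"{shaded / 1e6:.1f}M pixels shaded for a {screen_pixels / 1e6:.1f}M "
          f"pixel screen -> {overdraw:.1f}x overdraw")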






peace. Dava

NB: since the time of this post I have changed some of my opinion on this. 08/05/2014

Footwork and Frames





David Shaw


25 June 2013

Wanted to share.. some footwork.. with the members of our group newer to PC building: a few thoughts on graphics card vendors, namely AMD and Nvidia.


AMD's price-for-punch performance is quite the heavyweight; their cards deliver better performance than the Nvidia equivalents at retail prices (although the 660 non-Ti and 650 Ti Boost are an amazing mid-range bargain right now). but Nvidia has a southpaw that leaves AMD's head spinning.. Nvidia has around 7% less microstutter. that might not sound like much, but it means you consistently get a 7% smoother experience in games.
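
'microstutter' is really about frame times being uneven rather than the average FPS being low.. here's a tiny Python sketch of how you'd measure that from frame times (the millisecond values are made up for illustration):

    import statistics

    # frame times in milliseconds for two cards with the same average FPS
    steady   = [16.7, 16.5, 16.9, 16.6, 16.8, 16.7]   # smooth delivery
    stuttery = [12.0, 22.0, 13.0, 21.0, 12.5, 19.7]   # same average, lumpy delivery

    for name, times in (("steady", steady), ("stuttery", stuttery)):
        avg = statistics.mean(times)
        jitter = statistics.pstdev(times)   # spread of frame times = felt stutter
        print(f"{name}: avg {avg:.1f} ms/frame, jitter {jitter:.1f} ms")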


So what I'm saying is... frame for frame, the AMD range will give out more frames, but when the going gets tough and your system has frame drops, it will feel worse, because AMD is on the ropes punching himself out. Nvidia's still got that southpaw, but that might cost you a hot dinner. make up your own mind. peace. Dava


-------------

'Footwork and Frames' is a boxing reference: without those two things.. maintaining your footwork and maintaining your frame.. you cannot punch correctly.
Hi

I'm Dava.
I'm an admin for FTR and I enjoy blogging about technology.

come join us on FTR Facebook here:
https://www.facebook.com/groups/ForTheRepublic/
