+Matthew Mitchell In some places they will do a bag check. I work for a structured cabling company and we do a lot of work for NTT Docomo. Whenever we go into their data centre, they will always do a bag check before we're allowed inside.
+Yuannan Lin Well you are wrong then. In fact this specific site has two cleaning technicians from 8 Solutions on site every weekday, buffing the floors throughout the building. The dust in these rooms builds up alarmingly quickly, even when proper maintenance schedules are followed with the CRAC units.
Data is the bread, gold, currency, and DNA of the future. We're on a trajectory that will take us from data being a loyal servant to data being the master.
wow, you got the sales drone/pointy-haired boss. I've had dentist visits that were more exciting than watching this video. I think the janitor would have made a more interesting interview subject than the "Data Centre Client Director at Infinity" (got that off LinkedIn).
+Tom Nicklin (Shmink) "Cloud", "App", "An Android", "An Apple", "SSD Hard Drive", "Smartphone" The list goes on, and as such makes me more depressed as it does so.
+ipullstuffapart yeah i guess. I just hate the umbrella term IT. It doesn't really describe anything now. It's like saying there is weather outside. Well yeah, of course there is, but is it sunny, rainy, etc.?
+samramdebest The school I go to, UVU, recently built a new data center in the new library building, and they wanted the building to be energy efficient, so they planned for the data center to heat the building during the winter. Well, it turned out the school bought new energy-efficient servers for the data center, so it wasn't able to produce enough heat to heat the rest of the building. ^,^
Seriously, a power outage that takes down a data center? How long must this have lasted? 24 hours or so? I know only of data centers that have their own backup batteries, which need to last until a big diesel generator has started up, and then it will all run independently for as long as the diesel fuel lasts (or, when it gets constantly refilled, potentially indefinitely). Some of the more modern facilities are also equipped with solar panels on the roof that can supply part of the needed power (all going through huge battery buffers, of course). Others have hydrogen fuel cells. I don't believe that a power outage would easily take down a data center that's well built. But then, you guys also drive on the wrong side of the road, so anything might be possible ;-)
basically "over there...power, over there medical research, bit further to the right general research and everything else is just an overpowered computer that doesn't ever reach its full potential"
I think this video gave me IT-itis (pronounced I-Tittis :-)) Can we please have more videos of people who know what they're talking about, and fewer videos of sales drones?
I was once responsible for the design and implementation of a server NOC. Went all out for redundancy - redundant power out the wazoo. Its undoing, even though I'd recommended one, was not hosting a DNS zone copy. Oops.
It's great that all of this is being introduced to us by someone taking time out from his regular job as a waiter at a cocktail bar. Seriously, there is nothing wrong with a tie. Great stuff, though.
Data centers fascinate me. I could watch a 100 part series on data center intricacies.
+oisiaa There's probably a data center close by you; I bet you could do a walkthrough, so long as the companies they house don't need to follow PCI/HIPAA rules.
Check out the HomeLab community
"IT racks" - my new favorite expression..
Very cool tour! The amount of raw computing power that comes out of centers like these is staggering!
More of this please.
More details. Networking topologies, Storage configurations, data center layouts, why they make the layout decisions they make.
This doesn't have to be specific to this DC, but a general look at ground up DC decisions would be marvelous!
Keep up the good work Computerphile!
Is this able to run AC Unity at 60FPS?
No nothing can run that unoptimized shit at 60FPS.
probably not, it needs about 20km³ per fps
+grande1899 60000FPMiliSecond
ecopper1 oh thanks, now i know that, didnt know that before /s
+grande1899 It is able to run on AC Utility at 50 or 60 CPS (Hertz).
This is a joke, by the way.
He keeps saying "The IT" like it's a physical, tangible object instead of a concept or idea. They couldn't have possibly used the words Server Room or Network Closet? I didn't really learn anything from this video.
How many Xeon cores are each blade server running? Is the overall memory for the building up to 1TB? What kind of virtualized environments were they running? If so, how many clients do each terminal server host? What is the memory/bandwidth load for average customer use?
Honestly, the most interesting part of this video was the 30-40sec demonstrating the security protocols in place.
I love these kinds of videos but I'd have liked it to be hosted by one of the Network Engineers, not some PR rep.
All I can notice is the HPE 3PAR storage arrays... amazing what your eye is drawn to when you work with this sort of stuff
And in this corner is xHamster Research Center, mostly educational.
His voice. So wonderful.
Oh wow, life detector. SUCK ON THAT MOVIES.
+Roenie It's not just someone using a chopped-off finger. You can also trick a normal fingerprint reader by getting someone's fingerprint and using that directly. A high-quality reader won't be that easy to trick, but it is still doable with a little effort. Recreating the blood flow is a bit more difficult.
And I don't see your point about it being a bad place to work at. Because they are trying to keep people who don't work there out? I'd be more worried if the security was lax. This is a high security area, the hardware in there is worth a LOT and the information stored there probably even more...
+Roenie You seem to be missing the part where that protection isn't designed primarily against staff, rather against adversaries.
+Roenie In that case, your statement just plain doesn't make sense.
My dad worked at the phone company for a couple of decades. To get near the equipment, you had to be a well paid technician or a low paid member of the janitorial crew.
Anyone else not quite sure whether Kings College is doing any medical research in there?
Coin mining
This guy doesn't strike me as an IT type of guy...
how many raspberry pi's would i need to shut down that air conditioning system?
+KatyPeezy Just 1 in the toilet room would be enough.
+KatyPeezy i would suspect, many trillions
+KatyPeezy sick reference, bro
Raspberry Pis (with similar computing power) would likely require much more air conditioning.
+KatyPeezy Elliot used just one. Genius!
Such a shame that they had their "client director" talk rather than someone who actually knew what was going on. Sounded like a typical PR person. Not very impressed at how he explained the power systems or capabilities; all I heard was IT IT IT and POWER... yes, we get it, it uses electricity.
Yeah, he didn't get very technical when it comes to the actual capacity of the infrastructure and what kind of research was being performed. Very shallow explanation.
Cool video, just needs a little more "IT"...
Elliot Alderson laughs at your layers of security.
IT racks IT infastructure IT power IT technology IT IT IT IT IT
3:04 Love these yellow-black HP racks. Have yet to find a used one, but I know a couple guys on Reddit who have one.
He is definitely a businessman. You know, I could also pack loads of instruments and lots of other stuff into these racks.
Are these computers good for gaming, by the way?
+derLPMaxe Definitely are, but you wouldn't need a computer that powerful to play any game today. It would be a waste of money.
+derLPMaxe They are probably just running Xeons, but they could be using graphics cards for compute power, so they might do gaming well. Not really my area, but that is my guess.
+derLPMaxe It would be a supreme waste to spend that kind of money on infrastructure and use it to play games, and since they most likely are running *nix natively and other OSes through virtualization, you would likely get sub-par performance out of them. Not to mention, no real graphics subsystems; even if they are using GPU compute, those systems are not really configured for gaming.
Leo Williams That would be an interesting read. Though, possible and practical are not necessarily the same things :)
derLPMaxe Interesting, I was under the impression that most of Europe was doing well with access. Now I feel bad for having my middle tier 150 Mbps connection, lol.
Anyone know the total storage of YouTube at the moment and how much is added every day? Blows my mind.
did he mention that each rack holds 100 petabytes of storage?
I like how the room is the cooling system!
Now, let's imagine this beast built with thermionic valves.
So clients provide their own equipment? And do the datacentres just power it on and make sure it doesn't fail?
Would it be possible to assemble a team consisting of a surgeon and a thief with the same blood type as Mr. Lamb, cut his finger off, transplant it to the thief, and open the door then? The finger would probably be rejected by the thief's body pretty fast, but you could transplant his original finger back pretty fast as well.
Cool datacenter, but I can't believe you have rack space like that not caged off per customer (maybe the doors to the racks lock, but still, some of those racks were open, and rack doors can bend pretty easily).
Pure and INFINITE Love fo this infrastructure!!!
I would really like to know an estimate of how much this stuff costs. Obviously they might not want to give exact figures, but a general approximation would be nice for the power costs and hardware costs respectively.
But can it run Crysis?
+BoboDoboRobo Yes, but it has no monitor connected to it, so you wouldn't be able to see it.
+BoboDoboRobo please advise
+DoubleM55 but it has no graphics card, and Xeon CPUs don't have integrated graphics. But you can make a Xeon gaming PC.
+Adam Abraham Maheswara chances are they actually have several Tesla cards on them!
XD
no mention of a UPS?
I think I just fell in love with those servers
I notice he says high power a lot. Does that mean the taller racks are filled with IBM's POWER CPUs or just that the place uses a lot of electricity (which is a given and doesn't add much information)?
100 years from now we will have the power of this datacenter in the palms of our hands.
Are all the blades supplied with their own storage or is that located in a separate room?
Couldn't they have gotten an actual engineer to talk about the DC rather than this obvious spokesperson who knows very little?
To clarify by "they" I of course mean the DC management
+dowRaist Yea, I was thinking while I watched this video how surprising it was that they weren't using rectified 3-phase power. Now I'm looking at those plugs that you can't quite see behind the mesh, and wondering if they might be.
+Falcrist I'd be surprised if they weren't. It's pretty standard at this point.
+dowRaist I was thinking the same thing. All this guy talked about was how expensive everything was...
+dowRaist Thought the same. Instead they get the director. I bet one of the engineers would have liked to have done it but was told "NO, and I want you all to keep away from the cameras as we do the walkthrough".
Both videos ruined as they've just become two adverts.
+Steven Whiting Yeah his LinkedIn even says his background is in corporate sales.
Do these Computer Rooms have Disruptive Technology?
Maybe a few power supplies with fake EMC compliances, that cause nearby computers to crash? Or a 110 dB siren that sounds at random, without warning?
This is clean. Which company designed and installed it?
10 Petabytes? 80 Petabits? 10000 Terabytes? 10 million Gigabytes? That's insane!
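Those figures do all describe the same quantity; a quick sketch to check the conversions (assuming decimal SI units, as storage vendors use):

```python
# Sanity-checking the unit conversions for 10 petabytes (decimal SI units).
pb = 10
tb = pb * 1000        # 1 PB = 1000 TB -> 10,000 TB
gb = tb * 1000        # 1 TB = 1000 GB -> 10,000,000 GB (10 million)
pbit = pb * 8         # 1 byte = 8 bits -> 80 petabits
print(tb, gb, pbit)   # 10000 10000000 80
```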
Was that guy in the background checking out Spencer ?
I wonder if there is any medical research going on in there. I'm not too sure.
It is a very stressful environment for the professionals who work in those data centers, and on top of that, there is no room for failure.
+Emmerson Motta This is why there are layers of redundancy on top of layers of redundancy, on top of layers of redundancy. Sorry to sound redundant. I didn't mean to say things over and over and over.
+S0chan Fully agree. Redundancy is very expensive, and companies are more than happy with just one layer.
S0chan Not sure I agree with you there. Every server has built in redundancy. It's almost impossible to set up an IT infrastructure with no redundancy, both physical and logical.
It is also not at all uncommon to see those systems with built in redundancy, to be backed up by redundant physical systems in failover pairs, or even clusters.
This is even more true in European countries, where HA pair redundancy is very commonly split between separate sites. Our product has an entire feature set that makes wide-area redundancy possible between HA pairs, and it is heavily utilized in Europe.
It is very very rare for me to see a system that doesn't have multi-head redundancy capabilities, and the entire enterprise storage industry is built on top of redundancy and being able to fail over.
Redundancy technologies are some of the biggest investment areas when replacing technology.
So while you are correct that it is expensive, that is just part of the cost you calculate when purchasing IT assets, and any company that doesn't invest in redundancy just doesn't value their data and IT infrastructure because they haven't had the unfortunate experience of a critical failure.
+S0chan redundancy is based upon the required uptime of a service or an agreed SLA. Required investments scale up with the required uptime. I.e., your computer at home is not the same computer as a server in a data center, even if they have the same performance specs. A server could be 10 times more expensive, as different/more expensive parts with a higher MTBF are used and/or are implemented redundantly.
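The MTBF point can be made concrete with a toy availability model (the figures below are hypothetical, purely for illustration; real SLA math also has to account for correlated failures and maintenance windows):

```python
# Toy model: steady-state availability A = MTBF / (MTBF + MTTR),
# comparing a single component against a redundant failover pair.
# Assumes independent failures, which is an idealization.
mtbf_hours = 50_000   # hypothetical mean time between failures
mttr_hours = 8        # hypothetical mean time to repair

a_single = mtbf_hours / (mtbf_hours + mttr_hours)
a_pair = 1 - (1 - a_single) ** 2   # service fails only if both are down

print(f"single: {a_single:.6f}  redundant pair: {a_pair:.10f}")
```

Even with these modest numbers, pairing components turns roughly four nines into eight, which is why the industry leans so heavily on failover pairs.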
For reference, the company I work for advertises "5 9's" uptime, meaning 99.999% uptime. This is accomplished via massive redundancy technologies built in across the board.
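For a sense of scale, "five nines" leaves only about five minutes of downtime per year, which you can check directly:

```python
# Allowed downtime per year at various availability levels.
HOURS_PER_YEAR = 365 * 24  # 8760, ignoring leap years

for nines, availability in [(3, 0.999), (4, 0.9999), (5, 0.99999)]:
    downtime_min = HOURS_PER_YEAR * (1 - availability) * 60
    print(f"{nines} nines ({availability}): "
          f"~{downtime_min:.1f} minutes of downtime per year")
```

Three nines allows almost nine hours a year; five nines allows about 5.3 minutes.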
Could you and Brady talk to each other and compare the security of this place and the BullionVault he was in? I would like to know which is more secure and therefore which is more valued.
Looks very cool 😃
+ben orr We just finished expanding one of our data centers at the location I work, but they've not moved any compute in yet, and the air coming out feels like a refrigerator. Once the compute is put in, it will be around 25 or 26 C, lol.
This guy reminds me of the CEO in the show "IT Crowd"
Could you please do an episode about Software-Defined Networking? I'm a networker but SDN is a concept I just don't get.
It doesn't really look like a high-tech DC... although data centers don't have to look modern to be state of the art. What I didn't see in the vid are secured cages. A lot of DCs have metal cages and visual obstructions on the tech floor to fence important machines and data off from curious people, or even to provide different levels of access per customer. Such infra is required for some businesses, like financial institutions.
+Senne Van Laer you shouldn't expect something this small to be anything like Google's data centre, or even Microsoft's data centre for that matter.
+Herve Shango I've seen many DCs, but not Microsoft's or Google's. Fact is, most DCs are commercial enterprises; fanciness is usually there to convince potential customers.
+Herve Shango 100% true! Google is building a new datacenter in Groningen, here in The Netherlands. It is 3 football fields big. This one is a small DC by any standard. I have been working in IT for almost 20 years now, and as VP-IT for an e-commerce fulfillment company I only sometimes visit DCs, thank God! They are the least exciting element of my work to me. It is noisy (usually much noisier than in this video!), windy, and boring as hell. It is just racks and racks of computers and a few people who un/install servers, connect cables, install OSes and do some pre-config stuff. Nothing fancy or groundbreaking happening there. As soon as a server has been installed, it is usually configured remotely by an engineer, like with Google from Mountain View, California.
The guy in the video who was doing the tour was obviously a mere salesman, dropping the same fancy names like 10 times. The truly interesting tech facts of a DC were not mentioned, like the difference between storage at home (i.e. an external 4 TB HDD, ~130 US$) versus a professional 640 GB Dell storage unit (US $15,000, which weighs ~30 kg and measures 1 by 1 by 1 meter!). When he was explaining the redundancy in power supplies, he forgot to mention that each individual rack usually has at least 1 UPS power supply of its own, on which the rack can run independently for a few hours in case of a power failure. He could also have mentioned that most DCs have 1 or more satellite DCs at different locations where customers can run a mirrored architecture if required, just in case an entire DC is out because of a calamity. He could have said something about the fire protection measures at DCs, which are usually pretty impressive to see. Etc, etc. Also, the location of DCs is sometimes almost laughable. A while ago I worked in Geneva, Switzerland and found a huge datacenter in the basement of the Movenpick Hotel right next to Geneva Airport! Or in Belgium, where the DC for the KPN/Orange telephone company was located in a huge barn in the middle of a forest. Crazy! Maybe Computerphile should do another visit and get a true insight into a DC.
Pietje Puk I love the way marketing people come up with what they think are impressive names for things. You're a Vice President of the IT department of an e-commerce fulfilment company? Do you 3 dimensionally restructure and reshape your external leisure topography with a multi purpose, composite construction, manually operated, slicing and leverage fulfilling earth moving implement? Or do you just dig the garden with a spade like the rest of us? 😉
To those who are beakin' the guy speaking: realize that he is likely the representative for clients, so jargon is a big no-no. Yes, we're all so smart and call them server racks and clusters, but this tour is meant for the layman, likely a high-level executive who doesn't know anything about IT as in-house infrastructure and is looking at outsourcing or SaaS solutions.
I would love to work there
really impressive facility
Why does medical research need such high performance computing? Are they making that many requests at the same time?
There are some distributed computing projects where people donate their GPU/CPU time for science, and it puzzles me how much time each little work unit takes compared to whatever consumer programs do, and how much science actually gets done with that one work unit.
But can It run GTA 5 on PC at Ultra,4K with 60FPS ?
+NightcoreTKFF If they were full of Nvidia K80s, heck yeah! And actually, if those really are "compute" machines, they might actually have GPUs in them to do number crunching (GPUs are way faster than CPUs at math).
This guy reminds me of Jen from the IT Crowd.
+Marfelt He's much more like Douglas Reynholm!
such immense machinery
holy crap, 6 MW. Dare I ask how much that costs to run?
+PhazonSouffle It's 16 MW, which is enough power for a small data center.
+PhazonSouffle At the 16 MW figure the other guy mentioned, it would cost them roughly 2880 USD (give or take 1000 depending on where it's located) per hour to run. This amounts to around $25,228,800 a year. Of course, I'm using consumer prices for the US that I found online. These people are located in the UK, so their prices likely differ. They might also get bulk discounts or similar. Also, keep in mind that having servers hosted at a data center is VERY expensive for the client. They are probably making a LOT more than their power costs yearly.
Sasha Wolf Europe's electricity is 3-5 times more expensive than in the US.
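The arithmetic in the thread above checks out; a quick sketch makes the assumptions explicit (the 16 MW draw and the flat $0.18/kWh consumer rate are the commenters' guesses, not figures from the video):

```python
# Back-of-envelope check of the annual power bill quoted above.
# Assumed, not from the video: a steady 16 MW draw and a flat
# US consumer rate of $0.18/kWh. Real UK industrial rates differ.

POWER_MW = 16
RATE_USD_PER_KWH = 0.18
HOURS_PER_YEAR = 24 * 365  # 8760

hourly_cost = POWER_MW * 1000 * RATE_USD_PER_KWH  # kW times $/kWh
annual_cost = hourly_cost * HOURS_PER_YEAR

print(f"${hourly_cost:,.0f} per hour")   # $2,880 per hour
print(f"${annual_cost:,.0f} per year")   # $25,228,800 per year
```

Which is exactly the ~$2880/hour and ~$25.2M/year quoted above, so the only real uncertainty is the per-kWh rate.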
The Science must not stop.
Wait, is this in Slough? lol.
Why does it have to run continuously? I can unplug my USB drive and the info will be safe forever.
that noise would crack me up
+Dave Birney This video does nothing to capture just how loud a data center can be.
+Dave Birney people working in there for extended periods wear hearing protection, of course. The noise levels were actually filtered by the glass dividers, the screens around the cabinets, and the wall and ceiling material (all or some of which may need to be removed during maintenance and repair work).
+Dave Birney That first area they go into with the really high-pitched sound is what my work's data center sounds like. It's most likely the fans for the disks just going at full power. It really isn't that bad: at first it's weird, but once you're there for a while you just get numb to it, especially if it's late at night and you're tired.
He reminds me of the Illusive Man from the Mass Effect series...
Do they ever clean that room? How?
+Yuannan Lin Cool! Thank you for the info! :)
+Yuannan Lin I would have thought they could fire you just for thinking about food in there, let alone bringing food in there.
+Yuannan Lin How about not bringing food or drink into a server room? That seems like common sense.
+Matthew Mitchell In some places they will do a bag check. I work for a structured cabling company and we do a lot of work for NTT Docomo. Whenever we go into their data centre, they will always do a bag check before we're allowed inside.
+Yuannan Lin Well, you are wrong then. In fact, this specific site has two cleaning technicians from 8 Solutions on site every weekday, buffing the floors throughout the building. The dust in these rooms builds up alarmingly quickly even when proper maintenance schedules are followed with the CRAC units.
If I was a client of this company I would not be happy with this chap explaining where my machine was physically located.
It's all well and good but...
...Can it run crysis?
Amazing!
Data is the bread, gold, currency, and DNA of the future. We're on a trajectory that will take us from data being a loyal servant to data being the master.
James Bond has had a bit of a career change.
wow, you got the sales drone / pointy-haired boss. I've had dentist visits that were more exciting than watching this video. I think the janitor would have made a more interesting interview subject than the "Data Centre Client Director at Infinity" (got that off LinkedIn).
Ugh. I don't know why but the word IT irritates me.
+Tom Nicklin (Shmink) "Cloud", "App", "An Android", "An Apple", "SSD Hard Drive", "Smartphone"
The list goes on, and as such makes me more depressed as it does so.
I'm not sure what your point is friend.
+ipullstuffapart Yeah, I guess. I just hate the umbrella term IT. It doesn't really describe anything now. It's like saying there is weather outside. Well, yeah, of course there is, but is it sunny, rainy, etc.?
+Tom Nicklin (Shmink) Enterprise. Enterprise. Enterprise.
I get that though because they are still applications at the end of the day.
Did that guy just say IT? Or did I miss that? :)
Awesome.
1:57 He just missed Tom Cruise slipping past.
Our university has its own server building, and a supercomputer that companies can hire.
+samramdebest The school I go to recently built a new data center in the new library building. They wanted the building to be energy efficient, so they planned for the data center to heat the building during the winter. Well, it turned out that my school, UVU, bought new energy-efficient servers for the new data center, so it wasn't able to produce enough heat for the rest of the building. ^,^
+Austin Harsh It would still be able to displace heating load in proportion to its energy use.
What he's trying to say is that they're dedicated servers connected to the internet.
Seriously, a power outage that takes down a data center? How long must this have lasted? 24 hours or so? I only know of data centers that have their own backup batteries, which need to last until a big diesel generator has started up; then it all runs independently for as long as the diesel fuel lasts (or, when it gets constantly refilled, potentially indefinitely). Some of the more modern facilities are also equipped with solar panels on the roof that can supply part of the needed power (all going through huge battery buffers, of course). Others have hydrogen fuel cells. I don't believe a power outage would easily take down a data center that's well built. But then, you guys also drive on the wrong side of the road, so anything is possible ;-)
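The battery-then-generator handover described above is easy to sanity-check with a toy calculation (all numbers here are illustrative, not from the video or this facility):

```python
# Rough sketch of why a grid outage shouldn't kill a well-built DC:
# the UPS batteries only need to bridge the gap until the diesel
# generators spin up and take the load (typically under a minute).
# The battery capacity and load below are hypothetical examples.

def bridge_time_minutes(battery_kwh: float, load_kw: float) -> float:
    """Minutes the UPS bank can carry the load on its own."""
    return battery_kwh / load_kw * 60

# Hypothetical: a 2,000 kWh battery bank carrying a 6,000 kW load
minutes = bridge_time_minutes(battery_kwh=2000, load_kw=6000)
print(f"{minutes:.0f} minutes of battery runtime")  # 20 minutes
```

Even with those made-up numbers the buffer is tens of minutes against a generator start time of seconds, which is why outages at well-run sites usually trace back to failed transfer switches or empty fuel tanks rather than the grid itself.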
I live about a two minutes walk from slough trading estate. Really weird watching this video lol
basically "over there...power, over there medical research, bit further to the right general research and everything else is just an overpowered computer that doesn't ever reach its full potential"
Impressive
This guy is allergic to the word 'Server' =|
Oh please come to Canada so I can take you on a tour of a real Data Centre.
6:23 he said IT technology ... that means Information Technology technology xD
Everything I know about Slough is from the Office.
I think this video gave me IT-itis (pronounce I-Tittis :-))
Can we please have more videos of people who know what they're talking about, and fewer videos of sales drones?
+Troels Jacob Ringsmose Feddersen IT-titties?
one point twenty-one gigawatts! amazing!
the job is not data,it is environment control.
I was once responsible for the design and implementation of a server NOC. Went all out for redundancy - redundant power out the wazoo. Its undoing, even though I'd recommended it, was not hosting a DNS zone copy. Oops.
Nice.
It's great that all of this is being introduced to us by someone taking time out from his regular job as a waiter at a cocktail bar. Seriously, there is nothing wrong with a tie. Great stuff, though.
is that the main character from Fallout 4?
It is the nicest "looking" one I have seen.
"Medical research?" "H.R. management?" "Economics research?" "Academic research"? Then why all the high security?
The iPhone's fingerprint sensor has shown that even this vein reader probably can be tricked.
My desktop computer is noisier than that during the summer
Now try doing that with Amazon.
But can you play Minecraft on this?
That's a lot of IT.
and I thought our server room was neat ....
Pod-people… LOL.
Interesting though, that they've got a wardrobe full of infinity at 4:30.
Question- if you mined cryptocurrency with that setup would it make a profit?
+MagikGimp Going on an average value, if such a thing is plausible of course.
I bet it's possible to trick the fingerprint reader