Male Speaker 1: A data centre's the brains of the internet.
Male Speaker 2: The engine of the internet.
Female Speaker 1: It is a giant building with a lot of power, a lot of cooling and a lot of computers.
Male Speaker 3: It's row, upon row, upon row of machines, all working together to provide the services that make Google function.
Joe Kava: I love building and operating data centres. I'm Joe Kava, vice-president of data centres at Google. I'm responsible for managing the teams globally that design, build and operate Google's data centres. We're also responsible for the environmental health and safety, sustainability and carbon offsets for our data centres. This data centre, here in South Carolina, is one node in a larger network of data centres all over the world. Of all the employees at Google, a very, very small percentage of those employees are authorised to even enter a data centre campus. The men and women who run these data centres and keep them up 24 hours a day, seven days a week, they are incredibly passionate about what they're doing.
Male Speaker 2: In layman's terms, what do I do here?
Female Speaker 1: I typically refer to myself as the herder of cats.
Male Speaker 4: I'm an engineer.
Male Speaker 3: Hardware site operations manager.
Male Speaker 2: We keep the lights on.
Male Speaker 1: And we enjoy doing it.
Joe Kava: And they work very hard, so we like to provide them with a fun environment where they can also play hard as well.
Female Speaker 2: We just went past the three-million-man-hour mark for zero lost-time incidents. Three million man-hours is a really long time, and with the number of people we have on site, that is an amazing accomplishment.
Joe Kava: I think that the Google data centres really can offer a level of security that almost no other company can match. We have an information security team that is truly second to none. You have the expression, "they wrote the book on that."
Well, there are many of our information security team members who really have written the books on best practices in information security. Protecting the security and the privacy of our users' information is our foremost design criterion. We use progressively higher levels of security the closer you get to the centre of the campus. So, just to enter this campus, my badge had to be on a pre-authorised access list. Then, to come into the building, that was another level of security. To get into the secure corridor that leads to the data centre, that's a higher level of security. And the data centre and the networking rooms have the highest level of security. And the technologies that we use are different. For instance, in our highest-level areas, we even use underfloor intrusion detection via laser beams.
So, I'm going to demonstrate going into the secure corridor now. One, my badge has to be on the authorised list. And then two, I use a biometric iris scanner to verify that it truly is me.
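The layered access model described here (campus badge list, building, secure corridor, data centre floor, with a biometric check for the innermost zones) can be pictured as a series of checks that must all pass before a door opens. Below is a minimal sketch in Python; the zone names, badge IDs and the `iris_match` stand-in are all hypothetical and do not reflect any real access-control system.

```python
# Hypothetical illustration of layered physical access control.
# Each zone adds its own check on top of the checks for the outer zones.

ZONE_ACCESS_LISTS = {
    "campus": {"badge-1001", "badge-1002"},
    "building": {"badge-1001"},
    "secure_corridor": {"badge-1001"},
    "data_centre_floor": {"badge-1001"},
}

# Zones ordered from outermost to innermost; a biometric check is
# only required for the innermost, highest-security zones.
ZONES = ["campus", "building", "secure_corridor", "data_centre_floor"]
BIOMETRIC_ZONES = {"secure_corridor", "data_centre_floor"}


def iris_match(badge_id: str) -> bool:
    """Stand-in for an iris scanner verifying the badge holder's identity."""
    return badge_id == "badge-1001"  # placeholder result


def may_enter(badge_id: str, target_zone: str) -> bool:
    """A badge must clear every outer zone before reaching the target zone."""
    for zone in ZONES:
        if badge_id not in ZONE_ACCESS_LISTS[zone]:
            return False
        if zone in BIOMETRIC_ZONES and not iris_match(badge_id):
            return False
        if zone == target_zone:
            return True
    return False


print(may_enter("badge-1001", "data_centre_floor"))  # True
print(may_enter("badge-1002", "building"))           # False: cleared for campus only
```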
Joe Kava: OK, here we are on the data centre floor. The first thing that I notice is that it's a little warm in here. It's about 80 degrees Fahrenheit. Google runs our data centres warmer than most because it helps with the efficiency. You'll notice that we have overhead power distribution. Coming from the yard outside, we bring in the high-voltage power and distribute it across the bus bars to all of the customised bus taps, which are basically plugs where we plug in all the extension cords. Our racks don't really look like traditional server racks. These are custom designed and built for Google so that we can optimise the servers for hyper-efficiency and high-performance computing.
It's true that sometimes drives fail, and we have to replace or upgrade them because maybe they're no longer efficient to run. We have a very thorough end-to-end chain-of-custody process for managing those drives, from the time they're checked out from the server until they're brought to an ultra-secure cage, where they're erased and, if necessary, crushed. Any drive that can't be verified as 100% clean, we crush first and then take to an industrial wood chipper, where it's shredded into these little pieces like this.
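The disposition rule described here, erase, verify, and physically destroy anything that cannot be verified 100% clean, boils down to a simple decision flow. Here is a rough sketch in Python with invented statuses and helper names; the real chain-of-custody tooling is not public, so treat this purely as an illustration.

```python
# Hypothetical sketch of the end-of-life decision flow for a drive
# tracked under chain of custody: erase, verify, destroy if in doubt.

from dataclasses import dataclass


@dataclass
class Drive:
    serial: str
    erased: bool = False
    verified_clean: bool = False


def verify(drive: Drive) -> bool:
    """Stand-in for a full read-back verification of the erased drive."""
    return drive.erased and not drive.serial.endswith("x")  # placeholder rule


def process_drive(drive: Drive) -> str:
    """Return the final disposition recorded for the drive."""
    drive.erased = True                  # wipe in the secure cage
    drive.verified_clean = verify(drive)

    if drive.verified_clean:
        return "released"                # verified 100% clean
    # Anything that cannot be verified is crushed, then shredded.
    return "crushed_and_shredded"


print(process_drive(Drive("SN-0001")))   # released
print(process_drive(Drive("SN-000x")))   # crushed_and_shredded
```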
Joe Kava: In the time that I've been at Google – for almost six and a half years now – we have changed our cooling technologies at least five times. Most data centres have air-conditioning units along the perimeter walls that force cold air under the floor; it then rises up in front of the servers and cools them. With our solution, we take the server racks and butt them right up against our air-conditioning unit, and we just use cool water flowing through those copper coils that you see there. So the hot air from the servers is contained in that hot aisle. It rises up and passes across those coils, where the heat from the air transfers to the water in those coils. That warm water is then brought outside the data centre to our cooling plant, where it is cooled down through our cooling towers and returned to the data centre. And that process is just repeated over and over again.
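The water loop described above is ordinary sensible heat transfer: the water warms by some temperature difference as it passes over the coils, carries that heat to the cooling plant, and comes back cooled. A back-of-the-envelope sketch in Python, using made-up flow and temperature numbers (not Google's figures), shows how much heat a given water flow can remove.

```python
# Rough sensible-heat estimate for a water cooling loop:
# heat removed = mass flow rate * specific heat of water * temperature rise.

SPECIFIC_HEAT_WATER = 4186.0   # J/(kg*K)


def heat_removed_kw(flow_litres_per_s: float, delta_t_c: float) -> float:
    """Heat carried away by the loop, in kilowatts (1 L of water ~ 1 kg)."""
    mass_flow_kg_s = flow_litres_per_s          # ~1 kg per litre of water
    watts = mass_flow_kg_s * SPECIFIC_HEAT_WATER * delta_t_c
    return watts / 1000.0


# Hypothetical numbers: 10 L/s of water warming by 6 degrees C across the
# coils carries roughly 250 kW of server heat out to the cooling towers.
print(round(heat_removed_kw(10.0, 6.0)))  # ~251
```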
Joe Kava: To me, the thing that amazes me about Google and the data centres is the pace of innovation and always challenging the way we're doing things. So, when people say that innovation in a certain area is over, that we've kind of reached the pinnacle of what can be achieved, I just laugh.
[music playing]