Glass Cutting Basics for Picture Framing



OK, today on Repairs101 we've got this piece of artwork here that has an unfortunate crack in the glass. So I'm going to remove the glass and cut a new piece, because it's a custom frame with a unique size that you can't just buy off the shelf. Here's a handful of framing tools that we'll be using, and my framing supplies toolbox, sitting on top of the replacement piece of glass we're going to be using, which is UV glass. As you can see the artwork is quite faded, so we're going to put on a piece of UV glass and give it that little bit of extra protection from the light that is deteriorating the quality of this image.
So being able to cut glass is a great skill to have. The tools are inexpensive, and you'll have a lot of opportunities to use it around the home if you're a handyperson. You might have broken window panes in your home that you need to replace, you might have a broken mirror that needs to be replaced or reframed, or, like this, a piece of broken artwork.
Now first up I'd say you may want to wear gloves. I'm going to do this barehanded, but I recommend that you wear gloves when you do this. And for sure you're going to wear your safety glasses.
Although there are a lot of more expensive options, the most basic tool you need to do this job is this little thing right here. I'll show you that. There we go. As you can see, it's got a wheel in the end of it made of high carbon steel, and that's what does all the work. These notches right here are there as holds, so you can grab on to a piece of glass and break it off like that. And of course it has a very nice ergonomic handle, and there's a little ball on the end that's used for tapping the glass along the scribe that you've made in order to ensure a break.
Another tool you're going to need is the glass cut-running pliers, also known as glazier's pliers or glass pliers. This is the way they work: as you scribe along the top of the piece of glass, you get this underneath it, and it levers it together and separates the glass along your scribe.
The original piece of glass is considerably smaller than the replacement piece that we've bought for it, so it's going to need to be cut down. It also has all the corners nipped off of it. We'll just take a Sharpie and trace it; obviously it's just going to give me a rough outline of where I need to cut.
OK, so in the spirit of measuring twice and cutting once, I'm going to take a quick measurement and we'll start measuring. Just take a look at this. And we'll measure this one again; of course, just take two measurements. So the tracing is good, and I'm going to follow it as my guideline. The first thing I'm going to cut is the short edge, because the shorter the run you're cutting, the better off you are.
We're going to take a brand new Fletcher glass cutter – the gold tip type, with the breaker ball on the end, which I'm not sure I'm going to need. I'm going to apply a tiny drop of tenacious oil, which is a very, very thick chain oil quite similar to gear oil. Say an eighty or ninety weight gear oil; if that's what you've got, that's what I'd recommend. Anyway, that's just to make sure that the wheel rolls nice and easily across the surface.
Line it up right the first time. There we go. OK, so I've got it lined up to my liking – maybe I'll add one more clamp to hold that in place and make sure it doesn't move when I make my cut.
OK, start beyond it, go right up against the ruler, which you're using as a fence, and get started.
[Glass cutting]
And go right to the end and over the edge.
[Glass cutting]
OK, and that's all there is to it. Then the last thing is to take your cut-running pliers and snap off the end. You want to line them up right on the cut, and then you see it just breaks like magic. Here we go – we line it up just like that, and there you go. You see that? No cuts, nothing to worry about, but I recommend that you wear gloves. Then you just give it a little gentle squeeze and, as you can see, it pops off clean and right into my hand.
This time I'm going to use this much bigger set-square, because unfortunately my favourite little steel ruler isn't long enough – it's eighteen inches and we need to go across eighteen and a half. And you must score from edge to edge. You cannot start partway down and hope that it all works out OK; that's just not going to happen. And here we go.
[Glass cutting]
Now, if for whatever reason you can't get your hands on a pair of cut-running pliers, there's a civilized solution, and it's not using your hands or two pairs of ordinary pliers. Don't use your hands, even with gloves on. What you really want to do is just find yourself a block of wood, something like this. What we're going to do is rip a channel down the middle of it to accommodate the edge of the glass, and I'll just show you how to do that real quick.
OK, you get your fence set up and just lower the saw a little bit.
[Table saw rips]
So of course the channel in my block of wood acts exactly the same way as the channels in this tool. Insert the glass and snap it off – although in this case you'd generally just nibble off little bits with this edge of the tool. When you need to make a break and you don't have cut-running pliers, it's a good idea to cut yourself something like this and use it to keep your hand from coming in contact with the glass. It will also help you apply even pressure along the whole length of the score, as opposed to just where my thumbs are. If that were the glass and I was trying to break it with my hands, that's just a recipe for disaster – you're going to end up with some really severe cuts.
So we're going to take this piece of wood that we cut a channel into and use it, instead of my hand or a pair of pliers or something silly like that, to provide nice even pressure across a much larger surface area than my fingers, my hands or another pair of pliers would. Make sure you're wearing your safety glasses when you do this, and I would still recommend gloves to most people. Then just a little twist of the wrist and it breaks off nice and clean, nice and safe. It pops off like that – piece of cake and very safe, with no contact at all with the glass that I'm breaking off.
OK, so what's left is nicking off these corners. I don't think you need a clamp for this. Just start right there and etch, and she popped right off for me. So we'll just take a line with our etcher – our glass cutter. Look at that. That one again came right off with just the little bit of pressure I was putting on it with the ruler. Alright. And again, a nice clean cut – popped right off. OK, that's snug. Oh yeah, right in. I'll just clean this up on both sides, put the artwork back together, and we're all finished.
So what I've done is put some tape here to replace all this old tape from the nineteen-seventies that dried up and is no longer holding the piece in place. Art conservators will be having heart attacks right now because this is certainly not acid-free paper, but it's the paper that was in there all along, and it's a historical document. It's a time capsule, so I'm going to preserve that time capsule.
And last but not least, I have the original nails here, but I'm going to replace them with a much more modern and convenient solution, which are these little fellows right here. I call them stars because they're kind of star-shaped.
We're going to use these little staples, which are driven in by this interesting little driver right here. You just get in behind them like that and you push them in …
That's what the staple looks like. As you can see, it's got raised edges for pushing on. The tool is this thing right here – it's made by the Fletcher company again – and all you do is load one in like that. You want to be very careful not to push down at all, because you'll shatter the glass. You need to push only parallel to the glass, into the wood, not down on the glass at all.
I'm going to try to centre it and then slide it on in like that. Very careful not to push down – only to push across. And there it is.
OK, as usual my product is by 3M, and it is Durapore fabric tape. So just attach that along like that, then bring it in and make a nice dust cover while still preserving the look of the original picture-frame maker's work.
Now, as you can see, in no time flat I've been able to restore something that was otherwise relegated to a storage room, where it was going to collect dust for the next decade or two before somebody finally got tired of looking at it and threw it out. And that would be a real shame, because it's really a beautiful piece of artwork. Now it can go back to its rightful owner, and they can enjoy it on their wall for years to come.
OK, so I like to put the glass on a piece of cardboard, as you see, on top of my workbench. I bring it right to the edge here; I'm going to be cutting across there. I'll take a steel ruler and align it perfectly across the edge so I have an exactly parallel cut. I'm going to take my gluing clamps and place them across the bottom so that the ruler can't move when I'm etching my line.
I'm going to put a drop of tenacious oil on the wheel every time I use it, because it clogs up with glass debris and you want it running freely in order to make the cut.
Now listen for this sound – it's critical that you make the same noise, so you know you're making a good score all the way across.
[Glass cutting]
Listen for that sound.
You probably want to wear gloves to do this. So just line that up.
There we go.

Fuji Guys – FUJIFILM XP140 – Top Features



Welcome back to the Fuji Guys channel; my name is Gord. It doesn't often snow here in Western Canada, but when it does, that's a great day to take along a FinePix XP140. In addition to being waterproof to 25m under water, it's also rated down to -10 degrees centigrade, and it's drop proof and dust proof. So in this video I'm going to first go inside and warm up, and then I'm going to take a look at some of the top features of this camera. If that interests you, by all means keep on watching.
One thing that makes taking photos with the XP140 really easy is the Auto SR mode, or automatic Scene Recognition. The camera will analyze the scene and choose one of 58 different scene positions; this optimizes the settings for that particular scene. If you have people in your picture, then the subjects' eyes will be in proper focus every time. When you first power the camera on and it's in the Auto SR+ mode, you then have the option of subject tracking. If you have faces in your photo,
pushing that button will alternate and rotate through the various people that are in the picture; at that point you can get exactly the face that you want in proper focus. If there are no faces in the photo, then once you lock on to the subject, no matter where it ends up moving through the frame, the camera will automatically stay with it and keep the focus tracked directly on that main subject.
There are other modes that you can use in addition to Auto SR. When you push the menu button you can go into shooting mode, and from there you have a choice of standard program, multiple exposure, some creative filters, as well as scene positions like sport, landscape or night. Make your choice, hit the menu button, and now you're ready to take photos again.
The XP140 also has face and eye detection. This helps to improve portraits: it means people's faces will always be in nice sharp focus, and there are a few options you have within there. When I power on the camera, because I know I'm going to be taking a portrait of someone, I'm going to use the portrait mode. So I push the menu button and then shooting mode, and I choose portrait. From there I've got my choice of whether I want to have the face or the eyes in focus, and then which one. So I push the menu button again, drop down to the AF/MF setting, and then move over to the face detection and eye detection settings. From there you have a few choices: face detection on with eye detection turned off; automatic, where the camera will decide which of the two eyes is closer to the camera and focus on that one; or you can purposely make it pick only the right eye or only the left eye. When you're in certain modes you'll notice that FACE ON/OFF is disabled. This is because normally, when you're taking pictures of people, you want to have the face in nice sharp focus. If you have the scene position set to landscape, for example, you won't have eye or face detection enabled, and at that point FACE OFF and EYE OFF will be enabled on the menu screen. Once you've made your choice, hit the menu button, go back, and take your great portrait photo.
The XP140 offers seamless Bluetooth and Wi-Fi connections, so you can very quickly and easily transfer images from your camera to your smart device, as well as transfer geotagging information from your smart device over to your camera. When you first power the camera on, it walks you through the steps. If you miss that, or want to pair another device, you go into the menu, go down to the connection setting on page 2 of the setup menu, and then you have the choice of your Bluetooth settings. Go in there and select pairing registration. If you don't already have the free Camera Remote app installed on your smartphone, there's a QR code on the back of the camera; scan that and the smartphone will automatically take you to the correct location. The smartphone app is due for an upgrade about the same time that the XP140 becomes available, so what you see on my screen might be a little different from what the app actually looks like. So I start the app on my phone, choose pairing registration, and the camera will do a little bit of communicating. I see my camera listed on my smart device, I tap on that, it does some further communication, and we've got it connected. On the back of my camera it will now ask whether I want to set the date and time from my smartphone. I always like to say yes. The reason is that when I'm traveling, the camera will automatically update the time from whatever time my smartphone has.
Let's take a look at some of the options we have within the Bluetooth menu on the camera. Again, going into the connection settings and then Bluetooth settings, we can, if we want, delete that pairing registration. You can actually have up to eight different pairing registrations, so you can have eight different devices all paired, though you have to choose which device is connected at any one time. If you ever lose your device, or you want to delete it from the list, you can delete it from here. You can turn Bluetooth on and off if you want to, and there's auto image tagging, where you can have the camera automatically tag images to be transferred over to your smartphone. What I like to have instead is the seamless transfer: as soon as I take a photo, it will automatically transfer the image over to my smartphone. So let's turn that on for the time being. I also have the smartphone sync setting on the second page, and what that enables me to do is set the date and time of my phone over to my camera if I want to, or I can have the location transferred over, or, my favourite choice, have location and time. At that point, every five minutes or so, the camera, using low-power Bluetooth technology, will automatically update the location as well as the time. This is great when I'm traveling, allowing me to know exactly where my photos were taken, as far as which waterfall or which fountain or which museum I was in.
So let's go back and take a photo of my friend here. Over on my smartphone I have a few different options within there. I can have a remote release: when I push the trigger button, it will automatically take a photo. I also have the option for remote control; after a few seconds it will populate the information through here the first time around. I do have to connect it up to allow it to talk to the network. On my camera now, the first time out, it wants to connect, and I just want to make sure that I'm talking to the right device so my devices don't get hijacked. Now on my screen I can actually see live view, I can control the camera from my device, and I can take photos from my device. If I want, I can push playback, and now I've got all these images that are currently on the camera; I can pull them over one at a time, or I can select all of them and import them all. I have a few controls on my smartphone as well; for example, if I wanted to, I can start recording a movie. So those are some of the different options you have using Bluetooth and the free remote app to connect up to your camera and talk directly to it.
The XP140 features image stabilization. It's sensor-based, and it helps to counteract shake if your hands are trembling or if you're trying to take a photo in the wind and you're swaying ever so slightly. There are a few different options within there as far as the different settings for image stabilization. If you push the menu button and go to the second screen, down at the bottom is where you'll see the image stabilization mode. From here you have a few choices, including continuous and shooting. With continuous, the image stabilization will always be on, so even when you're composing the image, the image stabilization is active. If you choose shooting, the image stabilization only kicks in when you press the shutter button. There's also OFF, depending on the scene mode that you happen to have the camera in. Why would you want to turn image stabilization off? Well, if you ever have the camera mounted directly onto something that's very stable, you want to turn it off. Otherwise the camera will go looking for movement and inadvertently create a little bit of a feedback loop. You wouldn't want that to happen.
The camera also has a five-times optical zoom. It starts at 28mm, which is a fairly wide angle that allows you to take landscape shots. When you press the T (telephoto) button, it will zoom in more and more on your subject; the W button makes it more wide-angle. Up in the top left-hand corner you can see a scale showing where you are within the zoom range. There's also an intelligent digital zoom; you need to enable that first to be able to use the extra zoom power of the camera, and it's located on the third screen of the shooting menu. Once you've turned that on, you'll notice that the zoom scale in the top left-hand corner of your screen now has a blue area as well as a clear or black area. What happens is you first zoom through the entire optical range, then let your finger off the T button, and then you can push it again to get into the additional digital zoom. When you're going back to wide-angle, you again need to let go of the W button first to be able to zoom out through the entire range. So that offers you a really nice wide range of shooting options when you're looking at zooming on your XP140.
The XP140 can record movies, including 4K at 15 frames per second, Full HD at up to 60 frames per second, or even 720p. There's also the option, in Full HD or 720p, of square movies – kind of fun for Instagram. There are a few choices you have within there. To record a movie, all you have to do is push the very top button; there's a dedicated movie record button. Press it once to start, press it a second time to stop. Let's take a look at setting up the various menu options when it comes to movies. Pressing the menu button and going to the second page of the shooting menu is where you'll find the movie setup. On the top line you'll see the movie mode; this is the resolution and the aspect ratio – whether you want to have it in Full HD at 16 by 9 or square at one to one. After you make your choice you can then go to high-speed recording; I'll come back to that in just a moment. There's also the focus mode: depending on the mode you have the camera set to, you'll have choices in here. If you have it set to scene recognition, it will automatically figure out the appropriate focus mode to be in, but otherwise you have a choice of either continuous AF or single AF. There's also the wind filter: if you are in windy outdoor situations, you probably want to turn that on. It helps to cut down on the wind noise that you sometimes hear in microphones. Going back to high-speed video modes, you can record up to four-times high-speed movies, so when you play them back they're up to four times slower than they normally would be. You first have to turn that on in order to start the high-speed movie, but let's go back to our standard movie. Once I've made my choices, I just hit the DISP/BACK button to go back, then press the video button on the top of the camera to start the recording. Press it again to stop. Easy as that.
The XP140 features an interval timer, and you can create the resulting movies in camera from those series of images. There are a few choices you have in there as far as the resolution you can record in: 4K at 15 frames per second, or Full HD at up to 60 frames per second, from the images that you capture in the camera. There are a couple of steps you need to take to make that happen. First, you go into the menu and on to the second page, where there's the time-lapse movie mode. Here's where you choose beforehand what the resulting resolution of the movie will be. So for this instance let's choose 1080p at 59.94. Then I go into the interval timer shooting mode, and here is where I want to choose what the interval is between each image capture. If the interval is more than a couple of seconds, the camera will power down in between the various exposures; this helps to save battery power. Just before it's ready to capture the next image, it powers back up again, captures the image, and then goes back into power-saving mode. In this particular case I'm going to choose just a couple of seconds for demonstration purposes – five seconds in between, allowing me to move my little friend here. Then there's the number of shots: I need to choose how many pictures the camera will take. For this instance, let's take 10 images.
Then I hit OK and choose whether I just want to save a series of still images – these are the full-resolution images – or whether I want the camera to create a time-lapse movie from within there. Let's create a time-lapse movie. I hit OK and choose whether I want to start it now or in a few minutes, or an hour or two from now; I can start up to 24 hours later. I push the shutter button to start things happening. Let's set it to zero so we're all ready to go, and let me take my first shot. I'm going to move my little friend a little bit just so we capture some nice images here. I am going to capture a total of 10 images; each time around I'm going to move him ever so slightly, and then I'm going to bring my other little friend in here so he starts to show up. The camera does a countdown, and also a count up of how many images it has captured each time around. A couple more to go here.
It's now captured all the images, and it's doing a little bit of processing work to put it all together. Now I have my resulting movie showing all the different things we captured. Again, if you're going to make a time-lapse movie, you probably want a lower frame rate; the reason is that with a lower frame rate it won't play back nearly as jumpy or as quickly. If you are going to go with the higher frame rates, you want either a little bit less movement or a lot more movement in between each shot, otherwise it doesn't look quite right. It might take a little bit of experimenting for you to get the various settings right – the right frame rate and the right interval – but it can be a lot of fun. One thing we do recommend: whenever you're going to be doing interval shooting, put the camera on a tripod or on something solid, so you don't inadvertently get some camera movement in there. That's how you can quickly and fairly easily make interval time-lapse movies that are quite a bit of fun with your XP140.
The XP140 has a couple of different burst or continuous modes. This is great for capturing action like people jumping off diving boards or going over ski jumps. You can capture up to 10 frames per second at full resolution, or up to 15 frames per second at 4K resolution; the resulting images are about 8 megapixels, which will still print very nicely as well as display on your screens. Here's how you get into the different burst modes on the XP140. On the back of the camera you'll find a dedicated burst mode button. Pressing it the first time will jump you into high-speed mode, and you have a choice of 10 frames per second, 5 frames per second or 3 frames per second. Depending on the mode you have the camera set to, you may not be able to jump to 4K resolution; I suggest you put it into P mode first and then you can get into 4K. Pressing the button again will drop down to 4K burst mode. Now when you press the main shutter button it'll capture 15 frames during one second at 4K resolution, and you'll be able to play those back on your TV set or share those images directly and immediately. That's a great way to capture exactly the point of action that you're looking for with the XP140.
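As a quick sanity check on the "about 8 megapixels" figure, here is a tiny sketch of my own (not from the video): a 4K UHD frame of 3840 x 2160 pixels works out to roughly 8.3 million pixels.

```python
# Approximate pixel count of a 4K UHD burst frame (illustrative).
width, height = 3840, 2160           # 4K UHD frame dimensions
megapixels = width * height / 1e6    # pixels expressed in millions
print(f"{width} x {height} = {megapixels:.1f} MP")   # -> 3840 x 2160 = 8.3 MP
```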
The XP140 has a few different self-timer modes, allowing you to either get into the frame yourself or wait until your subject is smiling before it starts taking photos. Here's how you get into those various modes. Press down on the control ring and you've got the self-timer mode. You can choose two seconds; this is great if you have the camera on a tripod and you want to make sure there's no camera shake. When you press the shutter button, the camera will wait two seconds and then take the picture. Alternatively there's ten seconds, which allows you time to get into the frame yourself. There's also face auto shutter: as soon as the camera recognizes a face, it will start taking photos. There's also smile: the camera will wait until the subject is smiling before it starts taking photos. There's also buddy, where you have three different choices for how close the two people in the picture are – near, close up or super close. There's also group shot, where you can choose between one and four people, and the camera will wait until it recognizes the specified number of people before it starts taking the photos. Let's try face auto shutter. Press the MENU/OK button to confirm your choice, and now the camera will wait until it sees a face; as soon as it sees a face it'll start taking photos, and it will continue to take photos until you press the DISP/BACK button.
Those are just some of the features found on the FinePix XP140; I hope you enjoyed this and found out a few things about your camera. If you have any questions about this video, feel free to leave them in the comment section below. Subscribe to our YouTube channel and you'll be notified whenever there are new videos posted. You can follow us on Twitter @fujiguys, and look for us on Facebook as well as Instagram. Until next time, I'm Gord of the Fuji Guys – thanks for watching!

IMPROVE IMAGE SHOT ON RED DIGITAL CAMERA BETTER! / IPP2 color science / Workflow in Davinci Resolve



Hello guys, my name is Hugo. Today let's talk about IPP2. This is going to be a little bit technical, and it's for RED users only. So what is IPP2? IPP2 is the Image Processing Pipeline 2 – basically, it's a color pipeline for RED cameras. Many music videos, commercials, films and documentaries are shot on RED cameras; however, for some reason, editors and colorists don't take advantage of IPP2. I know it's a relatively new color pipeline, but it's been out since 2017 and still not many people actually know about it.

I think IPP2 is actually what convinced me to become a RED owner. The thing is that RED didn't seem to provide as much dynamic range and highlight roll-off as you get from an Arri Alexa – but that was before I took advantage of IPP2. For instance, an image from the Helium sensor using IPP2 looks amazing: it has very high dynamic range and the highlight roll-off is very pleasant. This pipeline preserves more information and more detail, and handles challenging colors better – things like neon lights, color mixes and contrasty mixed lighting. And even if you don't have a Helium or Monstro sensor, it still works on any other camera in the RED lineup. IPP2 will definitely make your image look better, but you need to set it up in post, because by default the clip is set to legacy color science. Here is how you can change it.

I use DaVinci Resolve for this. When you go to the color tab and open the raw settings, you can change the clip settings; the color science is usually set by default to original or version 2, and you need to change it to IPP2. Then, by default, it will probably be set to Rec.709 or some other color profile, but you need to select REDWideGamutRGB. If you want Rec.709 straight away you can just choose it here, but I prefer to set it to log so I have more flexibility in manipulating the footage. So I'll go ahead and choose Log3G10, and you'll see the footage becomes very flat. But no worries: you can go to the RED website and download the IPP2 output presets. Once you download the presets, you have two folders – one is Rec.709 and the other is Rec.2020. I use Rec.709, because most devices still use Rec.709. In there you can see lots of LUTs with different settings: no contrast, low contrast, medium contrast, high contrast; and on the other side you also see hard, medium, soft and very soft options – those are the highlight roll-offs. For me personally, my favourite is high contrast with a very soft roll-off. So when I go here and apply this LUT – Rec.709, high contrast, very soft – straight away you have a very beautiful image with enough contrast and enough color saturation, and from there of course you can keep grading. This little trick will help you make your RED image look a lot better, and if you have multiple cameras, it's a lot easier to color match them when they all use the IPP2 color science. I hope this information was useful for you. Hit like if you liked this video, and click the subscribe button if you want to watch more videos like this. Thank you for your time. Bye.
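The IPP2 output presets mentioned above are distributed as .cube 3D LUT files. As a rough illustration of what applying such a LUT to log footage involves, here is a minimal Python sketch of my own (not part of the video, and not RED's or Blackmagic's tooling) that parses a simple .cube file and applies it with nearest-neighbour lookup using NumPy and Pillow. Real grading tools use higher-quality trilinear or tetrahedral interpolation, and the file names below are placeholders.

```python
import numpy as np
from PIL import Image

def load_cube(path):
    """Parse a basic .cube 3D LUT; returns (size, table) with table shaped (N, N, N, 3)."""
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or line.startswith("TITLE"):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[-1])
            elif line[0].isdigit() or line[0] in "+-.":
                rows.append([float(v) for v in line.split()[:3]])
    table = np.asarray(rows, dtype=np.float32)
    # .cube files list entries with red varying fastest, then green, then blue.
    return size, table.reshape(size, size, size, 3)

def apply_lut_nearest(rgb, size, table):
    """Apply the LUT with nearest-neighbour lookup; rgb is float in [0, 1], shape (..., 3)."""
    idx = np.clip(np.rint(rgb * (size - 1)).astype(int), 0, size - 1)
    # Table is indexed [blue, green, red] because red varies fastest in the file.
    return table[idx[..., 2], idx[..., 1], idx[..., 0]]

if __name__ == "__main__":
    # Placeholder file names - substitute your own log-encoded frame and IPP2 output LUT.
    size, table = load_cube("IPP2_Rec709_HighContrast_VerySoft.cube")
    frame = np.asarray(Image.open("log3g10_frame.png"), dtype=np.float32) / 255.0
    graded = np.clip(apply_lut_nearest(frame[..., :3], size, table), 0.0, 1.0)
    Image.fromarray((graded * 255).astype(np.uint8)).save("graded_rec709.png")
```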

Samsung MU8000 TV Picture Settings – RTINGS.com



Hi, I'm Daniel from Rtings.com. In this video, we will go over how to set up and get the best picture for the Samsung MU8000, which is also equivalent to the MU7000 in Europe. We will describe any adjustments you should make for different content, such as
movies, sports, gaming and HDR. The first thing to note is that all of the
inputs to the TV are located on an external one connect mini box. Unlike the Samsung QLED
TVs, this doesn’t require a secondary power connection, but it also isn’t in wall rated
which may cause cabling issues for some people. If you have a receiver or soundbar which supports ARC, and you want to route the TV's sound through external speakers, then you should connect it to HDMI 4. Other than this, the inputs are identical, so connect your devices to any of them. Also
note that there is no support for older composite or component inputs on Samsung TVs with an
external One Connect or One Connect Mini box. When you connect an input, the TV will try to identify what it is and change to the appropriate
input icon and label. This usually works well, but if you’re using a PC and want to ensure
support for Chroma 4:4:4 then you can go to the ‘Home’ menu and press up on the HDMI
port to set the corresponding PC icon. This is the only icon which affects the picture
quality; the rest are all cosmetic. With your inputs set up, the next thing you want to do is adjust the bandwidth of the
HDMI port to use full HDMI 2.0 capabilities. This can be done either by going through ‘Settings’
->‘General’ ->‘External Device Manager’ ->‘HDMI UHD Color’ or by holding the
voice button on the remote and saying ‘HDMI UHD Color’. This voice option also works
well for all the settings and menus shown in the video. Adjusting this setting is only
required for high bandwidth devices such as HDR consoles or for PC use but only very rarely
causes incompatibility issues. In the same ‘External Device Manager’
menu is an option for ‘Game Mode’. You should enable this if you want the lowest
input lag for gaming, and it will disable some picture processing. You can still follow
the rest of this setting guide, but some options will be disabled.
If the HDMI Black Level setting is available then it should almost always be left at ‘Auto’.
This setting corresponds to the video range of the input device. A mismatch here will
result in crushed dark scenes or a raised black level and loss of contrast.
Now, we will go up a menu and into ‘Eco Solution’. Disable everything here to avoid
the brightness adjusting automatically, as it can be distracting.
Under ‘Picture’ adjust the ‘Picture Mode’. ‘Movie’ is the most accurate
picture mode and allows the most setting customization, so is the one we will use here.
The bulk of the picture settings lie in the ‘Expert Settings’ menu. To better understand
how they work, we will be showing measurements of our MU8000 which correspond to each of
the settings we go over. The ‘White Level’ measurement is the brightness of the screen
on a checkerboard pattern. Adjusting the ‘Backlight’ option will affect the overall screen brightness
without reducing the picture quality, so adjust this to suit your room and if you have a bright
room then set it to maximum. Also, for HDR content you should set the ‘Backlight’
to maximum to produce the most vivid highlights. The ‘Brightness’ slider works differently on 2017 Samsung TVs compared to previous years
and other manufacturers' TVs. We can see the effect it has by measuring the 'Gamma'
curve which shows the relationship between dark and bright areas. A high gamma value
results in deeper dark scenes and a lower value results in a brighter overall image.
The left hand side of the plot affects darker scenes, while the right hand side affects
bright scenes. For example, a high gamma value toward the left-hand side of the plot results
in deeper dark scenes but may result in loss of details in a bright room. Movies are mastered
to target a flat value of 2.2 across the range so this is what we aim for.
When the ‘Brightness’ setting is adjusted it affects the gamma in dark areas, rather
than raising the black level. You can increase the ‘Brightness’ to bring out dark scene
details or decrease it for a deeper image. We leave this at the default value of 0, as it is closest to the reference target.
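To make the gamma discussion concrete, here is a small Python sketch (my own illustration, not RTINGS' measurement code) showing how a pure power-law display gamma maps input levels to light output, and why a higher gamma deepens shadows while a lower gamma lifts them relative to the 2.2 target that movies are mastered for.

```python
# Compare display gamma curves against the 2.2 reference (illustrative only).
signal = [i / 10 for i in range(11)]          # normalized input levels 0.0 .. 1.0

for gamma in (1.8, 2.2, 2.6):
    # Relative light output of a pure power-law display: L = V ** gamma.
    outputs = [round(v ** gamma, 3) for v in signal]
    print(f"gamma {gamma}: {outputs}")

# At mid-grey (V = 0.5): 0.5**1.8 ~ 0.287, 0.5**2.2 ~ 0.218, 0.5**2.6 ~ 0.165,
# so a higher gamma pushes shadows and mid-tones darker; a lower gamma lifts them.
```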
The contrast option affects the brightness range of the display. This should be set as high as possible without losing details in highlights. The default value of 95 provides a good brightness range without loss of details. A sharpness setting of 0 results in no added sharpness. If you are watching lower quality
content and don’t mind sharpening artifacts then you can increase it slightly, but too
high values will result in excessive ringing around edges. To see the effect of the color setting we
will show measurements on a CIE diagram. The squares on the diagram show the target color
– which is what a calibrated display should achieve. The circles show our measurements
from this MU8000. Increasing the ‘Color’ results in a more saturated image, but results
in less accuracy and may cause saturated details to be clipped. Decreasing it too far results
in loss of vibrancy. The default value of 50 is best for an accurate image.
The ‘Tint’ setting adjusts the balance between Green and Red, which has the effect
of rotating colors on the CIE xy diagram as shown.
The default value with equal amounts of green and red is the most accurate. ‘Digital Clean View’ is a noise reduction
feature which clears up low quality content. Enable this for DVDs or cable. ‘Apply Picture Settings’ allows you to
change whether the picture adjustments are adjusted on an input-by-input basis or are
the same across all inputs of the TV. If you prefer a brighter image when gaming for example,
you can use different settings for a Blu-ray player and console. For most people it is
best to use the same settings for all inputs. The ‘Auto Motion Plus Settings’ menu is
for motion interpolation and image flicker options. To learn more about how these affect
the motion performance, see the videos linked in the description. These settings aren’t
available in game or PC mode, to avoid adding input lag. If you enjoy the soap opera effect
when watching movies or cable TV then select the ‘Custom’ option and increase ‘Judder
Reduction’ to 2 or 3. If you enjoy a strong soap opera effect and don’t mind too many
artifacts, then you can also increase ‘Blur Reduction’ to a similar value. For our calibration
we will leave both of these sliders on 0. ‘LED Clear Motion’ flickers the backlight to clear up motion. If you’re watching sports
or other fast motion then you can activate this, however the resulting flicker is distracting
to some people and it does decrease the overall screen brightness. ‘Local Dimming’ allows some areas of the
screen to dim and produce darker scenes. Unfortunately, it doesn’t work well on the MU8000 and produces
blooming, so we recommend setting it to ‘Low’. It is not possible to disable on this TV. ‘Contrast Enhancer’ affects the relationship
between dark and bright areas of a scene. You should disable it if you want the most
accurate image. The ‘HDR+’ mode doesn’t enable HDR, but rather adjusts the settings to make SDR
content look HDR-like. It generally produces an overly saturated image as shown in the
xy plot. If you do prefer a more vivid image then you can activate it, but we don’t recommend
it if you’re trying to match the director’s intent.
‘Film Mode’ is only available with certain input signals, such as 1080i sources. If this
option is available and you’re watching a movie, such as from cable TV, then activate
this. To see the effect of the ‘Color Tone’
option we use the same plot. Setting the color tone to a cooler value results in the whole
image shifting towards blue. Warmer values look yellow or reddish. We calibrate to the
standard 6500K color temperature that movies are mastered at which corresponds to a value
of ‘Warm2’, but you can adjust this to your preference.
In the ‘White Balance’ menu are more advanced adjustments to the white point at different
brightness. These require measurement equipment to set accurately. You can find our values
in the review for reference, but we don’t recommend copying them as the best values
vary on a unit-by-unit basis. The ‘Gamma’ option will change automatically to the correct curve depending on the content
metadata. For Hybrid Log Gamma content this will default to HLG, for HDR10 or Dolby Vision
content it adjusts to ST.2084, and for SDR content the correct value is BT.1886. The
effect of the gamma slider can be measured with the same plot as before. Increasing the
value results in a lower gamma curve, which increases the overall brightness of the image
and brings out details in dark scenes. A lower value increases the curve and produces deeper
dark scenes, but may crush details in a bright room. You can increase the slider in a bright
room, but we use a value of 0 as it is closest to our 2.2 target. The ‘RGB Only’ setting filters the primary
colors of the image for calibration by eye. The ‘Color Space Settings’ affects the
target color space. The ‘Custom’ value allows for calibration of the color space,
but this requires measurement equipment and the best values change from unit to unit.
The ‘Native’ setting produces a more vivid image in SDR, but results in loss of accuracy.
For accurate colors leave this to ‘Auto’ for both SDR and HDR content.
So that’s it. You can find the screenshots of all the settings we recommend on our website
via the link below. And if you like this video, subscribe to our channel, or become a contributor,
and see you next time.

How to add a picture as PowerPoint Slide Background



Hi, I'm Ramgopal from Presentation-Process.com. In this video, you will learn how to add a picture as a PowerPoint slide background. This technique comes in quite handy for creating very interesting slide backgrounds for your presentations, and let me show you how to do that in a step-by-step way.

Here I have a new presentation. The first step in the process of inserting a picture as the slide background is to right-click on the slide and go to the 'Format Background' option. Now you will see this pane on the right-hand side, and the option we want to choose is the one called 'Picture or texture fill'. As soon as you click on that, you will have a default texture inserted by PowerPoint. Don't bother too much about this one, because we're going to insert a picture from our file. So I'm going to click on the option called 'Insert picture from file', and I'm going to choose one of the images that will serve as the slide background. This is a picture that I have already saved on my computer, and I'm going to say 'Insert'. As soon as I do that, you can see that the picture is now added as the slide background.

Let's say you want to have the same slide background for every slide that you insert in the presentation. All you need to do is choose the option called 'Apply to all'. As soon as you do that, you'll see that any new slide you insert – like by going here – will have the same slide background as well.

Another thing I want you to note is that once you insert a picture as the slide background, you also have the option to change the picture properties. All you need to do is go to the option called 'Picture', and you can change the picture colour or do some picture corrections. For example, if you want to recolour this picture in some other tint, all you need to do is go to the 'Recolor' option, and you can apply a green accent or an orange accent, and so on. So all those options are available to you once you insert a picture as the slide background.

Hope you got some useful information from this video. As a thank you for watching this far, I'm happy to present to you a wonderful mini training called 'Five things you can do under five minutes to make your slides look more professional'. It is a useful mini training for every business presenter: whether you are a business owner, a business executive, a trainer or a consultant, you will find this mini training very, very useful. You can sign up for the mini training by clicking on the link here, or by clicking on the link in the description area right below this video. Thanks a lot for watching the video, and I will see you in the mini training.
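If you ever need to do the same thing programmatically rather than through the PowerPoint UI, here is a minimal Python sketch using the third-party python-pptx library. As far as I know python-pptx does not expose the slide background fill directly, so this sketch only approximates the effect by stretching the picture across the whole slide before adding anything else; the file names are placeholders.

```python
from pptx import Presentation
from pptx.util import Inches

# Placeholder file names - substitute your own image and output path.
IMAGE_PATH = "background.jpg"
OUTPUT_PATH = "picture_background.pptx"

prs = Presentation()
blank_layout = prs.slide_layouts[6]          # layout 6 is the blank layout in the default template
slide = prs.slides.add_slide(blank_layout)

# Stretch the picture across the whole slide so it behaves like a background.
# (Shapes added later are drawn on top, so add the picture first.)
slide.shapes.add_picture(IMAGE_PATH, 0, 0,
                         width=prs.slide_width, height=prs.slide_height)

# Anything added afterwards sits on top of the "background" picture.
textbox = slide.shapes.add_textbox(Inches(1), Inches(1), Inches(8), Inches(1.5))
textbox.text_frame.text = "Title over the picture background"

prs.save(OUTPUT_PATH)
```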

PowerPoint 2010: Removing the Background in Pictures



PowerPoint has a lot of great features for
editing pictures, but probably the most powerful one is
background removal. You can use this feature to cut out the background or any other part of the picture
that you don’t want. To get started, click on the image and go
to the Format tab. And on the left side of the Ribbon, select
the Remove Background command. PowerPoint will try to guess which areas of
the picture are part of the background, and they will
be marked with this kind of magenta color. It doesn’t look exactly right yet, but we’re
going to fix it. The first thing you should do is drag the
selection handles so that the foreground is inside the box. And PowerPoint will kind of readjust the image
after you do that. But there are still a few areas that it’s
missed, so we’re just going to help it decide which
areas to remove, and which areas to keep. I want to remove this area, so I’ll click
Mark Areas to Remove, and then just click and drag to
create a mark on the image. You can create marks wherever they’re needed, and if you make a mistake,
just click Delete Mark and then click on the mark and it will disappear. And I’m just going to try again here. Sometimes it takes a couple of tries, especially
when you’re working on the details. And I also want to remove this area below
the bowl. But now that’s caused another problem. PowerPoint thinks this part of the bowl should
be removed, but I want to keep it in the picture. So I’ll click Mark Areas to Keep and then
draw a mark there to fix it. And I just need to add a few more marks to
the image, and then I’ll be done. Okay, it looks like all of the parts that
I want to remove are now magenta, so the last thing we
need to do is click Keep Changes. Now, the background has been cut out, and any objects that are behind the
picture will show through. So if you wanted to make it more interesting, you could put a shape or a picture behind
it to give it a different look. And at any time, if you see
something that you’ve missed, you can just click the Remove Background
command again and add or delete markings as needed.
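PowerPoint's Remove Background tool does this interactively. If you wanted a rough programmatic equivalent outside PowerPoint, here is a small Python sketch of my own (unrelated to PowerPoint's actual algorithm) that makes pixels close to a sampled background colour transparent using Pillow and NumPy. It is a crude colour-distance mask, nowhere near as capable as the marking workflow described above; the file names are placeholders.

```python
import numpy as np
from PIL import Image

def knock_out_background(in_path, out_path, tolerance=40):
    """Make pixels similar to the top-left corner colour transparent (crude illustration)."""
    img = Image.open(in_path).convert("RGBA")
    pixels = np.array(img).astype(int)
    background = pixels[0, 0, :3]                       # sample the corner as the "background" colour
    distance = np.linalg.norm(pixels[..., :3] - background, axis=-1)
    pixels[distance < tolerance, 3] = 0                 # zero the alpha channel where colours match
    Image.fromarray(pixels.astype(np.uint8)).save(out_path)

# Placeholder file names - substitute your own image (output must be PNG to keep transparency).
knock_out_background("bowl_photo.png", "bowl_cutout.png")
```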

Weird Pictures of Toilets



This picture makes me really depressed. I don’t know why, it’s such a depressing atmosphere Why is it a depressing? It kind of is, yea. What makes it depressing? Its so sad..
They’ve got just a lobster trap on the wall. Oh, is that what that is? I dont know ¯_(ツ)_/¯ Thought it was like a towel rack, but..
I guess it has multiple uses [laughing] Obviously we’ve all seen this one, right?
Oh is- is this the banger, uh- WOA-
thats new o.o/ Thats a new one [laughing] Why’s the floor look so wet has he been trying for a while? Yeah few attempts. Heres the eggs hatched~
WoAAh My god, is that wait a second. What the fuck? They’re frogs!!1! Its the tribunal ~wheezes~ When you wake up and you have a morning boner How does she, like, use the restroom in that scenario? I THOUGHT THAT WAS A LITTLE BOY. look at the concern on her face1!11! OHHH IS THIS FOR BATTLES?!?! Have like a little shit, shitting 1v1
mExIcAn sTaNd-oFf Wait-
The elevators ( ͡° ͜ʖ ͡°)
WAIT A MINUTE- WHAT THE FUCK someone took a picture of it.. instead of helping him
( ͡° ʖ̯ ͡°) ALRIGHT, I HAVE A FEW QUESTIONS! hOW DID HE MANAGE TO GET *FLIPPED* UPSIDE DOWN?
how did they take the picture? I l1ek t0 1m4g1n3 1t’2 n0t up21d3wn BUT his gust of wind is blowing the toilet paper upward! đŸ˜€ AND THE TOILET TOO~┬┴┤( ͡⚆ل͜├┬┴┬ I CANT COMPREHEND THIS– I’M BEWILDERED
( ͡☉⁄ ⁄ ͜⁄ ͜ʖ̫⁄ ⁄ ͡☉) what if- wHAT IF HE’S JUST- WHAT IF HIS MIRROR’S JUST UPSIDE DOWN?(∩ ͡° ͜ʖ ͡°)⊃━☆゚ OOhhHh~ (my very best-) OH WAIT- NO IF YOU TURN A MIRROR UPSIDE- NOOO ~le wheeze~ you genuinely thought about it for a second ~they laugh~ I thought for a second he was actually wearing skis but uh- the, walls were painted hE’s nOt wEaRiNg SKIS? o0o I THOUGHT HE WAS WEARING SKIS THE ENTIRE TIME. WAIT- OH MY GOSH I THOUGHT IT WAS A FUCKING- I THOUGHT IT WAS A LINT ROLLER (woa-) ~le laugh~ wa- tHOR AND HIS MIGHT LINT ROLLER ~hehe~ OK- WHATT THE FUCKK CAN SOMEONE BUY A SEGWAY AND RECREATE THIS IMAGE BUT VIDEO IT AND FALL. I JUST- I WANT THAT TO EXIST CARSON YOU HAVE THE MOST DEDICATED FANBASE OF ANYONE I’VE EVER SEEN YOU CAN GET THIS TO EXIST you know, i think it would be an extremely irresponsible for me to say to my fanbse “HEY GUYS GO MAKE A VIDEO OF YOU WITH A HOVERBOARD AND A STALL AND FALLING” yea that would be extremely irresponsible cooper i wouldn’t do that (DO IT PLEASE) i DONT SEE WHATS WRONG WITH THIS pHoTO
ItS JUST A BREAKFAST ROUTINE Somebody goes to use the urinal, you like spit out your sandwich and you say "HEY! IM EATING!" you can't do that here ):

The Hubble Deep Field: The Most Important Image Ever Taken


The universe is a big place. We know that instinctively, and the idea resonates deeply. We hear about how big the universe is whenever we visit a planetarium, take an astronomy class, or look up at the sky on a clear night. But knowing how big the universe is and being able to picture it are two different things. When astronomers talk about the size of the universe, they like to use enormous numbers to explain everything. For the astronomers and cosmologists who deal with these things every day, that makes sense, but for the rest of us the numbers are so large that they simply leave us in awe when we hear them. The problem is that they are too big to carry any real meaning, and the study of astronomy is full of numbers like these. So we live our daily lives: we drive our cars to work, cook dinner, and post videos on YouTube, and it's easy to forget about the rest of the universe, easy to get absorbed in our own affairs. After all, we have more important things to do than think about how big the universe is. But let's not be too hard on ourselves. Of course we care about the universe; the numbers in astronomy are just too big for us to grasp, and our brains are not equipped to visualize numbers as large as seventy-eight billion. That is the size of our universe: 78 billion light years. But saying that number doesn't make it any easier, does it?

Fortunately, a picture was taken that can show us how big that number is in a single glance. In 1995 the Hubble Space Telescope stared for 10 days at an unremarkable patch of sky, and the results were nothing short of humbling on a cosmic scale. Thousands of galaxies filled the image: about 3,000 galaxies were detected in a patch of sky that appears completely empty. Let's take a minute to absorb that. We live on a planet, one of eight in our solar system (we'll miss you, Pluto). Those planets orbit an unremarkable star. Our star sits in a galaxy, one star out of some 500,000 million in the Milky Way. I know, big numbers again, but bear with me. This is the Pinwheel Galaxy in the constellation Ursa Major. It is similar to our own galaxy, and this is the largest and most detailed picture of a spiral galaxy ever taken by Hubble. Every one of these points of light is a star; some are bigger and some are smaller than our Sun, but they are all stars, and many of them have planets orbiting them. Looking at this picture, the idea that Earth is the only planet in the universe with life seems absurd; what seems far more plausible is that there are many planets like ours. Our galaxy is one of many in our Local Group, and there are many more galaxies beyond it. When we look at the night sky we can only see about 3,000 stars on a clear, dark night, which makes it easy to think the universe isn't so big after all. But now we know better.

The Hubble Deep Field is one of the few examples that can help us grasp how big the universe really is. But the story doesn't end there. Later, in September 2003, Hubble did it again. This time it stared at another unremarkable patch of sky, again for a little over 11 days, using improved detectors and different filters. And this time, this is what they saw. This is called the Hubble Ultra Deep Field, and it represents the farthest we have ever seen into the universe. The image contains more than ten thousand galaxies. Every point and smudge is an entire galaxy, each with millions of stars, each star with the possibility of planets in orbit around it, and each planet with the possibility of a civilization. This is what we see when we stare at an empty patch of sky where there appears to be nothing at all. This is how many galaxies there are in the void. This is a picture spanning 78 billion light years. It is a picture of how small we are. It is the most important image in the history of humanity.
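The transcript's point – more than ten thousand galaxies in one apparently empty patch – implies a staggering total for the whole sky. Here is a small back-of-the-envelope Python sketch of my own (not from the video) that extrapolates from the Ultra Deep Field to the full sky; the patch size of roughly 11 square arcminutes is an assumption I am supplying, not a figure from the transcript.

```python
import math

# Back-of-the-envelope extrapolation from the Hubble Ultra Deep Field (illustrative).
galaxies_in_patch = 10_000     # galaxies counted in the Ultra Deep Field (from the transcript)
patch_sq_arcmin = 11.0         # assumed HUDF field of view, ~11 square arcminutes

full_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2   # ~41,253 square degrees on the celestial sphere
full_sky_sq_arcmin = full_sky_sq_deg * 60 * 60          # 3,600 square arcminutes per square degree

patches_on_sky = full_sky_sq_arcmin / patch_sq_arcmin
estimated_galaxies = galaxies_in_patch * patches_on_sky
print(f"~{patches_on_sky:,.0f} such patches tile the sky")
print(f"~{estimated_galaxies:.1e} galaxies visible to that depth")   # on the order of 10**11
```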

SML Movie: Picture Day!



Jackie Chu: Alright, Class. Let’s go over the four states of matter. WE HAVE SOLID… LIQUID… GAS… AND DOES ANYONE KNOW THE FOURTH STATE OF MATTER? “Uh… ‘Black Lives’?” CORRECT! “BLACK LIVES… MATTER”. ALRIGHT CLASS, REMEMBER… PICTURE DAY TOMORROW. SO MAKE SURE YOU LOOK YOUR BEST ‘CAUSE THIS PICTURE GOING IN THE YEARBOOK AND YOU DON’T WANNA LOOK LIKE UGLY FREAK Bowser Jr.: “PICTURE DAY TOMORROW???” “Hope their camera has a wide angle lens… they’re gonna need one to get my giant biceps in frame!” [Grunts, flexes] “Oh dude I can’t wait to wear my new outfit, I’m gonna look so good…” Toad: Oh! Picture day! Ha ha ha! no [Toad] I’m gonna look a pimp tomorrow. I’ma be super fly! Junior: I wonder what I’m going to wear for picture day tomorrow… Cody: Ooh! I wonder if they will let me have Ken in my picture! Joseph: Wait, Ken?. Junior: Ken?! Cody: Yeah, we can be cute. Joseph: That’s stupid Cody! Junior: Yeah, when you’re 30 one day, Cody, you’re gonna look back at you’re yearbook And say I wonder why I had that doll with me Joseph: Yeah Cody: No, we’ll be “Class Couple.”. Junior: W-wait, speaking of Ken, where do you keep Ken when you’re at school? Cody: Oh, he’s at flute practice… …under my desk!… Junior: Ken can play the flute? Cody: Oh hell yeah, he can! Joseph: He’s a doll, Cody! Junior: He’s a stupid doll. Huh, I just wonder what I’m gonna wear tomorrow… Cody: Ohhhhhhhh, Jesus Christ! Jeffy: Hey Junior, I got a question. Junior: Uh yeah Jeffy? Jeffy: What’s picture day? Junior: Oh, it’s where they take a picture of you for the yearbook. Jeffy: Ohhhh I like pictures! Bully Bill: Yeah, but the camera might break because of how ugly you are. (Laughing Sarcastically) haaaha-haha Jeffy: Um, Junior, have you ever seen someone with a fucked up face in the yearbook? Junior: Um, no. Jeffy: You’re about to. (Jeffy Screaming) (Jeffy beating up Bully Bill) TAKE THESE HANDS, BITCH! TAKE THEM! THIS IS THE LAST GOD DAMN TIME YOU’LL SAY SHIT TO ME!!! Jeffy: Hey, Daddy? Mario: *Sigh* What is it Jeffy? Jeffy: Tomorrow’s picture day at school. Mario: Wuh, picture day? We have to get you a nice out- Jeffy?! Jeffy: What? Mario: What happened to your eye?! Jeffy: I got in a fight at school Daddy! Mario: Wha, a fight?! What happened, Jeffy?! Jeffy: Well, this kid was talkin’ all kinds of shit, so I had to spank that ass Daddy! Yeah, He got one good hit on me, right in the eye, but you should see him. He’s all kinds of fucked up! Mario: Wuh- Jeffy! Now you’re gonna have a black eye for your yearbook photo! Jeffy: I don’t give a shit! Mario: *Sigh* Jeffy, I think I got an outfit for you You’re- you’re gonna wear this! Jeffy: Daddy, I’m not wearing that. Mario: Yeah you are, Jeffy you’re gonna look nice with this. Jeffy: No Daddy, I wanna pick out what I wanna wear, and if you don’t let me, we’re gonna be twins, Daddy. Mario: *Sigh* (Junior snoring) Junior: *yawn* Oh man, I’m ready for picture day! I’ma get Chef Pee Pee make me some breakfast! Picture day! Chef Pee Pee: I crackin’ the egg! Ooh! I’m makin’ the omelet! Crackin’ the egg! I’m makin’ the omelet! (Lots of omletts) I’m makin’ the omelet, makin’ the omelet, makin’ the omelet… i’m makin’ ome- Junior: Uh, Chef Pee Pee? Chef Pee Pee: *sigh* What, Junior? I’m making an omelet. Junior: Uh, can you iron my bib? I gotta look good for picture day. Chef Pee Pee: Uh sure-(Chef Pee Pee laughing childish) Junior: Wuh-wuh-what? Chef Pee Pee: Oh God, you got a PIMPLE on your face! PIMPLE FACE!! Junior: Wuh-wuh-wuh-wuh-pimple on my face? 
Chef Pee Pee: Oh my God, you haven’t noticed that big ass pimple on your face? Oh my God, you could see it from space! See it from space, see it from space! Junior: Wuh-wuh-what are you talking about?! Pimple on my face?! Chef Pee Pee: See it from space! WOOOOOO! Makin’ the omelet… (Junior screaming) I’m a big, pimple-faced loser! Look how big that pimple is! What am I gonna do?!!!!! It’s picture day, and I have a pimple on my face! DAAAAAAAAD! (Bowser sleeping) Dad, dad, dad, wake up! Bowser: Ugh, what Junior?! I told you not to wake me up unless the house is on fire or Chef Pee Pee’s naked! Junior: Dad, today’s picture day at school, and look! Bowser: Puuuuuh-puuuuuh-puh-puh-puh-pimple face! Oh my God, that is a pimple face! Junior: Dad, what do I do about the pimple? Today’s picture day! Bowser: Huh, well there is nothing you can do Junior. You were chosen! Lemme tell you a story. Every year, somebody is chosen by the pimple to be the pimple-faced freak. And you were chosen! Yeah, it was this kid in high school named Pete! And man, he was chosen too! He had a huge pimple! Ah Man, we called him “pimple Pete,” “pimple freak,” even “pizza face freak.”, Ah Man, we even called him ugly, ’cause damn! That pimple made him SO ugly! Heh heh. Huh, I wonder where Pimple Pete is now. Eh, he probably killed himself. I know I would’ve! Heh heh heh. Well, the moral of the story is… well…it was nice knowing you Junior! Heh heh heh heh. Junior: Uh, uh, what do I do?!., Bowser: (quietly)Heh, that was funny Cody: Hey Junior. Joseph: Hey dude. Cody: It’s really early in the morning, Junior. School is about to start. What’s up? Joseph: Yeah dude!’. Cody: Hold up, I know what you guys are thinking… DAMN! Cody’s one tall glass of water! And I know y’all thirsty, but it’s okay. Grab yourself a straw and take you a SUCK! Joseph: Hey guys, what do you think about my new outfit? Ha ha, it’s pretty nice, huh? Cody: Well it doesn’t even match, Joseph. Joseph: Well, it doesn’t have to match, I look super fly in it! Cody: Well, I look like a Boeing 747 ’cause I’m so fly! Joseph: Well, um, I look like a spaceship, that’s how fly I am! Cody: Well people are always trying to swat me with a fly swatter ’cause of how fly I am! Uh how fly are you, Junior? Wo-woah, Junior! Why do you have a bag on your head? Joseph: Yeah dude, why does it have a sad face on it? Junior: I’m ugly guys. Cody: What are you talking about, Junior? I’d still bend you over like a naughty child. Joseph: Yeah dude, what’s wrong? Junior: I woke up this morning and God hit me with the ugly stick! Cody: Wuh-Junior, you’re not ugly! It can’t be that bad! Joseph: Yeah dude, just take the bag off. Cody: Wuh-WOAH! Woah, oh my God Junior! Y-You are a pimple-faced freak! Junior: I know Joseph: Yeah, that pimple is huge, dude! Cody: Oh my God, Junior, what is your pillow, a slice of pizza?! Joseph: Oh man, it looks like a planet on your face, dude! Junior: Shut up guys I know I’m ugly I know I have a big pimple- what do I do?! It’s picture day! Cody: Well, you could try standing next to me, and everyone would be looking at me and not your stupid ugly pimple face. Junior: Uh, uh, guys… Cody: Oh that’s actually a kind of a good idea! What if we added something to you that’s so distracting that no one noticed your pimple? Junior: Like-like-like what? Cody: Uh, like, like a hat! I’ll get you one of my hats! Junior: Uh, okay. Alright guys, did the hat block out the view of the pimple? Cody: Uh, no, but it blocks the haters! 
Junior: Uh, Joseph, what does it look like? Joseph: It looks like it’s your bedtime, dude. Junior: Wha-Cody, this hat’s stupid, it’s not working. Cody: Uh, okay, well you can try popping the pimple! Junior: Try popping it? Cody: Yeah, just squeeze it. Junior: Uh okay. (Junior reaching) Cody: You can’t reach your face? Junior: I can’t reach my face guys, my arms are too short! Cody: Oh. Well, I’m not touching it. Joseph: Yeah, I’m not gonna touch it either dude. Junior: Wuh-how are we gonna pop it? Cody: Uh, I kinda, I kinda just wanna bite it! You know, just pop it in my mouth like a grape, like n’yah! Junior: You gonna do it? Cody: No, God no, ewgh. Junior: Uh, alright guys, what am I gonna do? It’s picture day today and I can’t get a picture with this big ugly pimple on my face! Cody: Uh, well we can still try the distracting thing. Junior: Okay, uh, what would be distracting? Cody: Uh, what if we shaved your eyebrows? Junior: Sh-sh-shave my eyebrows? Cody: Yeah, everybody would be looking at where your eyebrows should be and they wouldn’t look at your pimple! Junior: Hm, that is a pretty good idea, right? Cody: Yeah, let’s shave your eyebrows! Junior: Let’s try that, let’s try it! (Electric razor in background) Cody: Ahhhh… uhh-oh-oh God. Ah, oh my God Junior, I.. I am so sorry. Junior: Wuh-how does it look Joseph? Joseph: It looks real bad dude, real bad. Junior: Well, is it distracting from the pimple? Cody: Yeah. Yeah, it’s all.. it’s all very distracting. Joseph: Oh, very dude. Junior: Do you still see the pimple though? Joseph: Yeah. Cody: Yeah, a little bit. Junior: I don’t want to see the pimple at all, I want to be completely distracted from this stupid pimple. Cody: Uh, okay. Joseph: Well, maybe if you shave your head dude. Junior: Shave my head? Okay do it, do it. Cody: Okay, yeah, I guess we can try that. Uh, hang on, let me, let me… (…) Ahhhh… ahhhhh… Junior: Alright guys, how do I look? Joseph: Dude, you look racist. Junior: Wha-racist? Cody: Ahh, oh God Junior, you look horrible. Junior: Well, do you still see the pimple? Cody: Yeah. Yeah, that’s ALL I see Junior, you don’t have any other features, you just look like a normal turtle with a giant pimple on its face. Joseph: Yeah dude, it looks bigger. Cody: Yeah, I think it’s growing. Joseph: Growing? Junior: Guys! We’re supposed to make it where you don’t see the pimple and now all y’all see is the pimple?! Joseph: Yeah. Cody: Yeah… Junior: Guys! School starts in five minutes, what are we gonna do?! Cody: Well, I guess there’s one more thing we can try. Junior: What? Cody: I-I made this gun that’s supposed to shrink things. Junior: Shrink things? Cody: Yeah, but I’ve never tried it out before so something could go horribly wrong. Junior: Guys, I don’t care, school starts in five minutes, try it on me, try it, I don’t care what happens, try it. Cody: Are you sure Junior? Junior: Try it! I want this pimple GONE! Cody: Alright. (Laser sound effect) (Cody and Joseph in shock) Junior: Gu-guys, how do I look? Did it shrink it? Joseph: Uh, uh, I’ma see you at school dude! Cody: Yeah, good luck Junior! Junior: Wha… GUYS! Did it shrink it or not?! (Junior screaming) CODYYYYYYYYY!!!!!! Cody: …Yeah? Junior: WHAT DID YOU DO TO MY FACE?! Cody: Look Junior, I told you something horribly wrong could happen! I guess I duplicated the pimple instead of shrinking it! Junior: WHA-CODY, I’M A FREAK! Cody: Yeah, I know, Junior. Hell, even Ken could read your face at this point. Junior: Cody, what am I gonna do?!
Today’s picture day and now I have like a million pimples on my face! Cody: Uh, I think you’re all out of options, Junior. You should probably just put that paper bag back on your head and th-that’s your life now. (Junior sobbing) (School bell ringing) Jackie Chu: Alright class. Today is picture day. So I’m gonna call you up one by one to have your picture taken. I hope you all look really nice. ‘Cause this picture gonna be in the yearbook forever. Joseph: Oh man, I can’t wait to get my picture taken! Cody: Me too. Me and Ken look BANGIN’! Joseph: Hey, did Junior ever fix his face? Cody: No, don’t talk about it. Joseph: Oh ok. Jackie Chu: Alright Cody. You’re first. Cody: Woah. I thought they would save best for last. Oh well. Guess we’re just gonna have to make everyone look bad. Come on Ken. Alright. How do you want us, Chu? I was thinking we have Ken on his hands and knees and I’d be tongue punching his fart box. Jackie Chu: Cody. Why you bring your toy with you? Cody: Wuh-toy? No, we left those at home. Wa-wait. You mean I could’ve brought those?! Oh my God! Jackie Chu: Alright Cody, you done, get out of here. Cody: Uh, alright. Send that to me on the ol’ snap. Well, that was one sexy photo, Ken. Anyway, back to flute practice. (SEXUAL MOANING) Jackie Chu: Alright Jeffy. You’re next. Jeffy: I’m ready for my picture to be taken. *Moment of silence* Wha-a-t? (Jackie Chu takes the photo in silence) FUCK THE HATERS!! Jackie Chu: Alright Junior. You’re next. (Joseph and Cody cringe) Junior: I’m coming. Cody: Not Junior. Joseph: Oh dude, he’s gonna get made fun of! Junior: *sigh* Jackie Chu: Junior! Take the bag off your head. Junior: Wh-what bag? Jackie Chu: The bag with the frowny face! Take it off! Junior: It-it’s against my religion to take this bag off. Jackie Chu: Junior. We don’t know who you are. Take off the bag, now! Junior: You know who I am, you’re saying- Jackie Chu: Junior, take off the bag! Jackie Chu: HOLY SHIT, YOU KUNG PAO CHICKEN POX FREAK HAHA!!! YO-YOU HAVE PIMPLE ALL OVER YOUR FACE! This going to be a hilarious picture! Junior: NO NO, DON’T TAKE MY PICTURE! *poof* Jackie Chu: Wha-what happened to your head? Junior: What do you mean? Jackie Chu: The pimples are all gone! Junior: They’re all gone? Really? Jackie Chu: Yeah. Junior: Wh-wha-okay! Joseph and Cody: Uh… Junior: Hey, guys, are my pimples really gone? Cody: Yea-yeah. Joseph: Ye-yep dude. Junior: So your gun must’ve worked, Cody! Cody: Y-Yeah… Yeah, especially the shrinking part. Junior: Yeah, it must’ve had a delayed reaction or something. Cody: Yeah… Junior: *sigh* Man, at least my pimples went away before my picture was taken. Cody: Ya. SML Question: What is the most EMBARRASSING thing that has ever happened to you?

Google I/O 2013 – WebP: Deploying Faster, Smaller, and More Beautiful Images

ILYA GRIGORIK: [INAUDIBLE]. STEPHEN KONIG: Are we ready? OK, good morning, everybody. AUDIENCE: Morning. STEPHEN KONIG: Thank
you for joining us. We’re here to talk to you guys
today about WebP, which is an image format we developed which
will allow us to deploy beautiful, faster, and smaller
images on the web. My name is Stephen Konig. I’m a product manager
here at Google working on the WebP team. ILYA GRIGORIK: And my name
is Ilya Grigorik. I’m a developer advocate on the
Make the Web Fast team at Google, where we work on WebP. So first of all, I guess to kick
it off– why are we here? Why do we care about images,
image formats? And the answer is simply that
most of the bytes that we transfer in our applications,
in our web pages, are in fact images. If you look at the stats for a
desktop site, over 60% of the bytes that we transfer
are in images. So this is in addition
to HTML, CSS, and everything else. And on mobile, it’s
even larger. So it’s 70%. So optimizing images is
oftentimes the number one performance optimization
that you can do. And in fact, if you look at
stats, there’s a great project called HTTP Archive. What HTTP Archive does is it
crawls over 300,000 sites on the web, top sites on the web,
and basically tracks how the pages are constructed. So it doesn’t care about the
content of the pages, simply how they’re constructed. So how much CSS are
we downloading? How many images are
we downloading? And the rest. And what you can see here is
that first of all, the trend is pretty clear. We are building more ambitious
and more complex applications. The size is growing. So in fact, on desktop, an
average site today is over 1 megabyte in size– almost 1,200 kilobytes. And on mobile, it’s a
little bit smaller. So that’s good. It means that we’re optimizing
for mobile experience and for mobile networks. But nonetheless,
700 kilobytes. But then you look at images. And you’ll notice that most of
the requests are for images, both on desktop and on mobile. And the majority of the
bytes are in images. So this is why WebP matters. And of course, you heard Linus
at the keynote talking about WebP and how it can save
bytes for users. And saving bytes is also
important because bytes are literally expensive for
a lot of people. You can be charged up
to $1 per megabyte. If you’ve ever roamed with your
mobile data plan, you know how expensive
that can get. And it also affects
performance. So that’s trend number one. We’re building more ambitious
applications, transferring more bytes. Then, of course, you guys have
all been enjoying the beautiful Pixel screens,
right, HiDPI images. This is another trend that is
making this problem in some sense a lot worse for us, or
much harder, I should say. So the Chromebook Pixel
has 240 DPI. And what that means is we’re
basically packing more physical pixels into
the screen. So to get these high-resolution
images, we also need to deliver more
pixels to get a high-resolution image. There’s no magic here. You can’t ship a low-resolution
image to a Chromebook Pixel and expect it
to display nice sharp edges across everything. We literally need to
ship more pixels. So if we’re not careful, we’re
just going to inflate all of the images by 4x because
we’re shipping four times more pixels. Now hopefully, that’s not
what we’re going to do. And that’s why WebP is here. So first and foremost, WebP
is about improving data compression. We want to make sure that we
build a faster web and a more beautiful web. But also– and we’ll talk about this in
detail– we want to make it simpler to deploy the beautiful
web, which is to say, today, we have a variety
of formats which we need to balance between. It’s like, oh, you want
to do an animation. You need to use a GIF file. You want to have a
transparency– well, you’ve got to use a PNG. But then you can’t animate it. And then if you want to have
lossy versus lossless compression, you kind of have
this annoying tree of options. And with WebP, we can actually
do all that and give you better compression. So it’s kind of one format that
can achieve all of these great features. STEPHEN KONIG: So let’s talk a
little bit about the history of WebP and where
it came from. You may have heard of our WebM
video and audio formats that we launched a couple
years ago. The video portion of WebM uses
VP8 as the video codec. And as part of the development
of VP8, someone actually noticed that hey, if I take a
key frame from a video stream, it offers really great
compression. And this would actually
be a really good basis for an image format. And so essentially, that’s
how WebP got started. It was a sort of aha moment
from one of the engineers on our team. And so we adopted that
methodology as the basis for the WebP image format. So essentially, a WebP image
is nothing more than a key frame from a VP8 video stream. But it offers much better
compression than alternate technologies, as we’ll see. Both WebP and WebM
are open-source, royalty-free formats. So you’re free to deploy
them, use them. There’s no fees you have to pay,
no licenses you have to sign up for. It’s all ready to go
out of the box. So a little bit more in terms of
the timeline of how we got to where we are today. We did the initial release back
in 2010, which initially just supported lossy
compression, so basically an alternative for JPEG. In 2012, we followed that up
with support for both lossless and transparency, or alpha
channel support. So at that point, we became
an alternative potentially for PNG. And interestingly, one of the
things that we did here is we separated out transparency
from lossless. Because today, if you want to
have transparency, you have no choice really but use PNG. But that forces you to
also use lossless. And in some cases, you might
want transparency with lossy, which is something you
can’t actually do. But you can with WebP, and
we’ll talk about that. And then earlier this year, back
in April, we launched our most recent release, which
added support for color profile, animation,
and metadata. So really at this point, we have
a full set of features that allow us to be an
alternative replacement for both GIFs, PNGs, as
well as JPEGs. But we’re not done yet. There’s more work we’re doing. And we’re continuing to work
on improving the format. Top of the list are performance
optimizations. We’ll talk about performance
in a little bit. And I think we offer a really
good performance for the features we provide today. But there’s always room for
improvement, and we’re going to continue to focus on that. And similarly, better support
for mobile on ARM devices– that’s really sort of
code optimizations. We support a wide variety of
mobile devices in terms of Android and iOS today. But there’s more work we can do
to make that even faster. And then looking ahead, some
of the other things we’re thinking about are how can we
support higher color depth images, so more than eight
bits of color. How could we do things like add
layer support to support 3D types of images? And even doing things like
progressive rendering. And I’m sure there’ll
be other things. As the format becomes more
widely adopted and used, we expect there’s going
to be feedback. And we’ll take that into
consideration as well. ILYA GRIGORIK: So you heard
it here first– HiDPI, 3D progressive
rendering with WebP. STEPHEN KONIG: Exactly. Probably by Friday,
I’m guessing. So let’s talk a little bit about
performance and how WebP actually does in the wild. So the first thing I wanted to
talk about is sort of let’s do a little bit of a comparison
of WebP versus JPEG. And as Ilya mentioned, one of
the challenges we have on the web today is just the
predominance of images as a driver for how many bytes we’re
sending down the wire and thus how fast our
pages render. So in the old world, in the
past, the only way you could really solve this problem was
to reduce your image quality if you’re dealing with
JPEG images. So you have two examples
of JPEGs up there. The one on the top
is quality 80. The one on the bottom
is quality 10. And you might hopefully be able
to see that with quality 10, you’re getting very noticeable compression artifacts. The image looks very
visibly degraded. But the advantage is you’re
going from a 1.3 megabyte image to a 200 K image. So if you were in a world where
you were like, I have to reduce the size of my images
because it’s taking up too many bytes on my page, this
was literally only the option you had. But you had to trade off
visual quality in order to get it. And we think that’s the
wrong tradeoff. WebP offers much better
compression than JPEG across all quality levels. So you’re able to get the same
sort of benefit without reducing your image quality. And an illustration of that is
this chart, which shows you– for a number of images which
we did a sample on– for visual quality 20 all the
way up to 90, the file size for JPEG versus WebP. WebP is the one in
green below. And what you see is that for
all image quality levels, quality settings, WebP
is significantly smaller than JPEG. The one thing I’ll note is
you’ll see on the JPEG slide, right around quality 75, you’ll
notice that the file size sort of grows exponentially
with JPEG. So it does pretty good, mostly
linear from below 75. But after that, image sizes get
pretty big pretty quickly. With WebP, you have a much
more linear progression. And so ironically, as you look
at higher quality images, which are important for the
reasons we just mentioned, WebP’s benefit actually grows. So while on average, WebP is
about 30% smaller than JPEG, when you look at higher
quality images, the effect is even more. And so that’s something
to keep in mind. So let’s talk a little bit
about WebP versus PNG. So there are cases, obviously,
where you can’t tolerate compression artifacts at all. So you’re in a sort
of lossless mode. Or as I talked about earlier,
maybe you need to have transparency. And again, WebP is another great
alternative here because on average, we get about a 30%
file size reduction over PNG. But again, as I mentioned, one
of the features WebP has is that it allows you to have
lossy images with transparency, which is something
you can’t really do today with PNG. So in cases where you have an
image that you’re able to tolerate a little bit of
compression and artifacts– maybe you want a, say, quality
90, quality 80 type of image. But you need transparency. You can encode that in WebP
versus PNG and save even more. So in this chart, what you’re
looking at is a series of about 1,000 PNGs that we compressed
both with WebP lossless, which is the purple. So that’s again no
loss whatsoever. And then we recompressed them
again with lossy, but preserving the transparency. And what you see is that for
the vast, vast majority of images, we’re significantly
smaller than the PNGs. And there’s a couple of extreme examples at both tails. But obviously, the average
and median is strongly in favor of WebP. ILYA GRIGORIK: So I think a
very common use case is to have a lot of small PNGs for
things like icons, which need an alpha channel. And they’re also very small. And that’s where you can get up
to 80% or higher savings in terms of bytes, which is huge. STEPHEN KONIG: Yup, exactly. So the nice thing about WebP is
it allows you to be in this world in which you can really
stop caring about what image format you need to use. Today, you’re in this mode of,
well, if I need animation, I have to use GIFs. And if I need lossless or
I need transparency, I have to use PNG. And if it’s not one of
those things, I can probably use JPEG. But as a web developer, it
forces you to think through which image format
you have to use. With WebP, you don’t have
to do that anymore. Literally, you can use every
image as WebP, because it supports all of those
features. And we think that’s a great
benefit to web developers. So let’s talk a little bit about
performance from the encoding and decoding side. So this is all great. WebP’s a lot smaller, offers
great features. Where’s the catch? Well, lunch isn’t free,
despite the fact that we keep trying. We’re not quite there yet. So you should know that WebP
is more expensive to encode and decode. But there are some reasons
I think why– sort of take these numbers
in context. On the encoding side, it takes
about 5 to 10 times longer to encode a WebP image than a
JPEG, which sounds pretty significant. But when you realize that
encoding is typically done only once– in the vast majority of cases,
you’re dealing with a static image corpus. And you’re encoding those
images one time. And you’re doing it not in
your serving pipeline. You’re doing that offline. And so if it takes an extra
couple CPU cycles to encode your WebP images, that really
shouldn’t be something that would be a barrier
to WebP adoption. On the decoding side,
we’re more computationally intensive. So yes, it takes more time
to decode a WebP image. And on average, we’re about
1.3 times slower than a comparable JPEG. What we’ve seen, though, is that
in the vast majority of cases, unless you have a very
specific use case, the amount of time you’re saving just by
virtue of the fewer bytes you’re sending more than
makes up for the increased decode time. So that’s something
to bear in mind. With that said, high encoding
costs can be a limitation in some use cases. And I’ll give you one example
from Google as we’ve been trying to deploy WebP. So if you think about Maps,
map tiles are actually dynamically generated PNGs
that are based on the underlying raster data. When we actually tried to deploy
WebP instead of PNGs, what we found is that the
increased encoding time, because those images are
generated dynamically, sort of negated a lot of the benefit
from the smaller file size. But that’s a pretty special
case because there are no static map tiles. They’re all generated at runtime
based on your request. So in that case, in that
particular instance, encoding is actually in the
serving pipeline. And so if you have a situation
like that, you should be mindful of these stats. But again, as I said, that’s
really the exception. The vast majority of
cases are static. And then we talked a little bit
about bandwidth savings. But in addition to just raw time
to send fewer bytes, you have to be aware that a lot of
users, especially on mobile, actually have to pay for
their bandwidth. Either they’re on metered plans
or capped plans, or they’re roaming. And in some cases, it’s
literally $1 per megabyte. So not only are you saving users
time and making your site faster for them and
giving them a better experience, you can literally
save them money. And so that’s an important
consideration to keep in mind, as well. ILYA GRIGORIK: So the other
thing I’ll add is we have the chicken and the egg problem
here with performance. Of course, there is also a
growing support for hardware– or sorry, hardware support is
improving for WebP and WebM. So we have WebM, which is
driving this trend, as well. So performance will get better
as more hardware deploys native support for it. So a good example of
this is actually– there’s this tradeoff between
CPU and bandwidth. And one of the concerns, a valid
concern that a lot of people have is, great. So we’re going to ship
these bytes faster. But if we take more time on
the client to decode the images, is that going to negate
all of the benefits? And the eBay team actually
just recently had this great study– I encourage you guys to
check out their blog. It’s an eBay tech blog– where they compare a number of
different image formats. And they set up a test case
where they have an image gallery with 30 different
images. And they compared JPEG
versus WebP. And what you’re seeing here
is a film strip which is basically 100-millisecond
increments, so 0.5 seconds, 0.6, 0.7. And you’re seeing the rendering
performance. So on the top, we have WebP. And even though it takes more
time to decode the WebP images, because we can ship
them faster, we can decode them faster. And we can display them
on the screen. So in this case, we’re actually winning in both respects. We’re getting better
visual performance. We’re shipping fewer bytes. The user’s paying less for
the bytes transferred. And we’re paying less
for the bytes transferred from our server. So it’s a win all around. And on this gallery, we were
able to compress it from 750 kilobytes down to 474. So a really cool case study,
and I think it illustrates that a lot of our web apps can
benefit a lot from WebP. So we talked about
performance. Let’s dive a little bit deeper
into tooling, into how do we actually create these files
and how do we serve them. So long story short, there
are plugins and converters that we provide. You can go to our site,
download the binary. And you have a cwebp and
a dwebp, which is an encoder and a decoder. And you can convert any image. We give you a lot of different
knobs– if anything, there’s too many knobs– to tweak the quality
and levels and all the other stuff. That’s kind of the manual way to do it.
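For example, a build step can just loop over an asset folder and shell out to cwebp. The TypeScript sketch below assumes the cwebp binary is installed and on your PATH; the assets/ directory and the -q 80 quality setting are illustrative choices, not recommendations from the talk.

```typescript
import { execFileSync } from "node:child_process";
import { readdirSync } from "node:fs";
import { join } from "node:path";

// Write a .webp sibling for every JPEG/PNG in a (hypothetical) assets folder
// by invoking the cwebp encoder from the libwebp command-line tools.
const ASSETS = join(process.cwd(), "assets");

for (const name of readdirSync(ASSETS)) {
  if (!/\.(jpe?g|png)$/i.test(name)) continue;
  const input = join(ASSETS, name);
  const output = input.replace(/\.(jpe?g|png)$/i, ".webp");
  // "-q 80" is an illustrative lossy quality level; tune it per corpus.
  execFileSync("cwebp", ["-q", "80", input, "-o", output]);
  console.log(`${name} -> ${output}`);
}
```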
You can also get plugins for all of the popular image editors. The Wikipedia page actually is
a great resource to start, if you want to get started
with WebP. They have a great matrix of all
the different tools out there and how to get
WebP support. And with time, as more people
adopt it, of course, this will also improve. And I’ll also call out
a couple of tools like language bindings. So there are great libraries for
virtually every language runtime available. So you can do these
optimizations at build time or at runtime. And there’s some other tools
which we’ll demo a little bit later. In terms of the actual adoption
and support today, of course, Chrome supports WebP. Opera also supports WebP
and has been supporting WebP for a long time. We also ship native libraries,
or we make the native library available for iOS. And it’s also supported
natively by Android. So you can deploy both on native
apps and on the web. There are ways to basically
polyfill support for WebP on other browsers. So you can actually have a
JavaScript decoder for WebP. I’m not saying that’s
necessarily a great idea, but you could do it if
you wanted to. And we’ll show you a couple of
examples of how to do perhaps a better deployment strategy. And of course, we’re also
working with the Firefox team to get WebP supported. And fingers crossed, I hope
we will see it this year– sometime later this year. So let’s start deploying
WebP on the web. Great– we’ve motivated
the use case. We know that we should
be optimizing images. How do I even get started? And in a nutshell, there are a
couple different approaches. You can use server detection, or
server-side detection, and client detection. And we’ll talk through
both paths. But one thing I’ll point
out is they’re not mutually exclusive. You can and probably should use
both, depending on your application. So the client-side detection
is an easy case. Basically, you just
want to test– the client loads some
JavaScript. And you want to find out if
the client supports WebP. Because then you can dynamically
inject the image tags and fetch the appropriate
asset. So you can have both a JPEG
and a WebP on your server. And for example, Modernizr
provides a one-line test function that you can run
to determine if a client supports it. If you actually unroll, and
you look at the underlying implementation, it’s basically
three lines of JavaScript in there. They just inject an image tag. And they add an onload handler
to make sure that the image is decoded properly. So you don’t need Modernizr
to do this. You can just inline this
code directly. And I mentioned the JavaScript
decoder. You can use that as well. So in fact, if you only
want to serve WebPs, you can do that. It would just serve the
JavaScript decoder, and then your images would
load just fine. So that’s client-side.
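Here is a minimal TypeScript sketch of that client-side check: inject an image, see whether it decodes, then pick the asset accordingly. The tiny base64 test image and the /images/hero.* paths are illustrative assumptions, not code shown in the talk.

```typescript
// Resolves true when the browser can decode WebP. The data URI below is a
// tiny lossy WebP commonly used for feature tests; if it fails to decode,
// we simply report "no support".
function supportsWebP(): Promise<boolean> {
  return new Promise<boolean>((resolve) => {
    const img = new Image();
    img.onload = () => resolve(img.width > 0 && img.height > 0);
    img.onerror = () => resolve(false);
    img.src =
      "data:image/webp;base64," +
      "UklGRiIAAABXRUJQVlA4IBYAAAAwAQCdASoBAAEADsD+JaQAA3AAAAAA/vv9UAA=";
  });
}

// Usage: inject the right asset once support is known.
async function insertHero(container: HTMLElement): Promise<void> {
  const img = document.createElement("img");
  img.src = (await supportsWebP()) ? "/images/hero.webp" : "/images/hero.jpg"; // hypothetical paths
  container.appendChild(img);
}
```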
Server-side is a little bit, I think, more interesting and more scalable for a lot
of applications. So the way this works is when
the client makes a request for an image, it sends an Accept
header, which advertises which file types it supports. So this is very similar to, for
example, a compression. So gzip negotiation, where the
client advertises their support of gzip, and then the
server returns a response which is compressed. Similarly here, the
client advertises, hey, I support WebP. And then the server can select
the appropriate asset, in this case, a WebP file. So Opera provides the Accept
header, which advertises WebP. And we just recently landed
support in Chrome for that, as well. So it’s currently in Canary. And it will be making it to
Stable soon, once we go through the release cycles. So with that, basically, your
server can just look at the Accept header and serve the
appropriate one, such that when, for example, an IE user
comes, they won’t advertise WebP, for obvious reasons. And you can just serve
a PNG file. And this logic is handled
by the server. So you don’t have to modify
your application. This is the best part. And I’ll show you some examples later in an actual demo.
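As a rough illustration of that server-side negotiation, here is a small TypeScript (Node) sketch. It assumes a pre-encoded .webp sits next to each .jpg/.png under a public/ directory; that layout, the port, and the fallback behavior are assumptions for the example, not part of the talk.

```typescript
import { createServer } from "node:http";
import { promises as fs } from "node:fs";
import { join } from "node:path";

// Serve the .webp variant when the client advertises image/webp in Accept.
const ROOT = join(process.cwd(), "public"); // hypothetical asset directory

const server = createServer(async (req, res) => {
  const urlPath = (req.url ?? "/").split("?")[0];
  const accept = req.headers.accept ?? "";
  const isImage = /\.(jpe?g|png)$/i.test(urlPath);

  let filePath = join(ROOT, urlPath);
  let contentType = urlPath.toLowerCase().endsWith(".png") ? "image/png" : "image/jpeg";

  if (isImage && accept.includes("image/webp")) {
    const webpPath = filePath.replace(/\.(jpe?g|png)$/i, ".webp");
    try {
      await fs.access(webpPath); // only switch if the .webp variant exists
      filePath = webpPath;
      contentType = "image/webp";
    } catch {
      // fall back to the original format
    }
  }

  try {
    const body = await fs.readFile(filePath);
    // Vary tells caches that the response depends on the Accept header
    // (see the CDN question in the Q&A later in this session).
    res.writeHead(200, { "Content-Type": contentType, Vary: "Accept" });
    res.end(body);
  } catch {
    res.writeHead(404);
    res.end("not found");
  }
});

server.listen(8080);
```

In practice, as discussed below, a module like PageSpeed or a CDN can do this negotiation for you instead of hand-rolled code.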
A slightly different approach– and this is actually very
popular with a lot of web optimization products today– is to rely on user-agent
sniffing. So the reason this exists today
is because we didn’t have the Accept header. So we’re fixing this problem. We’re making it much better
and easier to deploy. But you will find applications
or approaches that will use this strategy, where they will
check the user agent and say, hey, I know that Chrome
supports WebP. Hence I will serve the HTML
markup, which has WebP files embedded in it, or references
to WebP files. Whereas for Internet Explorer, I
will serve the JPEG version. So then there’s a couple of
other quirks that you need to worry about– for example,
caching. If now we’re generating multiple
versions of the HTML, one with WebPs and one with
JPEGs, how do we make sure that the IE user agent doesn’t
get the WebP assets? This is why we’re marking this
document as private, which is to say, the client
can cache it, but intermediate servers can’t. So some intermediate proxy
can’t cache it. So those are the kind of do-it-yourself approaches.
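For completeness, here is what that user-agent-sniffing strategy might look like in the same TypeScript setting. The browser regex and the markup are made-up placeholders, and the Accept-header approach above is the cleaner path going forward.

```typescript
import { createServer } from "node:http";

// Legacy-style user-agent sniffing, as described above: serve markup that
// references .webp assets only to browsers we assume can decode them.
function pickImageExtension(userAgent: string): ".webp" | ".jpg" {
  return /Chrome|OPR|Opera/i.test(userAgent) ? ".webp" : ".jpg";
}

createServer((req, res) => {
  const ext = pickImageExtension(req.headers["user-agent"] ?? "");
  const html = `<!doctype html><img src="/images/hero${ext}">`; // hypothetical asset
  res.writeHead(200, {
    "Content-Type": "text/html",
    // Same URL, different markup per user agent: let the browser cache it,
    // but keep shared proxies from storing one variant for everyone.
    "Cache-Control": "private, max-age=300",
  });
  res.end(html);
}).listen(8081);
```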
The good news is we actually provide tools that will do all of this work for you. So we have a PageSpeed product–
or family of products, I should say– which
are both open source. We have modules available
for Apache and nginx. And we also have a Google
hosted version, which is PageSpeed Service. So let me show you guys
an example of this actually in action. So I have– I’ll make this a little
bit smaller. On my blog, I have
a little gallery of my favorite photos– beautiful images, or at least
I think they are. And you can see here
that I have Tokyo, Oxford, and Aurora. And they’re about 3 megs. Now, I’m serving them as PNGs. That’s just how I happened to
have saved them on my server. 3 megs is probably
a little high. So I think we can do better. So I’m also running
PageSpeed Service. And the way PageSpeed Service
works is I have my origin server, where all
my markup lives. But my WWW domain is actually
routed through Google servers, such that when you visit my
site, you actually hit a Google server, which then
fetches my content. It optimizes it and serves
the optimized assets. So let me show you
the dashboard for PageSpeed Service. We have the overview here. So this is my server,
Google servers. And then the visitor is hitting
the Google servers. And all of the magic
happens here. So we provide a number of
optimizations for things like, let us optimize your CSS, your
JavaScript, and all the rest. And optimizing images, for the
reasons that we mentioned previously, is a
big part of it. So in here, you can see that I
can enable a whole bunch of checkboxes for things like
resizing images on the server. So I’m just going to
click that on. And I’m going to also convert
PNGs to JPEGs. Or I’m going to allow it
to try to convert. It won’t necessarily do it. It will try to make sure that
we get better compression. And let’s apply that
to the site. And let’s go back. And we will refresh this page. And we will actually refresh
it twice, because the way PageSpeed Service works is it
optimizes these images on demand and in the background
such that on the second request, you can see that the
file names have changed. And they are now coming
through as JPEGs. So what we have done is we have
optimized these images in the background. We’ve determined that
JPEGs are a better version of this file. So we went from multiple
megabytes to hundreds of kilobytes. So now the page is
468 kilobytes. So this is done automatically. I didn’t modify my
app in any way. And similarly, I can go
into the API console. And I’m just going
to enable it all. So I’m going to recompress JPEGs
and allow it to compress it to WebP as well. So once again, Apply to Live. And we’ll reload the page
a few more times. And it’s going to now try both
JPEG and WebP and pick the best format for these images. And you can see that it actually
determined that WebP is the best format. It detected that I’m
running Chrome. And now it’s serving these
images, which went from a megabyte plus to 80 kilobytes. So in total, the file size of
this page is 328 kilobytes. So we’ve just decreased the
size of this page by 10x without actually touching the
page itself, which, of course, is beautiful. And notice that this
was all images. So you can use PageSpeed
Service, which is what you saw here. Or you can use one of our
open-source modules like Apache or the nginx module
to do this for you. STEPHEN KONIG: So let’s talk
a little bit about interoperability. I think it’s fair to say we live
in a world in which WebP isn’t ubiquitous yet. We hope we’ll get there. We’re confident we’ll
get there. But just like previous formats
before us, like PNG, there’s this sort of uncanny valley of
time where you have mixed support across the web. And you have to deal
with that today. In fact, some of you
may have seen– a couple weeks ago, Facebook
actually did a field trial where they actually started
serving user photos as WebP instead of as JPEGs. It generated a slight
bit of controversy. Actually, I think overall, from
what we know, they’re very pleased with the
results they saw. But it did highlight some
interoperability challenges that you have to be aware of and
for which we think there are some good solutions. That said, this is a
short-term problem. And just like PNG eventually
got ubiquitous support, we feel confident WebP
will as well. And a lot of these issues
will go away. But in the short term,
these are some things to be aware of. So link sharing is one. So I look at my beautiful
site. I see this awesome
image in WebP. I’m running Chrome or Opera. It looks beautiful. I right-click, copy
link to image. I paste it in an email
to my wife. And I say, honey, this
is really cool. You should check it out. She happens to be
running Firefox. And she clicks the link,
and sad face. Because it can’t decode
a WebP image. So a good way to solve this
problem is the Accept header negotiation that we talked
about earlier. Because in that case, what the
server will see when it gets the request, even if it has a
.webp attachment to it, it won’t see the image/webp
in the Accept header. It will know it’s not safe to
return WebP to that client. And in that case, you can
actually return a JPEG. It doesn’t matter what
the extension is. You can serve the right image
format based on Accept header. Similarly, there’s an
issue with Save As. Right click an image,
Save As to disk. And then I email it to somebody,
or I happen to just double-click it from
my File Explorer. And oops, I don’t have an image
viewer that’s capable of rendering that. Or the person I send it to
doesn’t have one, either. So there’s a couple of ways to
fix that, one of which we’ve already taken care of. So starting with M28, Chrome is
actually a file handler on all platforms for WebP. So if the user has Chrome,
they’ll at least be able to open a WebP file. But another sort of approach
to this is to provide an explicit download option link
and have that always serve a common format such
as a JPEG or PNG. There’s some user experience
benefits to that as well. Because the image that the user
sees on the screen might not be the one that
you want them to be able to download. For example, they might want to
download a full-resolution version of the image versus
something that’s scaled down. So there’s some other reasons why that approach might make sense.
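A sketch of that explicit download endpoint in TypeScript: whatever format is displayed inline, the link always returns a JPEG with a Content-Disposition header. The /download route, the query parameter, and the originals/ directory are assumptions for the example.

```typescript
import { createServer } from "node:http";
import { promises as fs } from "node:fs";
import { basename, join } from "node:path";

const ORIGINALS = join(process.cwd(), "originals"); // hypothetical full-resolution JPEGs

createServer(async (req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (url.pathname !== "/download") {
    res.writeHead(404);
    res.end();
    return;
  }
  const name = basename(url.searchParams.get("img") ?? ""); // strip any path parts
  try {
    const body = await fs.readFile(join(ORIGINALS, `${name}.jpg`));
    res.writeHead(200, {
      "Content-Type": "image/jpeg",
      "Content-Disposition": `attachment; filename="${name}.jpg"`,
    });
    res.end(body);
  } catch {
    res.writeHead(404);
    res.end("unknown image");
  }
}).listen(8082);
```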
Let’s look at a few case studies in the wild and see some of the benefits we’ve
gotten from WebP. So earlier this year, we posted
a blog post about some work we did on the
Chrome Web Store. So for those of you who aren’t
familiar with it, Chrome Web Store is a great site. Lets you browse tons of add-ons,
extensions, and applications for Chrome. But it’s a very image-heavy
site– tons of promotional tiles, big
tiles, little tiles, tiles rotating in and out. It’s a gorgeous site, but it’s
definitely very image-heavy. So we went through a process
of converting all of those images to WebP and
serving them to Chrome and Opera users. When we did that, we got an
average 30% reduction in image size across all the images
on the site. But more importantly,
we reduced page load time by a third. And that was pretty
significant. And it really, at the end of the
day, turned out to be not a tremendous amount of work. Really, we just request– we
have a server that serves image assets. It’s capable of doing
transcoding. And then we just tell it, hey,
if the user is Chrome or Opera, serve WebP. And magically, things start
getting faster very easy. And one example is
the calendar tile that you see here– two versions of the same tile. One is 32 K and one is 8.3 K. So
tremendous difference with very little plumbing work
that was needed to get that to happen. And we’re in the process of
deploying WebP across all Google sites that
serve images. So some of these are
supported today. Others you will see support
rolling out across the rest of the year. Some of them are in various
stages of experimental and field trial. But literally across the gamut
of sites across Google, we are in the process of doing
all this transition. And we’ll be doing that
throughout the course of the year. We’ll talk about one more
of them in a moment. ILYA GRIGORIK: So we’re
definitely dogfooding a lot of the WebP deployment
strategies today. And one great example is
actually data compression. So once again, if you saw the
keynotes, Linus actually talked about this, where
we now have a new feature in Chrome. And if you haven’t tried
it, we definitely encourage you to. If you go to the Apps Store and
download Chrome Beta, you can go into your settings. There’s a new tab called
Bandwidth Management. There you can basically enable Chrome
data compression. So the way that works is when
you make a request, we will actually route the request to
a Google server, which will then fetch the content and all
the associated images, compress it, minify it, optimize
it, and then deliver the optimized version to your
site or to your phone, rather. And we find that we get, on
average, 50% data reduction when we do that. So this is a huge benefit
to the user. And guess what? Images are once again the
number one optimization. So in fact, one of the big
optimizations that we do in Flywheel or data compression is
just re-encoding literally every single image to WebP. We just transcode everything
to WebP. And that helps reduce the
data usage quite a bit. This is a great site. I love to share this site. Let’s actually look at
a demo to appreciate it in all its glory. So this is a beautiful animation
built by Oakley. So as I’m scrolling
down, we have this beautiful parallax effect. We have animations coming
in and out. This is using HTML5, lots
of images, nothing special under the hood. But the one trick to
this page is– let’s just reload it here. This is where we definitely
need a good connection. Let’s see. 100%– here we come. 86 megabytes of images to
deliver this great experience. So the way the site is built– and it’s a beautiful
experience. I love it– is it’s delivering a lot of PNGs
with transparencies in the background. And so as you move, these are
basically layers that are being positioned right
within the browser. So 86 megs is quite heavy. So as just an experiment, we
said, look, WebP can probably do a bit better,
especially with transparency, as you guys heard. So we can take the 86 meg,
without any modifications to the site, and convert
it to 28 megs. So let’s be clear. 28 megs is still massive
for a web page. But it’s 60 or 58 megabytes
less, which is pretty good, I think. And so some cool tools or some
tips for getting started. And if you just want to
experiment with WebP, there’s a great service. It’s a free service called
IMG2WebP.net. You can just go there. You can drag in an image, any
image off your local file system, and just play
with the settings. And you have a bunch
of sliders. And you can see the
output without installing any other tools. So this is not something that
you want to use to convert all of your images to
WebP, but it’s a great way to get started. If you’re on Windows, there’s
a great new tool that was released just a couple of weeks
ago where you can punch in the name of the site or
just the URL of the site. It’ll download your site. And it’ll recompress all the
images with WebP and basically give you a report card saying,
if you converted to WebP with these settings, then you would
save this amount of bytes. And you can actually change and
tweak the settings, which is the cool part. So you can change your
compression quality levels and all the rest. STEPHEN KONIG: And it lets
you see your site with WebP as well. So you can actually do a little
side by side visual comparison. ILYA GRIGORIK: So
awesome tool. Definitely check it out. And then finally, if you’re
ready to start converting your images, as Stephen mentioned,
you can do that at deploy time or at build time. There’s a number of great
plugins for things like Grunt and other build tools which can
run at build time and just convert all of your images
and make them available on the server. STEPHEN KONIG: So let’s
talk a little bit about WebP on mobile. Full screen. Oh, I want to go
to full screen. So we talked a lot about
deploying it on sort of the desktop web. The good news is there are great
ways to deploy this on the mobile web as well, on our
mobile devices for native applications. We’ll talk a little
bit about that. One case study– so the Google+ app, today on
Android, and actually for about the last month or so, has
been serving WebP images. So you probably haven’t
noticed. And that’s actually the point,
that you’re getting faster performance and faster loads
for the same quality image, and you didn’t have
to do anything. And that’s the whole idea. But starting about a month and a
half ago, we’ve been serving WebP images to users of
the G+ app on Android. On average, we’ve been actually
seeing about a 50% reduction in image
sizes with WebP. And the reason for that is WebP
actually does a pretty good job with user-generated
photos as opposed to other types of images, especially
with photos. So that’s why we’re able to
get even more than typical kinds of savings. We’re literally, as a result
of that change, saving many terabytes of bandwidth
per day. And literally also because of
the cost factor– we’re saving our users money. So it’s cheaper to use this app
now than it was before. And I think that’s a
pretty cool win. So how would you deploy
this on Android? So the code you see here is
actually using the native library, which is supported on
all versions of Android. So WebP support is baked in at
the OS level, starting with ICS and above. But if you want to be able to
target versions of Android prior to that, you
can use a native library that we provide. So it’s pretty simple. Basically, you get
a WebP image. You decode it as a bitmap. And then you render it. So I’ll let you guys read
through all the code. But that’s pretty much
the process– pretty straightforward. ILYA GRIGORIK: I think the
point here is that it’s actually no different from
any other image. STEPHEN KONIG: Exactly. And same sort of
story on iOS– we provide a library that you
can use if you’re building native apps for iOS. Pretty much the same thing–
grab the image from the server, decode it, stick it into
a byte array, render it, and you’re done. So it’s pretty straightforward
to deploy and use. ILYA GRIGORIK: So as a recap– I think we’ve covered a
lot of ground here. For deploying WebP, you can
use it on native apps. That’s very easy
to get started. We have iOS solutions. We have Android solutions. You can go far back with the
backport as well on Android. And on the web, we have server detection, client-side detection. We looked at a couple different
strategies. And personally, I think actually
the automation part of deploying WebP is the easiest
way to get started. So PageSpeed is one way. There are other providers that
will convert these images and serve them dynamically
for you. Just recently, we actually had
an interesting experiment deployed with a company called CDN
Connect, which does image optimization. What they allow you to do is
just upload any image that you want into their CDN. They will do all of
the optimization. They’ll handle the Accept
negotiation. You guys don’t even have
to think about it. And the user gets the
optimized assets. So hopefully, this motivated
why you should take a look at WebP. And hopefully you’ll deploy
them in your applications. And if you guys have any
questions, please grab one of the mics. And we’ll be happy
to answer them. [APPLAUSE] AUDIENCE: Hi. You guys talked about
encoding– or you showed example of
decoding on Android and iOS. I was just wondering if it’s
likely or realistic to encode on Android or iOS in the process
of an application where you’d be taking a picture
and then want to send it to a web service and
encode it first. Can you talk about that? STEPHEN KONIG: Yeah, I think you
could certainly do that. The libraries we provide today
are focused on decode because that’s the predominant case. But literally, they’re nothing
more than a port of the full library that you find
on our site. So I think we could certainly
look at that. Or it’s something I think you
could do today, just porting that code for the encoder. AUDIENCE: I have a question
about using the Accept header to serve the right
image format. Right now on our site, all
image links are complete permalinks that give you the
same data no matter where you are or when you are, which is
really nice for a CDN, which just uses the URL
as a cache key. And also any caching proxy
between our user and our CDN can cache based on
just the URL. So I’m wondering what sort of
CDN support do you see for using that Accept header as
part of the cache key. And also how that affects
downstream caching proxies– are they able to distinguish
that? Or do we just have to tell them
that the downstream proxy can’t cache it? ILYA GRIGORIK: So the other part
I didn’t cover around the Accept header is the
Vary Accept. So when you provide a different
image based on Accept header, you also have
to basically indicate to upstream proxies to say, this
image is different based on certain parameters. So that’s why you specify
Vary Accept. And we actually have a good
tutorial that I’ve written up recently for how to make this
work with nginx and Apache and other servers. And working with CDN’s– I’ve talked to Akamai
and others. They’re in the process of
deploying support for Accept. AUDIENCE: And in terms of
proxies that you see users have, do you find that the level
of support for being able to identify that properly
is high enough that you’d be willing to send that
to your users? Or do you see a lot of errors? ILYA GRIGORIK: So the default
behavior for a lot of proxies today is, if they see anything but Accept-Encoding in the Vary header, to not cache it. So that means that it’ll
just bypass that cache. So it won’t break your site,
which I think is a more important part. AUDIENCE: OK. Thanks. AUDIENCE: Yes, hi. How do you see WebP being used
in medical radiology? STEPHEN KONIG: In– sorry, which? AUDIENCE: In radiology. STEPHEN KONIG: Good question. I’m not sure. We think there’s lots
of applications. We’ve been heavily focused
on the web for sort of obvious reasons. But having said that, I think
there are a lot of additional types of applications where
WebP makes a lot of sense. In the cases where storage is a
primary concern, WebP is an obvious choice. We can get you the same image
with no loss of visual quality or purely lossless if you want
for 30 to 50% less bytes. So we haven’t been focused
on those kinds of applications just yet. But I would expect that as we
make more progress in getting it deployed across the web,
it makes a lot of sense. It would make sense for camera
and device makers to store these images as WebP natively. Literally, you’re going to get a
third more images on your SD card by using WebP. But the reality is they’re not
going to do that until there’s better support in sites across
the web for uploading WebP. So we have to get there first
before we can move on to the second tier. But I think applications like
that make a lot of sense. AUDIENCE: Hi. I’m curious about photographs
in particular. Typically, they come off of the
camera in a JPEG format. Re-encoding it– would you
want to use the lossless re-encoding so as to avoid
introducing new artifacts? If so, what kind of savings
do you get there? And so on and so forth. STEPHEN KONIG: You can
certainly do that. Even if you take a JPEG and
recompress it as a lossless WebP, you’re still going to
get byte savings over the original source. So that definitely still
makes a lot of sense. It depends a little bit on the
quality setting that’s coming out of the camera. If the quality is high,
recompressing into a lossy WebP, you’re really not going to
get anything significant in terms of additional compression
artifacts. But it is something that–
as you’re thinking about deploying this– and one of
the things that we do internally is we do a
visual comparison. We take like a couple hundred
to 1,000 images. And we’ll recompress
them as WebPs. And we’ll compare them against
the original JPEGs and do basically an eyeball
comparison. And I think that’s important
because depending on the nature of the images you have on
your site, you may want to tweak the quality settings
to get the right mix. The other thing I would say on
that, too, is that sometimes the issue is about improving
quality with keeping the file size constant. We see that, too, in some
cases, where the site is actually more interested in
saying, you know, gosh, those thumbnails we served today
are really crappy. They look terrible. But we can’t afford to increase
the JPEG quality because it would cost
more bytes and make the page slower. WebP is actually a great way
to solve that problem, too. So there’s lots of different
ways you can use WebP. So at the end of the day, it
just becomes a case where you have to compare them and get
to the level that you’re comfortable with based on what
your corpus looks like. AUDIENCE: Great. Thanks. ILYA GRIGORIK: Maybe I’ll just
add one more thing, which is in our encoder tool, when you
look at different quality levels, so you can actually say
75, 80 and others, we also have a couple of profiles
built in. There’s a photo profile
and a few others. So I still encourage you to play
with them and kind of get a feel for how it looks. But there’s some presets
in there. AUDIENCE: Hi, I’ve got
three questions. The first one is, is there a way
to do soft tiling and sub resolution image decoding? The first question. The second one is, is the key
frames used in VP9 the same decoder that WebP is based on? And the last question– is there any vector instruction
supporting iOS, for the iOS library? STEPHEN KONIG: So your last
question, I don’t know. I’ll have to follow up
on that for you. In terms of VP9, no. So the WebP is based
on a VP8 key frame. We’ve looked at, does it make
sense to transition to VP9? And for a variety of reasons
that I can get into offline, we feel like it doesn’t. So we’re going to be staying
on the VP8 frame for that. And I apologize. I’ve forgotten your
first question. But we can follow up offline. AUDIENCE: OK, yeah. It’s to do with like 16 K by
16 K texture maps or tiles. Can you generate a
thumbnail without decoding the entire image? That’s sub resolution
image decoding. And the last one was soft tiling
to grab a region of that 16 K without decoding
the entire thing. STEPHEN KONIG: We’re looking
at techniques like that. We don’t have direct support
for it today. But those are things
we are looking at. AUDIENCE: You guys mentioned
that encoding is 5 to 10 times slower compared to JPEG. Is there anything
on the horizon where that would improve? STEPHEN KONIG: So I mentioned
performance optimizations is one of the things we’re
working on. And encoding performance is
probably the top of that list. I would say, just in general,
the reason WebP is smaller is because we spend more CPU
cycles to compress. And we use more sophisticated
algorithms. So we will never
get one to one. That’s just not going
to happen. What we’re shooting for, our
goal, is somewhere around 2 to 3x, is where we hope to be. AUDIENCE: I’d like to mix like
JPEG and WebP files in the cache on Android. Can the WebP native library detect that this is a WebP image and this is a JPEG image and decode that too? Does it redirect to JPEG and
PNG bitmap decompression? ILYA GRIGORIK: I don’t think
the WebP library will do that for you. But there are great tools
available where you can just read the header of the file,
of any file, and basically determine the file type and then just pick the right path in your code.
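That header check is straightforward to do yourself. The byte signatures are the same on any platform; here is a minimal TypeScript sketch of the idea, with a hypothetical cache-entry path in the usage comment.

```typescript
import { openSync, readSync, closeSync } from "node:fs";

type ImageKind = "webp" | "jpeg" | "png" | "unknown";

// Read the first few bytes of a file and classify it by signature:
//   JPEG: FF D8 FF,  PNG: 89 50 4E 47,  WebP: "RIFF"...."WEBP" (bytes 0-3 and 8-11).
function sniffImageKind(path: string): ImageKind {
  const fd = openSync(path, "r");
  const buf = Buffer.alloc(12);
  readSync(fd, buf, 0, 12, 0);
  closeSync(fd);

  if (buf[0] === 0xff && buf[1] === 0xd8 && buf[2] === 0xff) return "jpeg";
  if (buf[0] === 0x89 && buf[1] === 0x50 && buf[2] === 0x4e && buf[3] === 0x47) return "png";
  if (buf.toString("ascii", 0, 4) === "RIFF" && buf.toString("ascii", 8, 12) === "WEBP") {
    return "webp";
  }
  return "unknown";
}

// Example: route to the right decoder based on the sniffed type.
// console.log(sniffImageKind("cached/asset-1234")); // hypothetical cache entry
```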
AUDIENCE: So I have to do it manually. ILYA GRIGORIK: Correct. AUDIENCE: Does the backport
library support all versions of Android? STEPHEN KONIG: I believe
everything Froyo and above. AUDIENCE: Can you provide some
insight on how WebP behaves with images that are a
combination of PNG image and JPEG image– would be great. So for example, a screenshot
or something like that. STEPHEN KONIG: So I’m not sure
what your question is. Can you elaborate
a little bit? AUDIENCE: It seems like WebP
is able to behave great on photos or to replace PNG. What if the image is a
combination of both? STEPHEN KONIG: I see. So in general, what you saw,
for example, on G+, is that WebP does fantastically well
in cases of photos, traditional photos you
see from cameras. It does really well on just
about everything. It just does even better
on things like photos. So if you look back in
the slides, I had a chart of 1,000 PNGs. And they were different
types of PNGs. Some of them were photos. Some of them are animated,
just drawings. Some of them are screenshots. And 98 and 1/2, 99% of those
images are smaller in WebP than they are in PNG. There’s a little bit of a thing
at the end where the PNG is smaller. Those actually are cases where
the image itself is like four by four, something
really tiny. And what you find out is that
the container for WebP actually is bigger than
the image size. And so it winds up being bigger
than the original one. So if your images are a lot like
that, then I would tell you, stick with PNGs. The other case that we see that
I highlighted is cases where if you’re generating
images as part of your serving pipeline– so you have something that
you’re then creating an image from, and that’s being generated
dynamically. It’s not so much a visual
quality issue there as more of a performance issue. Because it takes longer
to encode that as WebP versus PNG. So that’s another case where PNG
would be a better choice. But in general, I would just say
what I said earlier, which is at the end of the day, you
have to do a visual comparison to make sure you’re comfortable
and to make sure also all the knobs and buttons
and things that we provide to let you control those settings
for WebP are where you want them to be. ILYA GRIGORIK: Right, so maybe
one more thing that I’ll add is if you guys have any
follow-up questions later about WebP, we do have the
WebP-discuss Google group. All of our engineers
monitor it. We monitor it. So if at any point you guys are
experimenting with it or have questions about WebP,
please post there. And otherwise, thank
you for coming up. STEPHEN KONIG: Thanks. [APPLAUSE]