
View Full Version : Colour code formation in HTML



ellenconnel
Jun 9th, 2017, 10:11 AM
Hello Folks,

I am having trouble creating colour codes in HTML. I am confused about how to build a colour code from RGB colour values. I have been given many suggestions to use the online tutorial posted on W3Schools (http://www.w3school.in/?utm_source=Coding%20Forums&utm_medium=Coding%20Forums).
Can you please share your approach to forming colours in HTML?

Thank you

jmrker
Jun 9th, 2017, 05:39 PM
What exactly is it that confuses you?
What colors are you trying to create?
Where do you want to use them?
:confused:

deathshadow
Jun 9th, 2017, 10:14 PM
Well, if you're being pointed at W3Fools, that's half your problem -- but really do you know anything about emissive colourspace?

Remember in school how "yellow and blue makes green"? Yeah, that's bullcookies. 100% fabrication. When it comes to pigments -- aka reflective colourspace:

yellow + magenta = red
yellow + cyan = green
cyan + magenta = blue

So when mixing with paints, the primary colours are NOT red, blue, and yellow; they're CYAN, MAGENTA, and YELLOW. In theory all three mixed together make black, but since pigments dilute, a separate black is usually added, hence the CMYK colourspace used in print.

RGB is just the polar opposite. It's "emissive colourspace", which means your colour wheel is shifted 60 degrees: light frequencies are added together by the light source (screen), not subtracted by pigments (print/paints/etc.):

red + green = yellow
red + blue = magenta
green + blue = cyan

At the heart of it that's all you really need to know. Brown is just dark orange -- how do you get orange? full red + half green. #FF8000

When using the shorthand hexadecimal RGB, all you need to know is hexadecimal. Is that where you're getting tripped up -- or just further tripped up? It's pretty simple. Just as decimal is 0..9 per digit, hexadecimal is 0..15 per digit, with 10..15 written as A..F. Each successive digit position is worth another power of 16, so:

hex == decimal
0x10 == 16
0x11 == 17
0x1A == 26
0x20 == 32
0xFF == 255 (15 * 16 + 15)
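That table can be checked with a few lines of code. Here's a minimal sketch (the function name is my own) of the positional arithmetic:

```python
# Illustrative sketch: how a hex value maps to decimal. Each hex
# digit position is worth another power of 16, just as decimal
# digit positions are worth powers of 10.

def hex_to_dec(h):
    """Convert a hex string such as 'FF' to its decimal value."""
    digits = "0123456789ABCDEF"
    value = 0
    for ch in h.upper():
        value = value * 16 + digits.index(ch)
    return value

print(hex_to_dec("10"))  # 16
print(hex_to_dec("1A"))  # 26
print(hex_to_dec("FF"))  # 255  (15 * 16 + 15)
```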

Whilst some very old browsers don't support it, you might be more comfortable using rgb() instead of the hexadecimal approach, keeping in mind that each colour channel is a 0..255 value from black to full intensity -- aka the range you can state with a full 8 bits. That's the key to why hexadecimal is big on computers: being on/off 0/1 affairs, they work best with powers of two, and one hexadecimal digit is exactly 4 bits.

As such the pure primaries, their secondaries, and white are:
Blue == #0000FF == rgb(0,0,255)
Green == #00FF00 == rgb(0,255,0)
Cyan == #00FFFF == rgb(0,255,255)
Red == #FF0000 == rgb(255,0,0)
Magenta == #FF00FF == rgb(255,0,255)
Yellow == #FFFF00 == rgb(255,255,0)
White == #FFFFFF == rgb(255,255,255)
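Since each pair of hex digits is one 8-bit channel, converting between the two notations is mechanical. A minimal sketch (function names are my own):

```python
# Sketch: converting between #RRGGBB notation and rgb() channel
# values. Each pair of hex digits is one 0..255 colour channel.

def hex_to_rgb(code):
    """'#FF8000' -> (255, 128, 0)"""
    code = code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

def rgb_to_hex(r, g, b):
    """(255, 128, 0) -> '#FF8000'"""
    return "#{:02X}{:02X}{:02X}".format(r, g, b)

print(hex_to_rgb("#00FFFF"))    # (0, 255, 255) -- cyan
print(rgb_to_hex(255, 128, 0))  # #FF8000 -- orange
```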

As a historical note, the original RGB monitors used by many early 8 and 16 bit computers that supported colour used an 8 or 16 colour palette of exactly that mix. This includes the 16 colours available in text mode on an original IBM 5150 PC with a CGA card, which used an RGBI scheme -- the "I" line meaning 'intensity'. All four data lines were a strictly on/off affair, with each of the RGB lines being the modern-day equivalent of 0xAA; if the intensity line was on, 0x55 was added to all THREE channels: 0xAA + 0x55 == 0xFF. Though if you sent 0b1100, the MONITOR would remove half the green signal strength, so you got brown instead of a dark yellow-ochre.
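That RGBI arithmetic can be sketched in a few lines -- function name and the `brown_fix` flag are my own, modelling the scheme as described above:

```python
# Sketch of the CGA RGBI scheme: each of the R, G, B lines
# contributes 0xAA when on, and the intensity line adds 0x55 to all
# three channels. The monitor-side "brown fix" halves green for the
# dark-yellow combination.

def rgbi_to_rgb(r, g, b, i, brown_fix=True):
    """Each input is 0 or 1; returns an (R, G, B) tuple of 0..255."""
    channels = [bit * 0xAA + i * 0x55 for bit in (r, g, b)]
    if brown_fix and (r, g, b, i) == (1, 1, 0, 0):
        channels[1] = 0x55  # green halved: brown, not dark yellow
    return tuple(channels)

print(rgbi_to_rgb(1, 0, 0, 0))  # (170, 0, 0)    dark red
print(rgbi_to_rgb(1, 0, 0, 1))  # (255, 85, 85)  bright red, #F55
print(rgbi_to_rgb(1, 1, 0, 0))  # (170, 85, 0)   brown
```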

One handy thing to keep in mind is the three-digit colour codes, which are faster and easier to state -- and to do the math on in your head -- than the six-digit version.

#666 == #666666 == Satan Gray
#DDD == #DDDDDD == Carlos Mencia Gray
#F00 == #FF0000 == Pure red
#F55 == #FF5555 == CGA/EGA/VGA bright red
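The expansion rule is simply "double each digit". A one-liner sketch (function name is my own):

```python
# Sketch: a 3-digit shorthand colour expands by doubling each digit,
# so #F50 means #FF5500.

def expand_shorthand(code):
    """'#F00' -> '#FF0000'"""
    code = code.lstrip("#")
    return "#" + "".join(ch * 2 for ch in code)

print(expand_shorthand("#666"))  # #666666
print(expand_shorthand("#F00"))  # #FF0000
print(expand_shorthand("#F55"))  # #FF5555
```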

Oh, and do NOT trust the 'named colours', as they were designed/named for the X11 (aka X-Windows) colourspace on really old hardware with a 6500K colour temperature (many early high-res displays needed fast-reaction-time phosphors, which hadn't been perfected yet, hence the sickly yellow passing for "white"). MOST modern displays are significantly bluer and brighter, with a white 'temperature' of 9500K or higher. A lot of high-end "final generation" CRT displays in the 17" and larger sizes would often have a switch or option to choose between, or even adjust, the white output "temperature".

I often suggest sticking to the 3-digit codes even in deployment, if for no other reason than that many handhelds/mobile devices -- and even some desktops (see the last generation of G4/G5 based iMacs) -- advertised themselves as "16.2 million colour" displays when that was in fact complete BULL: they were 262k colour displays (aka the 18 bits of the original VGA specification) that applied dithering on the monitor side to fake a higher colour depth. It's a practice that continues today, and that dithering with certain colour combinations can impede legibility. Sticking to the 12 bit colourspace of the 3-digit declarations keeps you well away from those limits, though in practice you could use 0x0, 0x4, 0x8 and 0xC with impunity, since that fits the 18 bit colourspace (two extra bits per channel). Anything with finer granularity than that can have... issues on less capable hardware.

Though THANKFULLY most manufacturers have moved away from that practice over the past five to six years, there are still a great many cheaper devices in circulation. There's ALWAYS a cheaper, less capable device people will use -- plan for it. It's why the entire concept of "graceful degradation" is a core concept of PROPER web development!

... and don't be embarrassed to break down and use a colour palette tool if you have to. Some colours you can picture in your head but not figure out the numbers for. Well, I rarely have that problem anymore but I've been doing direct hardware level palette manipulation since RGB monitors became increasingly commonplace on microcomputers some 35 years ago... if you're just starting out with it, when in doubt use a tool... be it a free online one (Google it!) or something built into a paint program like Photoshop, GIMP, Paint Shop Pro (my fav) or even M$ Paint.

Also keep in mind WCAG accessibility minimums when choosing foreground and background colours. The actual guideline math is a bit tricky for normal people to follow, which is why I always point people at WebAim's tool:
WebAIM: Color Contrast Checker (http://webaim.org/resources/contrastchecker/)

When WCAG 2.0 was written, font smoothing and webfonts weren't really a 'thing' yet, so to plan for either I say that for 'normal' text you should now treat Large AAA as the minimum, with 'Large AA' being useless trash. If you use webfonts, no matter how big/small the text, you should use Normal AAA as your minimum. Only if you stick to system fonts at the default metric of 1EM (16px for most people, though again, avoid stating font sizes in pixels) can you use Normal AA as the minimum, but I advise against it.
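For the curious, here's a minimal sketch of the contrast-ratio math that checkers like WebAIM's implement, per the WCAG 2.0 definition (function names are my own). The thresholds are 3:1 for Large AA, 4.5:1 for Normal AA / Large AAA, and 7:1 for Normal AAA:

```python
# Sketch of the WCAG 2.0 contrast-ratio calculation: sRGB channels
# are linearized, weighted into a relative luminance, and the two
# luminances are compared with a 0.05 flare offset.

def relative_luminance(r, g, b):
    """WCAG relative luminance from 0..255 sRGB channel values."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = (linearize(c) for c in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl

def contrast_ratio(fg, bg):
    """Ratio from 1:1 (identical) up to 21:1 (black on white)."""
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))        # 21.0
print(round(contrast_ratio((0x66, 0x66, 0x66), (255, 255, 255)), 2))  # 5.74
```

So #666 text on a white background passes Normal AA (4.5:1) but fails Normal AAA (7:1) -- exactly the sort of borderline case the WebAIM tool flags.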

Strider64
Jun 9th, 2017, 11:39 PM
Well, if you're being pointed at W3Fools, that's half your problem -- but really do you know anything about emissive colourspace?

Remember in school how "yellow and blue makes green"? Yeah, that's bullcookies. 100% fabrication. When it comes to pigments -- aka reflective colourspace:

What you are saying is probably true for COMPUTERS, but you are 100% wrong about yellow + blue = green when it comes to pigments. I am more of an expert when it comes to that, for I was a color matcher (shader) for an automotive paint manufacturer for 19 years. ;)

Here's proof ->
https://c1.staticflickr.com/3/2173/2456779806_49361b0fcc.jpg -- "Day 122 of 366" (https://flic.kr/p/4K6D3w)

That's me in front of the Macbeth color spectrometer and X-Rite color spectrometer. :D

Though we didn't make something less yellow by adding blue; we mostly used black. Still, it's true what they said in school about yellow and blue making green, for that is one of the very first things my boss said to me when he started training me to shade.

deathshadow
Jun 10th, 2017, 05:39 AM
Sorry @strider, but what you were calling blue was probably cyan. In terms of pure pigment mixing it's CMYK... talk to anyone working in print. Yellow and blue makes that muddy, impure green because there's too much red in it, since again, magenta + cyan == blue.

https://en.wikipedia.org/wiki/CMYK_color_model

The RYB colour model:
https://en.wikipedia.org/wiki/RYB_color_model

is outdated, outmoded nonsense with no scientific research or theory behind it. It is flawed, often failing to reproduce even two thirds of the possible variety of hues, and if you mix it evenly for black you get a purple-brown... admittedly CMY isn't perfect either in that regard, giving a more reddish brown, but it's closer to black than RYB ever would be.

But that's why you add black -- no subtractive/reflective colourspace can produce a truly pure black, as the pigments cannot combine in a pure enough fashion to create it... unlike additive/emissive colourspaces, where both black and white are easily produced -- or at least the eye can be tricked into thinking it is white.

It's also why, if you are THINKING in RYB, you are probably working from a slew of pre-mixed pigments and not just three or four -- see how you would rarely mix green yourself, and how many artists use a pure green in addition to the traditional bases... or simply reach for more and more pre-mixes when, laughably, if the pigments were strong enough they could get away with just four or five.

Though pigment strength is another issue, particularly with paints, as I'm sure you're well aware. A pre-mix of strong pigment can be cheaper, since it can contain less actual colourant and more filler material.

RYB is to CMY what a BBS would be to the Internet. It's like blindly trusting that the stories made up by goat herders millennia ago, and codified by creepy old Roman dudes some 1800 years ago, are the only true answer to creation... RYB is the colour-theory equivalent of flat Earth theories, or of thinking that the planets, sun, and stars revolve around the Earth.

It's workable, after a fashion, with a lot of persistence and dicking around... but CMY has science behind it and will get you there quicker. That's why pretty much EVERY modern colour print process since "spot printing" went the way of the Dodo is CMYK.

Though it could be worse, we could get into YUV... which is what almost every damned usability study is based on -- INCLUDING the math for the WCAG colour contrasts, which is just the Y component of YCbCr:

Y = 0.299 * Red + 0.587 * Green + 0.114 * Blue

The 'simple' formula -- assuming full-pixel-width, non-subpixel-hinted fonts -- is that a 50% difference in luma is the minimum for legibility for as much of the population as possible. As I mentioned above, you generally want more than that, since modern subpixel hinting tacks on at least 15% more, webfonts can just plain piss all over it, and the smaller your font size -- particularly if glyph bars and slabs are not filling the pixel -- the bigger a mess it can make of things.
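A minimal sketch of that luma formula and the 50%-of-range rule of thumb, with the weights taken from the equation above (the function name and the black-on-white example are my own):

```python
# Sketch of the BT.601 luma calculation quoted above. With 0..255
# channel values the result is a 0..255 luma; the rule of thumb is
# a luma difference of at least 50% of that range between text and
# background.

def luma(r, g, b):
    """Y = 0.299*R + 0.587*G + 0.114*B, from 0..255 channels."""
    return 0.299 * r + 0.587 * g + 0.114 * b

fg = (0x00, 0x00, 0x00)  # black text
bg = (0xFF, 0xFF, 0xFF)  # white background
diff = abs(luma(*fg) - luma(*bg))
print(diff >= 0.5 * 255)  # True -- comfortably past the minimum
```

Note how heavily green is weighted: pure green (#0F0) has a luma of about 150, while pure blue (#00F) manages only about 29 -- which is why blue text on black is so hard to read.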

Laughably, that same conversion was used on the original IBM VGA and MCGA when displaying colour output on a monochrome display... as yes, there WAS such a thing as VGA monochrome. Many people working in print preferred it over colour, as it was sharper thanks to having no colour mask -- similar to why 80-column text was useless on colour composite but sharp as a tack on monochrome. A monochrome CRT's POSSIBLE resolution is limited only by the phosphor grain size and the bandwidth of the signal. Hence Jobbo the Clown's preference for it on the first TWO generations of NeXT workstations, the LISA, and the original Macs.