What sense does it make for "sharpness" to be adjustable on a monitor?

Solution 1:

Per https://www.cnet.com/uk/how-to/turn-down-your-tv-sharpness-control/ , "sharpness" on an LCD is part of the post-processing.

Even leaving aside rescaling/upscaling (e.g. when an SD signal has to be displayed on an HD monitor) and the complexities of colour calibration, the monitor does not always display the image exactly as it is given. This is an unfortunate side effect of marketing.

Monitor manufacturers like to distinguish their product from other products. From their point of view, if you feed the same signal to their monitor and a cheaper competitor, and it looks identical, this is bad. They want you to prefer their monitor. So there's a bunch of tricks; usually out of the box the brightness and contrast are wound right up beyond what is sensible. The "sharpness" is another trick. How do you make your picture look sharper than a competitor that is displaying the picture exactly as sent? Cheat.

The "sharpness" filter is effectively that used in Photoshop and similar programs. It enhances the edges so they catch the eye.

Solution 2:

Original Question: Where exactly does the degree of freedom for adjusting sharpness come from?

Sharpness is directly related to the type of signal and content you are viewing. Movies typically look better when sharpness is turned down and the pixels are allowed to blur together a bit. On the other hand, a computer display would want high sharpness for clear text and sharp images. Video games are another example where higher sharpness is better. Low quality TV signals also can be enhanced with sharpness controls.

Since monitors can be used to display a computer desktop, a movie, or virtually any other video source, sharpness is still a useful setting.

https://www.crutchfield.com/S-biPv1sIlyXG/learn/learningcenter/home/tv_signalquality.html

EDIT: The OP has indicated in comments that this does not answer the question.

OP: Where in the problem is there room for any adjustment? Like if I tell you x = 1 and y = 2, and then say "oh, and I want x - y = 3". That makes no sense.

The process of converting a live image/video into electrical analog/digital signals, transmitting it over some medium, and recreating that image on a display device is NEVER a one-to-one process.

Signal noise, compression loss, manufacturing and equipment variations, cabling/signal type, and other factors come into play. All the adjustments on a monitor are designed to work together to give the end user the highest-quality viewing experience - according to the end user. The interpretation is entirely subjective.

OP: This answer does not answer the question of why have the viewer adjust the sharpness when this is already defined by the content creator (be it Spielberg or Excel).

If we are to follow this logic, then why do monitors need or have ANY adjustments at all? The answer is that what we see on the screen is not a 100% accurate representation of the original data.

Solution 3:

The answer is that a pixel is not what you think it is. There is no one-to-one correspondence between digital pixels and physical pixels, because of subpixel rendering. The exact way colors are laid out differs from monitor to monitor, but most LCD monitors have distinct RED, GREEN, and BLUE elements for each pixel, arranged in stripes or triads depending on the panel. Some additionally have a white subpixel, making a quad of elements per "pixel".

[Image: examples of different subpixel layouts]

Thus, not all layouts are created equal. Each particular layout may have a different "visual resolution", or modulation transfer function limit (MTFL), defined as the highest number of black and white lines that can be rendered simultaneously without visible chromatic aliasing.

Monitor drivers allow renderers to adjust their geometry transform matrices so that they compute the values of each color plane correctly and get the most out of subpixel rendering with the least chromatic aliasing.
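To make that concrete, here is a rough sketch, assuming the common vertical RGB-stripe layout rather than the triangular one described above (the name `subpixel_pack` is made up for this example): a renderer can treat the three subpixels of each physical pixel as independent greyscale samples, effectively tripling horizontal resolution.

```python
import numpy as np

def subpixel_pack(hires: np.ndarray) -> np.ndarray:
    """Map a greyscale rendering at 3x horizontal resolution onto the
    R, G, B subpixels of an RGB-stripe panel.

    hires: 2D array of shape (H, 3*W), values 0..255.
    Returns an RGB image of shape (H, W, 3).
    """
    h, w3 = hires.shape
    if w3 % 3 != 0:
        raise ValueError("width must be a multiple of 3")
    # Three adjacent high-resolution samples become the R, G and B values
    # of one physical pixel: more horizontal detail, at the cost of colour
    # fringing (chromatic aliasing) on hard edges.
    return hires.reshape(h, w3 // 3, 3)
```

Real subpixel renderers (ClearType-style text rendering, for example) then filter across neighbouring subpixels to trade some of that extra resolution back for less visible fringing - which is exactly the blending described next.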

The "sharpness" on your monitor reduces the natural blending algorithm used to make lines appear to be contiguous when they are not. Turning the sharpness up will increase chromatic aliasing while producing cleaner lines. Reducing the sharpness will give you better color blending and smooth the lines that fall between the subpixel dot pitch.

For more detailed information, see this article: https://en.wikipedia.org/wiki/Subpixel_rendering