The most important difference between film and digital shooting is exposure, if there is any difference at all.
With film you exposed for the luminance of the tonal range that mattered most and you were done. The natural gamma of film (the gamma notion came from film material and was only later adapted to digital via TV encoding) placed this range in the steepest (or flattest, depending on how you want to view it) part of the film's sensitivity curve, while highlights were clipped smoothly. The rest was lab work, and even the simplest services performed the required corrections without asking any questions.
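To make the "smoothly clipped highlights" idea concrete, here is a minimal Python sketch of a film-like tone curve: a gamma ramp for the midtones plus a tanh shoulder that rolls highlights off gently instead of cutting them hard. The gamma value and shoulder position are illustrative assumptions, not measurements of any real film stock.

```python
import numpy as np

def film_like_curve(linear, gamma=2.2, shoulder=0.8):
    """Toy film-style tone curve: a gamma ramp in the midtones with a
    smooth 'shoulder' that compresses highlights instead of hard-clipping.
    Parameters are made up for illustration."""
    x = np.clip(linear, 0.0, None)
    y = x ** (1.0 / gamma)                 # midtone gamma section
    mask = y > shoulder                    # everything above the knee...
    y[mask] = shoulder + (1.0 - shoulder) * np.tanh(
        (y[mask] - shoulder) / (1.0 - shoulder)
    )                                      # ...rolls off smoothly
    return np.clip(y, 0.0, 1.0)

# Even 2x "overexposure" stays just below full scale thanks to the shoulder.
print(film_like_curve(np.array([0.05, 0.18, 0.5, 1.0, 2.0])))
```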
When you shoot JPEG the camera imitates this behaviour, but when you want to go beyond that and shoot RAW, things get more complicated.
One issue is white balance (WB). Either you choose it or the camera does, but it affects the process above and is hard to correct afterwards. When you shoot RAW and control the exposure with the histograms, you have to watch each of the RGB channels, and the histograms are computed after the WB that was active during the shot has been applied. This can lead to "burned" highlights when you correct the WB later: in the histogram you saw on the display the BLUE channel looked OK, but the correct WB requires a higher blue gain, and the blow-out becomes visible. There is a way to prevent this, called UniWB (setting the in-camera WB multipliers to unity so the histogram reflects the raw data), but the question still remains what you are exposing for.
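A small Python sketch of this trap, with made-up pixel values and hypothetical WB multipliers (real raw converters also do black-level subtraction and demosaicing in between, which is omitted here):

```python
import numpy as np

def apply_wb(raw_rgb, wb_gains):
    """Apply per-channel WB multipliers and flag channels that clip
    (exceed full scale). Numbers throughout are illustrative only."""
    out = raw_rgb * wb_gains
    return out, out > 1.0

# Raw sensor values for a bright pixel, as fractions of full scale.
raw = np.array([0.45, 0.60, 0.80])      # R, G, B

# WB active during the shot: blue gain is modest, histogram looks safe.
shot_wb = np.array([1.6, 1.0, 1.1])     # hypothetical multipliers
print(apply_wb(raw, shot_wb))           # blue -> 0.88, no clipping shown

# The correct WB chosen later needs a higher blue gain: blue blows out.
correct_wb = np.array([1.3, 1.0, 1.4])  # hypothetical multipliers
print(apply_wb(raw, correct_wb))        # blue -> 1.12, "burned" in post

# UniWB sidesteps this: unity gains make the camera histogram show
# (approximately) the raw channel values themselves.
print(apply_wb(raw, np.ones(3)))
```

The point of the sketch: the same raw data can look safe or clipped depending purely on the multipliers applied before the histogram is drawn, which is exactly why UniWB exists.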