Well, it's a bit tricky. Manufacturers do try to make cameras capture what the human eye sees, but several color spaces are in play (for example CMYK for printed stuff - subtractive mixing - and sRGB or Adobe RGB for displayed stuff - additive mixing). Cameras work in some RGB space, but the gamut of human perception is much wider than what most commercial technology can capture and display. For example, when I took some pics of 520 nm and 532 nm lasers, they looked about the same in the photos, even though your eye sees them as clearly different. And blue is the trickiest in my experience so far.
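To illustrate why two nearby laser lines collapse to almost the same display color, here is a rough sketch using a common piecewise wavelength-to-RGB approximation (in the spirit of Dan Bruton's well-known mapping). The function name and constants are my own choices for illustration - this is not a colorimetrically exact conversion, just a demonstration that both 520 nm and 532 nm fall in the same green-dominated segment of the mapping:

```python
# Rough piecewise approximation of a display RGB triple for a spectral
# wavelength. Illustrative only -- real colorimetry goes through CIE XYZ
# and a proper gamut mapping, which this sketch deliberately skips.

def wavelength_to_rgb(nm: float) -> tuple:
    """Approximate an 8-bit RGB triple for a wavelength in nanometers."""
    if 380 <= nm < 440:
        r, g, b = -(nm - 440) / (440 - 380), 0.0, 1.0
    elif 440 <= nm < 490:
        r, g, b = 0.0, (nm - 440) / (490 - 440), 1.0
    elif 490 <= nm < 510:
        r, g, b = 0.0, 1.0, -(nm - 510) / (510 - 490)
    elif 510 <= nm < 580:
        r, g, b = (nm - 510) / (580 - 510), 1.0, 0.0
    elif 580 <= nm < 645:
        r, g, b = 1.0, -(nm - 645) / (645 - 580), 0.0
    elif 645 <= nm <= 780:
        r, g, b = 1.0, 0.0, 0.0
    else:
        r, g, b = 0.0, 0.0, 0.0  # outside the visible range
    gamma = 0.8  # simple brightness tweak so mid values are not too dark
    return tuple(round(255 * c ** gamma) for c in (r, g, b))

# Both laser lines land in the 510-580 nm segment: green channel saturated,
# blue zero, only the red channel differing slightly.
print(wavelength_to_rgb(520))
print(wavelength_to_rgb(532))
```

On a display both come out as saturated greens differing only in a small red admixture, whereas to the eye the spectral colors themselves are distinctly different hues.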
Also, the human brain adjusts color perception to the lighting, which is why you have to white-balance differently under incandescent bulbs than under daylight, for example.
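Numerically, white balance is just per-channel gain. A minimal sketch, assuming the simple "gray-world" heuristic (the scene should average out to neutral gray) - the function and sample values below are my own illustration, real cameras use far more sophisticated illuminant estimation:

```python
# Gray-world white balance sketch: scale each channel so the scene
# average becomes neutral. Pixels are (r, g, b) floats in [0, 1].

def gray_world_balance(pixels):
    """Return white-balanced copies of a list of (r, g, b) pixels."""
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]  # per-channel mean
    gray = sum(avg) / 3                                      # target neutral level
    gains = [gray / a if a else 1.0 for a in avg]            # per-channel gain
    return [tuple(min(1.0, p[c] * gains[c]) for c in range(3))
            for p in pixels]

# A scene under a warm (reddish) incandescent bulb: red channel runs hot.
warm = [(0.9, 0.6, 0.4), (0.8, 0.5, 0.3), (0.7, 0.4, 0.2)]
balanced = gray_world_balance(warm)
```

After balancing, the three channel means are equal, i.e. the reddish cast is gone - which is roughly what your brain does for you automatically under the bulb.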
I fight with this stuff when adjusting my artworks to look as close as possible to what the eye sees, but with precise laser wavelengths it is simply not possible given the limitations of current technology.
Check the CIE chromaticity diagram on Wikipedia (search for "CIE 1931 color space") and look where 520 nm lies: right on the spectral locus, outside the sRGB triangle.
Regarding IR - in most cases there simply isn't enough IR present to influence the picture much. In cheap cameras (like mobile phones) you can clearly see it, but DSLRs and more serious cameras are, as far as I know, better IR-filtered - how else would you take a picture of fire, for example? Fire gives off a lot of "heat" radiation, i.e. IR. Try it with a phone and with a DSLR and you will probably see the difference.
Just for interest, look here:
https://www.extremetech.com/electronics/144388-how-to-turn-your-dslr-into-a-full-spectrum-super-camera
There is some very interesting IR photography there.