Not many scientific applications, really, since it would amount to a very crude kind of spectrometry.
But what use is taking IR/UV photographs? For that matter, what use are 99% of the pictures people have on their memory cards or film? We take them anyway because it's fun, not because it's useful.
For instance, people often take IR photographs because the "unreal colors" can give an artistic effect. Plants tend to come out very bright, because leaves reflect infrared strongly.
In a similar vein, if you combine a GRGB photograph with a GCMY one, you should end up with very green plants (at least, I think leaf green is pretty much spectrally pure green), while other "green" things (such as the green printed by a - cough - CMYK printer) won't register as green at all.
(What color will they be? Well, that depends on what you want to obtain. The only "logical" choice is to render them green again, since green is the primary our eyes sense there.)
Generally speaking, you can create "false color" images that are, in a sense, truer than true color, since they let you tell colors that are a composite of several wavelengths apart from colors that are a single pure wavelength.
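To sketch the idea (with a big simplifying assumption: idealized, non-overlapping block filters, where C ≈ G+B, Y ≈ G+R, and M ≈ R+B, so (C + Y − M)/2 ≈ G; a real sensor pair would need a calibrated transform instead of this toy arithmetic), a pixel whose CMY-derived green estimate disagrees with the RGB camera's green channel is a candidate "composite" green:

```python
import numpy as np

def metamer_mask(rgb, cmy, tol=0.1):
    """Flag pixels whose CMY-derived green estimate disagrees with
    the RGB camera's green channel.

    Toy model: C ~ G+B, Y ~ G+R, M ~ R+B, hence (C + Y - M)/2 ~ G.
    Real filter passbands overlap, so this is only illustrative.
    """
    g = rgb[..., 1]
    c, m, y = cmy[..., 0], cmy[..., 1], cmy[..., 2]
    g_est = (c + y - m) / 2.0
    return np.abs(g_est - g) > tol

# A spectrally pure green pixel: C and Y both see it, M does not,
# so the estimate agrees with G and the pixel is not flagged.
rgb_px = np.array([[[0.0, 1.0, 0.0]]])
cmy_pure = np.array([[[1.0, 0.0, 1.0]]])
```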
The software/technical challenge would be to write a program that aligns two such raw images from the two camera types (much like the hundred image-stacking applications already around, except it must load GRGB and GCMY rasters directly) and then lets you decide how to filter your wavelengths.
It could probably also be done with a vanilla image-alignment program: let the cameras produce JPEGs as well, and if the program can report a numeric measure of the misalignment, you can use that data to align the raws yourself. Or you can do purely manual alignment.
You still need something convenient that lets you choose how to combine the colors, but that's easy.
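Once the two rasters are registered, the "combine the colors" step really is easy: it's just a per-pixel matrix multiply. A minimal sketch (the 3×6 mixing matrix is whatever mapping you want to experiment with; the example matrix below merely passes the RGB camera's channels through untouched):

```python
import numpy as np

def mix_channels(rgb, cmy, matrix):
    """Map a six-channel (R, G, B, C, M, Y) image to an output RGB
    image via a user-chosen 3x6 mixing matrix."""
    stack = np.concatenate([rgb, cmy], axis=-1)  # shape: H x W x 6
    out = stack @ np.asarray(matrix, dtype=float).T  # H x W x 3
    return np.clip(out, 0.0, 1.0)

# Pass-through example: output = the RGB camera, ignore CMY entirely.
passthrough = [[1, 0, 0, 0, 0, 0],
               [0, 1, 0, 0, 0, 0],
               [0, 0, 1, 0, 0, 0]]
```

Swapping in other matrices is where the "false color" fun happens, e.g. routing the CMY channels into the output instead.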