Assuming that a, b, and c are all defined as int in the program and given values greater than 1, how should the expression 1/(a*b*c) be written in the language?
Write it as 1.0/a/b/c. In the language, when one integer is divided by another integer, the result is also an integer and the fractional part is truncated, so the correct form of this formula is 1.0/a/b/c or 1.0/(a*b*c): starting the expression with the double literal 1.0 forces every division to be done in floating point. Why not use 1/a*b*c? Because that expression is evaluated left to right: 1/a is integer division, and since a is greater than 1 it evaluates to 0, so the whole product is 0.
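A minimal C sketch illustrating the difference (the values a = 2, b = 3, c = 4 are just illustrative, chosen to satisfy the "greater than 1" condition):

    #include <stdio.h>

    int main(void) {
        int a = 2, b = 3, c = 4;  /* example values, all greater than 1 */

        /* Integer division: 1/a is 0, so the whole product is 0. */
        printf("1/a*b*c     = %d\n", 1 / a * b / c);      /* wrong: prints 0 */

        /* Promoting the numerator to double forces floating-point division. */
        printf("1.0/a/b/c   = %f\n", 1.0 / a / b / c);    /* prints 0.041667 */
        printf("1.0/(a*b*c) = %f\n", 1.0 / (a * b * c));  /* prints 0.041667 */

        return 0;
    }

Note that 1.0/(a*b*c) computes the product a*b*c entirely in int first, which can overflow for large values; the chained form 1.0/a/b/c avoids that because each step is already a double operation.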