ghoulishlearns
New Coder
Hello, I'm currently learning C# by reading "Introduction to CSharp" by Rob Miles. I'm still really new at it, but I've come across a part where I don't really understand the concept.
I don't really understand how ("2.0" + 3.0); turns into 2.03. I understand that it takes "2.0" literally as text, which is why it isn't addition, but why is the result 2.03 instead of 2.30? An easy-to-understand explanation would be highly appreciated. Thank you.
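Here's a minimal snippet showing what I mean (wrapping the expression in Console.WriteLine is just my way of making the result visible, it isn't from the book):

    using System;

    class Program
    {
        static void Main()
        {
            // Because the left operand is a string, "+" concatenates instead of adds:
            // 3.0 is converted to the text "3" (the trailing ".0" is dropped when a
            // double is turned into a string) and appended after "2.0".
            Console.WriteLine("2.0" + 3.0); // prints 2.03
        }
    }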