I'm taking an intro to computer science class in college, and we're using Python. We were told to make a simple bank account program that calculates interest after 1 and 2 years. When the program is run, it asks for your principal amount and the interest rate as a decimal.
We were also told to modify the code so we could enter the interest as a percentage, but my code does not work as intended and I need help!
principal = input('Please enter your starting principal:')
interestRate = input('Now enter the interest rate as a percentage:')
# interestRate = interestRate/100 (Note: I was testing this to see if it worked as well and it did not)
print 'Your interest is: $', (interestRate/100)*principal, 'after one year.'
print 'Your interest is: $', (interestRate/100)*(principal*2), 'after two years.'
if (interestRate/100)*(principal*2)+principal >=1000000:
    print 'Yay! You are a millionaire!'
When I start the program through the terminal, it goes through the motions and asks for the inputs, but when I enter a percentage, say 5, it says my interest after one and two years is $0... When I enter a number less than 1, say .05, my program divides correctly and gives the right interest. Why?
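Here's a minimal snippet that reproduces the difference I'm seeing, assuming it comes down to how division behaves on the numbers I type in (I'm on Python 2, where `/` between two whole numbers truncates; Python 3's `//` operator truncates the same way, so I've used that here):

```python
# Entering "5" as a percentage: whole-number division truncates toward zero.
rate = 5
print(rate // 100)    # prints 0 -- so the interest comes out as $0

# Entering ".05" as a decimal: floating-point math works as expected.
rate = 0.05
print(rate * 2000)    # a 2000 principal gives a sensible interest amount
```

That matches what my program does: a whole-number percentage collapses to 0, while a decimal rate computes fine.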
Any help is appreciated.