# Learn How to Work with Numbers and Strings by Implementing the Luhn Algorithm - Step 21

### Tell us what’s happening:

Instead of converting `digit` to an integer inside the loop, can I redeclare the variable like this?

```python
digit = int(odd_digits)
```

### Your code so far

```python
def verify_card_number(card_number):
    sum_of_odd_digits = 0
    card_number_reversed = card_number[::-1]
    odd_digits = card_number_reversed[::2]

# User Editable Region

    for digit in odd_digits:
        digit = int(odd_digits)
        sum_of_odd_digits += digit

    print(sum_of_odd_digits)

def main():

# User Editable Region

    card_number = '4111-1111-4555-1142'
    card_translation = str.maketrans({'-': '', ' ': ''})
    translated_card_number = card_number.translate(card_translation)

    verify_card_number(translated_card_number)

main()
```

### Your browser information:

User Agent is: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36 OPR/107.0.0.0

### Challenge Information:

Learn How to Work with Numbers and Strings by Implementing the Luhn Algorithm - Step 21

`odd_digits` is the whole string of odd-position digits, not a single digit (the hyphens were already removed by `translate` before `verify_card_number` was called). Calling `int()` on it converts the entire string into one large number, so every pass through the loop would add that same large number to your sum instead of a single digit.
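Looping over a string yields one character per iteration, so each character can be converted to a single-digit integer on its own. A minimal sketch, using a hypothetical sample value for `odd_digits`:

```python
odd_digits = '2155'  # hypothetical sample value for illustration

# Each iteration of the loop gives one character of the string,
# which int() converts to a single-digit integer:
converted = [int(digit) for digit in odd_digits]
print(converted)  # [2, 1, 5, 5]
```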

I think what you meant to do was:

```python
for digit in odd_digits:
    digit = int(digit)
    sum_of_odd_digits += digit
```

In principle, that works. But why reassign the variable first? It's redundant.
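For instance, here is a minimal sketch of the same loop without the extra reassignment. It uses the same variable names as the challenge, but returns the sum so the result is easy to check, and takes a hypothetical card number with the separators already removed:

```python
def verify_card_number(card_number):
    sum_of_odd_digits = 0
    card_number_reversed = card_number[::-1]
    odd_digits = card_number_reversed[::2]
    for digit in odd_digits:
        sum_of_odd_digits += int(digit)  # convert one digit at a time
    return sum_of_odd_digits

# hypothetical sample: '4111-1111-4555-1142' with the hyphens already stripped
print(verify_card_number('4111111145551142'))  # 17
```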