Python Project, Budget App: Something wrong with transfer()

Tell us what’s happening:
Everything went fine until transfer() was tested. It seems to run withdraw() and deposit() on both the transfer origin and the transfer target, instead of running normally [withdraw() on the origin, then deposit() on the target], and I can't find what's wrong. I have looked at other people's code, and their algorithms seem to follow the same steps as mine.

This is a very strange problem; no one else seems to be running into it. Please help, thank you.

The code so far

class Category:
    # Attribute
    ledger = list()
    _category = str()
    balance = float()

    # Constructor
    def __init__(self, _category: str):
        self.ledger.clear()
        self.balance = 0.00
        self._category = _category

    # Method
    def deposit(self, amount:float, description:str = ""):
        self.ledger.append({"amount" : amount, "description": description})
        self.balance += amount
    
    def withdraw(self, amount:float, description:str = "") -> bool:
        neg_amount = 0 - amount
        if self.check_funds(amount):
            self.ledger.append({"amount" : neg_amount, "description": description})
            self.balance += neg_amount
            return True
        else:
            return False

    def get_balance(self) -> float:
        return self.balance

    def transfer(self, amount:float, transfer_to) -> bool:
        if self.check_funds(amount):
            self.withdraw(amount, f"Transfer to {transfer_to._category}")
            transfer_to.deposit(amount, f"Transfer from {self._category}")
            return True
        else:
            return False

    def check_funds(self, amount:float):
        if amount > self.balance:
            return False
        else:
            return True

    def __str__(self) -> str:
        # adapted from someone else's solution
        title = f"{self._category:*^30}\n"
        items = str()
        total = float()

        for i in range(len(self.ledger)):
            items += f"{self.ledger[i]['description'][0:23]:23}" + \
            f"{self.ledger[i]['amount']:>7.2f}" + '\n'

            total += self.ledger[i]['amount']

        output = title + items + "Total: " + str(total)
        return output

def create_spend_chart(categories):
    # Do later
    return None

Your browser information:

User Agent is: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36

Challenge: Budget App

Link to the challenge:

Problem is resolved!

OK, you can see at the top of my code:

    # Attribute
    ledger = list()
    _category = str()
    balance = float()

I commented it all out, and the code now works with no problem at all. So the question becomes: do Python classes not need attribute declarations the way Java classes do? This is very strange coming from Java. I need to read up on how attributes work in Python.
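For reference, here is a minimal sketch of the corrected constructor, assuming the rest of the class is unchanged: the three names are created as instance attributes inside __init__, and the class-level declarations (and the self.ledger.clear() call) are dropped entirely.

```python
class Category:
    def __init__(self, _category: str):
        # Instance attributes: each Category object gets its own
        # ledger and balance, so one budget's transactions can
        # never leak into another's.
        self.ledger = []
        self.balance = 0.00
        self._category = _category
```

With this change, a transfer appends the withdrawal entry only to the origin's ledger and the deposit entry only to the target's, which is the behavior the tests expect.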

Defining those three variables directly under the class definition makes them class variables, which are shared by every instance of the class. Variables defined inside __init__ as self.variable are instance variables, unique to each object. Because ledger was a mutable list shared at class level, every Category object was appending to the same list, and the self.ledger.clear() call in __init__ wiped that shared list each time a new Category was created.