Data Analysis with Python Projects - Sea Level Predictor

Tell us what’s happening:

AttributeError: module 'numpy.typing' has no attribute 'NDArray'

I get this error no matter what I code. What should I do?

Your code so far

Your browser information:

User Agent is: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36 Edg/126.0.0.0

Challenge Information:

Data Analysis with Python Projects - Sea Level Predictor

Sounds like an issue with the dependencies. I would search for the error message and see what you find.


Did you change any of the dependencies, or is it throwing that error with the boilerplate dependencies?

Hello and welcome to the forum!

Please share the full error message verbatim and your code.

I’ve got the exact same error. It’s happening without changing any of the starter code at all.

Hello and welcome to the forum!

Please share the full error message verbatim and your code.

Also, please open a new topic for your problem. Even if it turns out that you have the exact same problem, it’s too complicated to troubleshoot two environments at the same time.

I don’t know much about handling Python dependencies, but it looks like the version of PIL the Docker image comes with (or that one of its peer dependencies pulls in) isn’t compatible with the version of NumPy in the requirements.txt.
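For anyone reproducing this, you can confirm which versions the container actually has with pip (versions and paths will vary by environment):

python -m pip show Pillow numpy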

gitpod /workspace/.pyenv_mirror/user/current/lib/python3.8/site-packages/PIL $ cat _typing.py
from __future__ import annotations

import os
import sys
from typing import Any, Protocol, Sequence, TypeVar, Union

try:
    import numpy.typing as npt

    NumpyArray = npt.NDArray[Any]
except ImportError:
    pass
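If I’m reading that right, the bug is that only ImportError is caught: numpy.typing has existed since NumPy 1.20, but NDArray was only added in NumPy 1.21, so on a 1.20.x install the import succeeds and the attribute lookup raises an AttributeError that sails straight past the except clause. A quick one-liner to check whether the installed NumPy is new enough:

python -c "import numpy, numpy.typing; print(numpy.__version__, hasattr(numpy.typing, 'NDArray'))"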

If I downgrade Pillow to 9.5.0 it seems to work, but all I did was run python main.py without any code changes, so I don’t know if it breaks anything else.

python -m pip install "Pillow==9.5.0"
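Since the workspace is rebuilt from requirements.txt, it’s probably worth pinning it there as well so the fix survives a rebuild (I’m assuming Pillow can simply be added to that file; I haven’t verified how the boilerplate installs it):

Pillow==9.5.0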

Not sure if NumPy should be updated or Pillow should be downgraded, but I didn’t want to start trying a bunch of different NumPy versions.
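That said, since NDArray was added in NumPy 1.21, pinning NumPy at or above that version should in principle satisfy Pillow as well, though I haven’t tested it against the project’s other pins:

python -m pip install "numpy>=1.21"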


As this happens out of the box with our Gitpod boilerplate, I will open an issue for it.