Beautiful Soup: SyntaxError while extracting and posting to a *.csv file. Please help!

What’s Happening:
I’m trying to build a BeautifulSoup web extractor on Google Colab, and it’s throwing a SyntaxError on the following line:

currAnimeListURL.append(href.replace(“myanimelist.net/anime/”,

on Line 134 of the following Google Colab document:
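For what it’s worth, if the curly quotes shown in that line are actually what’s in the notebook (they often sneak in when code is pasted from a web page or a doc), Python will raise a SyntaxError on them; the replace() call also needs a second argument and closing parentheses. A straight-quoted, fully closed version might look like the sketch below. The empty-string second argument is only my assumption of the intent (stripping the site prefix):

# Hypothetical fix: straight quotes, a second argument for replace(), balanced parentheses.
# Assumes href holds the anime link and the goal is to strip the site prefix from it.
currAnimeListURL.append(href.replace("myanimelist.net/anime/", ""))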

Purpose:
I am trying to extract the links of every anime title from the URLs
"https://myanimelist.net/topanime.php?limit={0}".format(50*i + 50)
(inside a for loop): get the BeautifulSoup object for each topanime.php page, iterate over the div/h3/a elements, take the href and text of each anchor, extract the data into some defined columns, and write everything to a predefined *.csv file with encoding='utf-8'.
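To make that intended flow concrete, here is a minimal sketch under my own assumptions: it uses requests plus BeautifulSoup, selects anchors via the div/h3/a structure described above (the real page markup may differ), and writes title/href pairs to a placeholder CSV with utf-8 encoding. The column names, file name, selector, and page count are all illustrative, not the notebook’s actual code:

import csv
import requests
from bs4 import BeautifulSoup

# Minimal sketch; names and the CSS selector are assumptions, not the real notebook code.
with open("top_anime.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["title", "url"])  # placeholder columns
    for i in range(3):  # a few pages only, as an example
        page_url = "https://myanimelist.net/topanime.php?limit={0}".format(50 * i + 50)
        soup = BeautifulSoup(requests.get(page_url).text, "html.parser")
        # the ranking rows are described as div / h3 / a; adjust the selector to the real markup
        for a in soup.select("div h3 a"):
            writer.writerow([a.get_text(strip=True), a.get("href", "")])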

Kinda freaking out, since I’ve looked over what feels like a billion web resources about BeautifulSoup objects and writing extracted data out in the right shape…

Can somebody please help?

Hi. I’m currently trying to create a Python web extractor for myAnimeList.net’s "topanime.php?limit={0}".format(50*i + 50)
URLs and pull the critical data points in the Google Colab notebook I’ve put together here:

I’ve gotten myself unstuck on the first related issue and debugged to the point where it prints some output to the console, but it never gets past printing the BeautifulSoup element of the first webpage in the array and the first URL, over and over… Can somebody please help me unstick it so I can (hopefully) get back on track with extracting to the *.csv file?
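One common cause of that “same first page and first URL forever” symptom is that the URL is only formatted once (outside the loop), or that the loop counter never actually changes, so every iteration fetches the same page. A quick hypothetical check, using the same URL pattern as above:

# Rebuild the URL from the loop variable on every iteration and print it;
# if the printed limit= value never changes, the loop counter isn't advancing.
for i in range(5):
    page_url = "https://myanimelist.net/topanime.php?limit={0}".format(50 * i + 50)
    print(page_url)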

Please, anybody…?

Merry Christmas

This is my current iteration…
I’d like some help to fix the issues I don’t know about.
The English and Japanese titles I scrape aren’t being passed correctly when the method runs, and they’re being written to the file incorrectly… On top of that, the script only gets through a roughly fixed number of titles before it stops, writes out the URL only, and shuts down completely…
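In case it helps, one frequent reason for garbled non-ASCII titles, and for a file that cuts off after a fixed number of rows, is writing with the wrong encoding or crashing before the file buffer is flushed. Opening the CSV inside a with block with encoding="utf-8" and newline="" avoids both, since the file is closed (and flushed) even if a later row raises an exception. A purely illustrative sketch, where the row data and file name are placeholders:

import csv

# Placeholder rows: (english_title, japanese_title, url) tuples stand in for the scraped data.
rows = [("Example Title", "例のタイトル", "https://myanimelist.net/anime/1")]

with open("anime_titles.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["english_title", "japanese_title", "url"])
    for row in rows:
        writer.writerow(row)  # non-ASCII titles are written safely thanks to utf-8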

Can someone please help?

URL: