React Native, Python API Integration

Hello all.

User Story: a React Native app that requests JSON from a database (not essential), asks a Python API to update that database, and displays the JSON however I wish.

Issues: I was not sure of the best way to implement this functionality. I already had a Python script that scraped data off the web, and I have it outputting this data to a .json file. I want all of this controlled from a React Native app.

I have been trying many different scripts with Flask APIs to update the data and serve requests. Here are some of my failed attempts:

SQLAlchemy
from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql://user:password@host/database'
db = SQLAlchemy(app)

class Sermon(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String(80))
    category = db.Column(db.String(120))
    date = db.Column(db.String(120))
    speaker = db.Column(db.String(120))

db.create_all()

db.session.add(Sermon(id=0, title='Test Title', category='Test Category', date='Test Date', speaker='Test Speaker'))
db.session.commit()

@app.route('/test', methods=['GET'])
def testSQL():
    # Serialise each row to a plain dict so it can be returned as JSON
    sermons = Sermon.query.all()
    return jsonify([
        {'id': s.id, 'title': s.title, 'category': s.category,
         'date': s.date, 'speaker': s.speaker}
        for s in sermons
    ])

mysql.connector
from flask import Flask, request, jsonify
import mysql.connector

# Create a new Flask application
app = Flask(__name__)

# Connect to database
sermonsDB = mysql.connector.connect(
  host="host",
  user="user",
  passwd="password",
  database="sermons"
)

cursor = sermonsDB.cursor()

# CREATE TABLE
cursor.execute("CREATE TABLE sermons (id INT PRIMARY KEY, title VARCHAR(255), date VARCHAR(255), speaker VARCHAR(255), category VARCHAR(255)")
# INSERT VALUES
sql = "INSERT INTO sermons (id, title, date, speaker, category) VALUES (%d, %s, %s, %s, %s)"
val = [
    (0,"Test Title", "Test Date","Test Speaker","Test Category"),
    (1,"Test Title 2", "Test Date 2","Test Speaker 2","Test Category 2")
]
cursor.executemany(sql, val)
sermonsDB.commit()

@app.route('/test', methods=['GET'])
def testSQL():
    cursor.execute("SELECT * FROM sermons")
    # Return every row as JSON
    return jsonify(cursor.fetchall())

# Run Debug
if __name__ == '__main__':
    app.run(debug=True)

marshmallow
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

from marshmallow_jsonapi.flask import Schema
from marshmallow_jsonapi import fields

from flask_rest_jsonapi import Api, ResourceDetail, ResourceList

# Create a new Flask application
app = Flask(__name__)

# Set up SQLAlchemy
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql://user:password@host/sermons'
db = SQLAlchemy(app)

# Define a class for the Sermon table
class Sermon(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String)
    category = db.Column(db.String)
    date = db.Column(db.String)

# Create the table
db.create_all()

# Create data abstraction layer
class SermonSchema(Schema):
    class Meta:
        type_ = 'sermon'
        self_view = 'sermon_one'
        self_view_kwargs = {'id': '<id>'}
        self_view_many = 'sermon_many'

    id = fields.Integer()
    title = fields.Str()
    category = fields.Str()
    date = fields.Str()

class SermonMany(ResourceList):
    schema = SermonSchema
    data_layer = {'session': db.session,
                  'model': Sermon}

class SermonOne(ResourceDetail):
    schema = SermonSchema
    data_layer = {'session': db.session,
                  'model': Sermon}

api = Api(app)
api.route(SermonMany, 'sermon_many', '/sermons')
api.route(SermonOne, 'sermon_one', '/sermons/<int:id>')

# Run Debug
if __name__ == '__main__':
    app.run(debug=True)

For all of these examples, I have been trying to host my API on https://www.pythonanywhere.com/
At one stage, using the marshmallow package, I managed to request some data. I then changed some code, and since then I have not even been able to connect to the database; I just get told I have no permission.

There must be a better way of doing this. The marshmallow package worked to an extent, but its default data structures are far too complicated for the data I plan on working with.
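
For a sense of what I mean: flask-rest-jsonapi responses follow the JSON:API specification, so even a single row of the Sermon model above comes back wrapped in several layers of envelope. Roughly (shape approximate, values illustrative):

{
    "data": {
        "type": "sermon",
        "id": "1",
        "attributes": {
            "title": "Test Title",
            "category": "Test Category",
            "date": "Test Date"
        },
        "links": {"self": "/sermons/1"}
    },
    "jsonapi": {"version": "1.0"}
}

All I actually wanted was the flat attributes object.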

Does anyone have any ideas for the following:

  1. To database or not to database?
  2. To Flask or not to Flask?
  3. To PythonAnywhere or not to PythonAnywhere?
  4. To API or not to API?
  5. To Python or not to Python? (By this, I mean: should I scrap the Python web scraper and implement the functionality in JavaScript instead, considering my app is written in JavaScript? I do not know how this will fare when the app is built and ejected for deployment.)

:smile:

Hey,
I have the same issue and I just came upon this thread. I have a React Native app, a Python script that scrapes data and exports it to a .json file, and I need to use that data in the app. Did you by any chance manage to resolve this, and how?
Happy Birthday btw. :v:

Welcome, Ianin.

I resolved this by not using Python for the app. I ended up recreating the functionality in JavaScript.

Thinking back on this, it should be perfectly possible to create a simple Flask API that calls a web_scrape function and returns the data on a specific route, something like the sketch below. However, it was easier, in the end, to port everything over to JavaScript.
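
A minimal sketch of that idea, assuming your scraper is already wrapped in a function (the scraper module and web_scrape name here are hypothetical stand-ins for whatever your script actually exposes):

from flask import Flask, jsonify

# Hypothetical import: wherever your existing scraper function lives
from scraper import web_scrape

app = Flask(__name__)

@app.route('/scrape', methods=['GET'])
def scrape():
    # Run the scraper on demand and return its result as JSON
    # (fine if the scrape is quick; for slow scrapes, store the result instead)
    return jsonify(web_scrape())

if __name__ == '__main__':
    app.run(debug=True)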

If you need, I can dig through old files to find exactly how I did it?

Thanks for the wish :smiley:

Hi again,
If it isn’t too much of a hassle, yes, I would like to know how you did it. I’m still unsure of how I would approach it; I don’t really have any experience with Flask, nor have I done any scraping with JavaScript.
If I go with the Flask option, can I host the API on PythonAnywhere?

This should be possible. The only tricky bit is data storage - you likely do not want to scrape the website every single time you want the data… maybe you do?

What I ended up doing was using Cheerio.js:

  • I wrapped the scraper in a function that returned the data how I wanted it.
  • I called the function from an endpoint, triggered by a button, that updated a MongoDB (Atlas) database; the endpoint just returned the result (success/failure). There is a rough Flask-flavoured sketch of this flow after the list below.
  • So, the app just made a call to fetch the data from the database.

The database was important for my use case because:

  1. It took about 5-8 minutes for the webscraping to complete.
  2. I only wanted to update the database once per month.
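
I did all of this in Node rather than Flask, but since the Flask option came up: the same flow, sketched very roughly in Flask with pymongo (MongoDB's Python driver). Every name here - the connection string, database, collection, and web_scrape - is a placeholder, not my actual code:

from flask import Flask, jsonify
from pymongo import MongoClient

# Hypothetical import: your existing scraper, wrapped in a function
# that returns a list of dicts
from scraper import web_scrape

app = Flask(__name__)

# Placeholder Atlas connection string
client = MongoClient('mongodb+srv://user:password@cluster.mongodb.net/')
sermons = client['mydb']['sermons']

@app.route('/update', methods=['POST'])
def update():
    # Run the (slow) scrape, then replace the stored documents
    docs = web_scrape()
    sermons.delete_many({})
    sermons.insert_many(docs)
    return jsonify({'status': 'success', 'count': len(docs)})

@app.route('/sermons', methods=['GET'])
def get_sermons():
    # The app fetches from here; exclude Mongo's internal _id field
    return jsonify(list(sermons.find({}, {'_id': 0})))

if __name__ == '__main__':
    app.run(debug=True)

The point is the split: one slow endpoint that refreshes the stored data, and one fast endpoint the app actually calls.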

Basic Axios/Cheerio webscraping boilerplate:

const axios = require("axios");
const cheerio = require("cheerio");

// URL, recordDate and the result arrays (data, titles, audios, dates,
// speakers, categories) are defined elsewhere - see the note below

function getSeriesURL() {
    let seriesURL = "";
    axios(URL)
      .then(html => {
        const $ = cheerio.load(html.data);
        let refs = $(
          "div.elementor-row > div > div > div > div > div > div > a"
        );
        // Take the first anchor that has an href
        for (let ref in refs) {
          try {
            seriesURL = refs[ref].attribs.href;
            break;
          } catch (err) {
            console.log("(1) SERIES ERR: ", err);
          }
        }
      })
      .then(() => setTimeout(() => findDataAfterDate(seriesURL), 8000));
}

function findDataAfterDate(url) {
    axios(url)
      .then(html => {
        const $ = cheerio.load(html.data);
        let refs = $("article > h3 > a");
        let tempDates = $("article > abbr");
        let tempAudios = $(
          "some_scraping"
        );
        let tempSpeakers = $("a > span");
        let tempCategories = $("a");
        for (let ref in refs) {
          if (
            !isNaN(ref) &&
            Date.parse(tempDates[ref].attribs.title) > Date.parse(recordDate)
          ) {
            data.push(refs[ref].attribs.href);
            titles.push(refs[ref].children[0].data);
            audios.push(tempAudios[ref].attribs.href);
            dates.push(tempDates[ref].attribs.title);
            speakers.push(tempSpeakers[ref].children[0].data);
            categories.push(tempCategories[ref].children[0].data);
          }
        }
      })
      .catch(err => console.error("(2) findDataAfterDateERR: ", err));
    // Give the scrape time to finish before handing the arrays on
    setTimeout(setDataArray, 20000);
}

There are a few variables missing (the URL, recordDate, and the arrays the results get pushed into), but if you go that route, then this might help.


I actually do want to scrape it every time. The script is very simple and executes fast, and the data I need is likely to change multiple times during the day.
Thank you so much for the comprehensive answer, I really appreciate it!