I have stats on tennis players (M & F), such as rankings, name, location, image, etc., in a MySQL DB and Excel CSV files.
I want to create a page with a form asking for M or F and rank, and then have the various stats (handedness, country, image, etc.) come up on the page without refreshing.
I want to do it the quickest way, but with the most current type of language, DB, structure, or framework.
Anybody have any ideas where to start with this?
You already have a mysql DB. What was used to populate that database? Was there a php script or was it some other method?
I could transfer the DB from mysql to something else if that is better.
you need a server to handle the form submit request
I recommend writing the server in node
I will try all of that. Is there a similar challenge here that I can emulate?
is there a way to do it directly from the mysql DB?
So I assume you know how to connect to the mysql DB and create a query to select records?
If so, then once you get your records with something like the code below (I am using the PDO class for the connection to the DB), you can encode them as JSON:
// this code assumes you are properly connected to the mysql DB, the connection
// is called "$DBcon", and you have a DB table named "tennis_stats_tbl" in your DB.
// The WHERE clause filters on the gender and rank received from the request/post
// (here assumed to arrive as $_GET['gender'] and $_GET['rank'])
$query = "SELECT * FROM tennis_stats_tbl WHERE gender = :gender AND `rank` = :rank";
$stmt = $DBcon->prepare($query);
$stmt->execute(array(':gender' => $_GET['gender'], ':rank' => $_GET['rank']));
$tennisStats = array(); // initialize an array to hold the query results
// now loop through each row of data and insert it into the $tennisStats array
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $tennisStats['TennisStats'][] = $row;
}
$jsonData = json_encode($tennisStats);
// the above assigns a JSON string of the data, which can be returned to the
// client requesting this page (e.g. echo $jsonData;)
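To get this onto the page without a refresh, the browser can request that JSON with fetch() and swap the result into the DOM. A rough sketch, assuming the PHP above lives at a hypothetical getStats.php that echoes $jsonData, and that name/country/hand/rank are your actual column names:

```javascript
// Turn the decoded JSON into an HTML snippet (pure function, easy to test).
// The field names here are assumptions about your table's columns.
function renderStats(data) {
  return data.TennisStats
    .map(p => `<p>${p.name} (${p.country}), ${p.hand}-handed, rank ${p.rank}</p>`)
    .join('');
}

// Called from the form's submit handler; fetch() hits the PHP script and
// the result is swapped into a <div id="results"> on the page, so no reload.
function showStats(gender, rank) {
  return fetch(`getStats.php?gender=${gender}&rank=${rank}`)
    .then(res => res.json())
    .then(data => {
      document.getElementById('results').innerHTML = renderStats(data);
    });
}
```

Hook it up with something like an onsubmit handler that calls showStats(form.gender.value, form.rank.value) and returns false, so the form never actually submits and the page never reloads.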
thanks very much!
I know how to do this in PHP but I am trying to learn the new technologies.
this is how I do the whole thing in PHP and mysql:
I webscrape the data into a csv file. Then I upload it to the hosting site via phpMyAdmin to import it into MySQL. Then I write the SQL and PHP code to do the CRUD.
But what I am not understanding is how to do this in node or whatever.
I keep reading about having to download stuff to my laptop and do all sorts of command lines to set it up. But isn't there some place where this is all done for you, like it is for a PHP website?
Do hosting services like GoDaddy etc. have a setup where I can do this as easily as I can with PHP?
Even in PHP I almost never set it up on my laptop. I just do it on the hosting service.
And say I wanted to collaborate with someone via GitHub on a PHP project: would we hammer it out on GitHub and THEN FTP it up to the hosting?
Do you use an offline program to webscrape the data? Why not webscrape using PHP and, instead of creating a csv file, just import the data directly into your mysql database? There is no need to upload via phpMyAdmin; that is just an extra step. I pull data off websites all the time and incorporate it into a db of some sort.
I use a proprietary software that I bought to scrape.
The mystery to me is how to do the same thing in node or mongo or some other newer language.
How would I set this all up on a hoster in node like I do in PHP?
Still learning node myself, so I cannot help you there. I would just do it all in PHP (without using the proprietary software) until you learn node. If you can do it in PHP, it will be similar in node once you learn it.
Thanks again. But my problem right now is I do not know how to set up node on a hosting website.
There are many hosts that have node pre-installed and manage the updates for you. Maybe you should look into that option.
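Most node hosts just need a package.json file next to your server file so they know how to install dependencies and start the app; a minimal sketch (all the names here are placeholders):

```json
{
  "name": "tennis-stats",
  "version": "1.0.0",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  }
}
```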
these are different things. You're doing some backend work to populate a db; that is relatively straightforward. Now you want a web app to use the db, and this requires a webserver. Writing a server is actually quite easy in node, though setting it up needs a bit of server admin. I prefer postgres over mysql: it has the best of both worlds, a relational db with good JSON support.
you could try a cloud database like firebase if you want to avoid writing a server
btw, you should never have to pay for webscraping; it can usually be done in a few lines of bash.
rm & ppc,
you both have given me some good directions to explore. just have to do a lot of googling and reading now
I will let you know how it turns out