I have a PHP script that fetches records which need to be inserted into a PostgreSQL database.
For each record I need to check whether it already exists in the database. If it exists, I use its ID and insert the parts belonging to that record into a separate "parts" table.
If it does not exist, I insert the new record first, then insert its parts into the "parts" table.
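The check-then-insert logic described above can usually be collapsed into a single statement so each record costs one round trip instead of two. A sketch in SQL, assuming hypothetical table and column names (`records` with a unique natural key `external_id`, and `parts` with a `record_id` foreign key) that would need to be adapted to the real schema:

```sql
-- Upsert the record and get its id back whether it was new or not.
-- Requires a unique constraint/index on records.external_id.
INSERT INTO records (external_id, payload)
VALUES ($1, $2)
ON CONFLICT (external_id)
DO UPDATE SET payload = EXCLUDED.payload  -- update on conflict so RETURNING yields a row either way
RETURNING id;

-- Then insert that record's parts in one multi-row statement
-- instead of one INSERT per part:
INSERT INTO parts (record_id, part_data)
VALUES ($1, $2),
       ($1, $3);
```

Wrapping each batch of 10,000 records in a single transaction (rather than autocommitting every row) typically also helps throughput considerably.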
Currently it fetches 10,000 records at a time.
When the table is empty it doesn't take too long.
The database is now 8 GB and it needs to scale up to at least 100-150 GB.
If needed, we can alter the table structure, and I also need advice on indexes.
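For the existence check to stay fast at 100-150 GB, the lookup column must be indexed; a sequential scan per record is what usually makes this kind of import crawl as the table grows. A sketch, again assuming a hypothetical `external_id` key:

```sql
-- Unique index backs both the duplicate check and ON CONFLICT upserts.
-- CONCURRENTLY avoids locking the table against writes while building.
CREATE UNIQUE INDEX CONCURRENTLY IF NOT EXISTS records_external_id_idx
    ON records (external_id);

-- Parts are typically looked up per record, so index the foreign key:
CREATE INDEX CONCURRENTLY IF NOT EXISTS parts_record_id_idx
    ON parts (record_id);
```

Note that every extra index slows inserts slightly, so it is worth indexing only the columns actually used for lookups.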
Adjusting the configuration of the main PostgreSQL server, if required, is also not a problem.
For someone with database experience this should be quick and straightforward.