***Step 1***
Need to work out the data structure of the 1 min data coming IN on a port.
i.e. inspect data arriving on a port and determine whether it is XML or binary. The field structure is likely "symbol","date","time","open","high","low","close","vol","openinterest".
If it is binary, you will probably need to mask and shift to get the values, and to determine record length, field positions and types.
It is possible to do this in VB6, but .NET or Java may be easier.
I would prefer .NET.
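Nothing about the wire format is known yet, but as a sketch of the kind of mask-and-shift work step 1 involves, here is a hypothetical fixed-length binary bar record in Java. Every offset, type, field order and the use of little-endian here is a guess, to be replaced by whatever packet inspection reveals:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

public class BarRecord {
    // Guessed layout: symbol(12 ASCII, NUL padded) + time(int32, unix secs)
    // + open/high/low/close(double x4) + volume(int32) = 52 bytes.
    public static final int LEN = 52;

    public static byte[] encode(String symbol, int time, double o, double h,
                                double l, double c, int vol) {
        ByteBuffer buf = ByteBuffer.allocate(LEN).order(ByteOrder.LITTLE_ENDIAN);
        byte[] sym = new byte[12];                      // NUL-padded symbol field
        byte[] s = symbol.getBytes(StandardCharsets.US_ASCII);
        System.arraycopy(s, 0, sym, 0, s.length);
        buf.put(sym).putInt(time).putDouble(o).putDouble(h)
           .putDouble(l).putDouble(c).putInt(vol);
        return buf.array();
    }

    public static String decode(byte[] raw) {
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        byte[] sym = new byte[12];
        buf.get(sym);
        String symbol = new String(sym, StandardCharsets.US_ASCII).trim();
        int time = buf.getInt();
        double o = buf.getDouble(), h = buf.getDouble();
        double l = buf.getDouble(), c = buf.getDouble();
        int vol = buf.getInt();
        return symbol + "," + time + "," + o + "," + h + "," + l + "," + c + "," + vol;
    }

    public static void main(String[] args) {
        byte[] raw = encode("EURUSD", 1199145600, 1.4720, 1.4735, 1.4710, 1.4730, 120);
        System.out.println(decode(raw)); // round-trips the guessed layout
    }
}
```

The real work is trying different lengths/types/offsets against captured bytes until decoded values look like sane prices and timestamps.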
***Step 2*** (if step 1 is successful)
Need to create two tables in an MDB file - one per symbol (EURUSD and USDJPY) -
and fill them with some dummy "1 min" data for the symbols.
Using .NET (possibly VB6 or Java), serve the dummy data to the port being listened to by the "data centre", and you should see it come through to the charts in "MT 4".
Reverse the masking and shifting to get a data match.
Tweak the length of records and data types to get a match.
This will involve working out the request data structure too.
i.e. symbol, password, IP address, etc. Possibly a date/time range too?
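As an illustration of the dummy data, a sketch that generates 50 plausible-looking 1 min bars for a symbol. All prices and volumes are invented (a simple zig-zag walk), and the single unix-seconds time field stands in for the "date"/"time" columns; rows like these would be loaded into the MDB tables:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

public class DummyBars {
    // All prices/volumes here are invented; rows like these would be
    // loaded into the EURUSD / USDJPY tables in the MDB.
    public static List<String> generate(String symbol, int count,
                                        long startTime, double startPrice) {
        List<String> rows = new ArrayList<>();
        double price = startPrice;
        for (int i = 0; i < count; i++) {
            double open  = price;
            double high  = open + 0.0005;
            double low   = open - 0.0005;
            double close = open + (i % 2 == 0 ? 0.0003 : -0.0003); // zig-zag walk
            long time = startTime + i * 60L;                       // one bar per minute
            rows.add(String.format(Locale.US, "%s,%d,%.4f,%.4f,%.4f,%.4f,%d",
                                   symbol, time, open, high, low, close, 100 + i));
            price = close;
        }
        return rows;
    }

    public static void main(String[] args) {
        for (String row : generate("EURUSD", 50, 1199145600L, 1.4720))
            System.out.println(row);
    }
}
```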
To do this, download these files -
"MT 4" = mt 4 [url removed, login to view]
"Data centre" = mtdc [url removed, login to view]
Get the files from MetaQuotes directly, or from FXDD, Interbank FX, etc.
There is a 30 day trial.
Configure things so that the "data centre" receives the data from the "trial server" (there are many, e.g. FXDD, Interbank FX) and the "MT 4" app receives the data as a relay from the "data centre" rather than directly from the "trial server".
Then you can inspect the packets sent and received on the port/ports to/from the "data centre", and determine the file structures for both the data sent and the requests for data.
The new "data centre" configuration should be easy - simply change the server setting in the "data centre" configuration panel to "localhost". The "data centre" will then look from/to the same port/ports, but now on the same PC, and will accept/send a stream of data you serve it.
Do use the password field in the configuration panel of the "data centre" - it will help identify requests, because you will be able to see the password in the traffic.
Two ports are probably used, one to listen and one to send.
Your program will 'always' be listening on a port to accept requests, and will also 'always' be streaming in reply.
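A minimal listener for the inspection step might look like this - it binds a placeholder port (9999 here; the real "data centre" port is unknown) and hex-dumps whatever arrives, so record lengths and headers can be eyeballed:

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class PortSpy {
    // Render len bytes of a buffer as "01 AF 3C ..." for eyeballing.
    public static String hexDump(byte[] data, int len) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < len; i++)
            sb.append(String.format("%02X ", data[i] & 0xFF)); // mask: bytes are signed
        return sb.toString().trim();
    }

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(9999)) { // placeholder port
            System.out.println("listening on 9999 ...");
            try (Socket client = server.accept();
                 InputStream in = client.getInputStream()) {
                byte[] buf = new byte[1024];
                int n;
                while ((n = in.read(buf)) != -1)
                    System.out.println(hexDump(buf, n)); // one line per read
            }
        }
    }
}
```

A packet sniffer (e.g. Wireshark) does the same job without touching the configuration, but a listener like this is enough once the "data centre" is pointed at localhost.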
At the end of the day, all I need as deliverables are -
1) the mdb file with dummy EURUSD 1 min data (50 recs) and USDJPY 1 min data (50 recs)
2) a .net code file that does the following -
listens on a port for a request from the "data centre" - the request will probably contain a 'symbol', a 'password' and an 'IP address' the app is to allow for - then pushes dummy data from the MDB to the port/ports.
To test the results I will run the code in my .NET IDE, and it will sit there running, listening and awaiting numerous requests on that port.
The "data centre" will probably request new data only when I open a chart in the client and it relays the need for updates to the "data centre". Then it may request again only if there are no new records after 15 secs. Or it may request only if another chart is opened.
So I will open a EURUSD chart, and the request will probably be sent from the chart to the "data centre" and then on to your script. The script will then check the symbol/password/IP address and start to send (push) the 1 min EURUSD data at a spaced-out interval
(say every 2 secs for testing purposes).
Then I will also open a USDJPY chart, and that should trigger the "data centre" to look for updates - so it will send a request for more updates, and your script should be able to read this new request (whilst still serving up EURUSD data) and start serving this data too.
Probably there will then be a stream of alternating records being pushed - EURUSD USDJPY EURUSD USDJPY and so on.
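The push loop could be sketched as a round-robin over per-symbol queues, which naturally produces the alternating stream described above. Networking and the real wire format are omitted; in the real server, a 2 sec timer would simply call nextTick() and write each row to the socket:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class RoundRobinFeed {
    // One queue per subscribed symbol, kept in subscription order.
    private final Map<String, Deque<String>> queues = new LinkedHashMap<>();

    public void subscribe(String symbol, List<String> rows) {
        queues.put(symbol, new ArrayDeque<>(rows));
    }

    // One record per subscribed symbol per tick, interleaving the symbols.
    public List<String> nextTick() {
        List<String> out = new ArrayList<>();
        for (Deque<String> q : queues.values())
            if (!q.isEmpty()) out.add(q.poll());
        return out;
    }

    public static void main(String[] args) {
        RoundRobinFeed feed = new RoundRobinFeed();
        feed.subscribe("EURUSD", List.of("EURUSD bar 1", "EURUSD bar 2"));
        feed.subscribe("USDJPY", List.of("USDJPY bar 1", "USDJPY bar 2"));
        System.out.println(feed.nextTick()); // [EURUSD bar 1, USDJPY bar 1]
        System.out.println(feed.nextTick()); // [EURUSD bar 2, USDJPY bar 2]
    }
}
```

A second request arriving mid-stream (the USDJPY chart being opened) is then just another subscribe() call; the existing EURUSD stream is unaffected.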
Possibly the "data centre" is polling for data rather than it being pushed, but I doubt it. Either way it is a similar concept, but the listening and releasing is a little different.
The communication is probably TCP and so is IP specific.
UDP may be used but probably not.
If polling is happening, then the server (your script) will be awaiting requests for updates and responding to them.
At the end of the day, both the EURUSD chart and the USDJPY chart should update at the timed intervals as the dummy records are served up, and a new BAR of price should show up.
If I open a 3rd chart - another EURUSD - it will probably get this data from the "data centre", and an additional request may never arrive at your script. Instead, the "data centre" will either still be requesting new data every 15 secs, or it will be expecting new data and not request again until it thinks there should be more.
I am not sure - perhaps the "data servers" will accept the IP address of the "data centre" to send to and then stream data, but check for a connection every 2-3 minutes (i.e. ACK/NACK) or else cease to stream.
Perhaps the "data centre" will expect new data to arrive every 15 secs and, if it does not receive it, will send a new request based on the number of unique symbols open in the charts at the time.
So your script should test for disconnections too.
i.e. get your script to fit in with the architecture and processes that exist.
i.e. get it to listen for re-connection requests - or have the script determine that there is no "data centre" open anymore to send to.
It may well be that the "data centre" gets pinged from time to time and responds accordingly. You may see standard ping traffic when listening on the ports.
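Whatever the real keep-alive mechanism turns out to be (ping, ACK/NACK, repeated requests), the disconnection test boils down to tracking when the "data centre" was last heard from. A minimal sketch; the timeout value is a guess to be tuned against observed behaviour:

```java
public class Liveness {
    private long lastSeenMs;

    public Liveness(long nowMs) { lastSeenMs = nowMs; }

    // Call whenever anything (data, ping, request) arrives from the "data centre".
    public void touch(long nowMs) { lastSeenMs = nowMs; }

    // Stop streaming when this goes false, and go back to listening
    // for a re-connection.
    public boolean isAlive(long nowMs, long timeoutMs) {
        return nowMs - lastSeenMs <= timeoutMs;
    }

    public static void main(String[] args) {
        Liveness dc = new Liveness(System.currentTimeMillis());
        System.out.println(dc.isAlive(System.currentTimeMillis(), 15_000)); // true
    }
}
```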
Also, when a chart is opened, there may be an initial request for records to fill it in. So there could be a data "date and time" range request.
i.e. one request to fill the history,
then a request for streaming data.
Or one request does it all - and the response is to give the missing history and then start streaming.
In this case there should be a 3rd table, "EURUSDHistory",
and those records should be sent first.
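If one request does both jobs, the server side reduces to "send every bar newer than the client's last-seen time, then keep streaming". A sketch of that history-fill filter, assuming for illustration that bars are held as comma-separated "symbol,time,open,high,low,close,vol" strings with the time in the second field:

```java
import java.util.ArrayList;
import java.util.List;

public class HistoryFill {
    // Return the bars the client has not yet seen, i.e. those strictly
    // newer than fromTime. These go out first, then live streaming resumes.
    public static List<String> missingSince(List<String> rows, long fromTime) {
        List<String> out = new ArrayList<>();
        for (String row : rows) {
            long t = Long.parseLong(row.split(",")[1]); // time is field 1
            if (t > fromTime) out.add(row);
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> all = List.of(
            "EURUSD,1199145600,1.4720,1.4725,1.4715,1.4723,100",
            "EURUSD,1199145660,1.4723,1.4728,1.4718,1.4720,101");
        System.out.println(missingSince(all, 1199145600L)); // only the second bar
    }
}
```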
The charts will probably handle missing bars automatically. So if some 1 min bars are missing, the next data will probably still show on the charts.
Is this going to be possible ?
There are many possibilities - pushing, polling, etc.
There is little point in proceeding if the project cannot be accomplished.
I am hoping for some investigatory work.
Your fee is for getting some code working and showing proof of concept.
Some clues to the puzzle will be easy.
I need someone that can find the answer to the difficult parts of the puzzle.
i.e. is "data centre" requesting updates or expecting a stream?
What is the "data centre's" response to a broken stream?
What is the "data server's" response to an uncontactable client?
How often are requests from "server" to "centre"?
How often are requests from "centre" to "server"?
Can the record format be determined, mimicked and streamed to charts successfully?
Will you be able to work out the request/response messages once you work out which data packets are simply price data?
Do you have the skills to hunt for the password that you know, and to see it in hex mode, so as to help you determine the header used for request and response messages?
Do you have the skills to see the headers used for common price data packets?
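Hunting for the known password in a capture is a plain byte search - once its offset is found, everything before it is candidate header. A sketch, assuming the password travels as plain ASCII (if this misses, it may be length-prefixed, NUL-terminated, UTF-16 or hashed):

```java
import java.nio.charset.StandardCharsets;

public class FindPassword {
    // Naive byte-level indexOf: returns the offset of needle in packet, or -1.
    public static int indexOf(byte[] packet, byte[] needle) {
        outer:
        for (int i = 0; i + needle.length <= packet.length; i++) {
            for (int j = 0; j < needle.length; j++)
                if (packet[i + j] != needle[j]) continue outer;
            return i;
        }
        return -1;
    }

    public static int find(byte[] packet, String password) {
        return indexOf(packet, password.getBytes(StandardCharsets.US_ASCII));
    }

    public static void main(String[] args) {
        // Fabricated example packet: 5 header-ish bytes, then the password.
        byte[] packet = "\u0002\u0000REQmyPass\u0000EURUSD"
                .getBytes(StandardCharsets.US_ASCII);
        System.out.println(find(packet, "myPass")); // 5 -> bytes 0-4 are header
    }
}
```

The same search with known symbol names ("EURUSD") helps locate the headers of the price data packets.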