How to discover profitable automation strategies?

firlandsfarm
Posts: 2724
Joined: Sat May 03, 2014 8:20 am

Anbell wrote:
Thu Sep 03, 2020 12:26 am
azzadeath wrote:
Wed Sep 02, 2020 2:54 pm
I run an automated strategy just through Bet Angel that is profitable long term. It has limits though, and the drawdowns can last up to 2 months :o but it's generally a set-and-forget job
How do you load the markets every day?
Hi Anbell, you may find something in this thread that is of interest.
Anbell
Posts: 2126
Joined: Fri Apr 05, 2019 2:31 am

firlandsfarm wrote:
Thu Sep 10, 2020 6:08 am
Anbell wrote:
Thu Sep 03, 2020 12:26 am
azzadeath wrote:
Wed Sep 02, 2020 2:54 pm
I run an automated strategy just through Bet Angel that is profitable long term. It has limits though, and the drawdowns can last up to 2 months :o but it's generally a set-and-forget job
How do you load the markets every day?
Hi Anbell, you may find something in this thread that is of interest.
That's it. Thanks. I installed AutoHotkey a while back but never gave it a spin.
firlandsfarm
Posts: 2724
Joined: Sat May 03, 2014 8:20 am

Hi abgespaced, you sound like my kinda guy. I think you are talking about operating in a similar way to how I do.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
1. Don't trust early results
I started with a basic supposition based on a manual strategy that I've had some limited success with. I ran it through practice mode first to make sure I wasn't losing a ton of money, then went live. I left it on most of the day and was up in profit by the end, though not by much. I understand variance, so I crossed my fingers and left it on overnight. In the morning it had deteriorated into a loss.
Agreed. In fact I don't trust 3-month results! I only increase the stakes after I have made sufficient profit to cover the increase; I don't use new money to fund increased stakes.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
2. Only adjust one thing at a time
I always feel tempted to bring in 3-4 changes when something isn't working. It's hard to resist, but resisting is better in the end. If I go about changing more than one thing at a time I might be introducing or omitting a second or third factor that just confounds the measurements. Better to go slow and play things out to a definite conclusion, as others have said.
Agreed. How can you tell if one change increased the profits and another reduced them? Building a system takes a lot of time … rushing usually leads to the poor house.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
3. Use small stakes and get real data
I started out using Guardian in practice mode but turned to the real market once I could see my automation working as intended. With that switch my stakes dropped to about a 100th of what they were in practice mode. I'm taking 5c a trade and turning it into hard data that I can analyse and pivot on.
Agreed. And practice mode can be a pain when it comes to adding the winners in order to evaluate the results.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
Those are the 3 main things I'm focusing on. I guess the next area of improvement is the database management side of things. The market reports folder is getting messy and I have no database management experience besides copy-pasting things in and out of Excel. So perhaps some kind of SQL crash course or something would be beneficial. I just don't want to lose track of the results I've got so far and get stuck going around in circles.
If you are serious about your betting, and I guess you are, don't use Excel to archive data. By all means use it to harvest data, but then transfer it to a proper database facility for storage and analysis. You will soon see the shortcomings of Excel. I use Access from MS Office. You don't need to know anything about SQL to get started: it has a graphical interface … you drag and drop what you want, give it some filter parameters and use a few formulae to process the data. Access can handle a database file of up to 2 GB; as you grow the database you may find that restrictive. No problem. You can add MS SQL Server Express to your computer (I'm assuming you are using a Windows PC) and link it to the Access database. SQL Server Express is free and can handle a database of up to 10 GB.

OK, you are now moving into the world of real databases. One thing many don't realise when first embarking on SQL is that it is only a coding language; it has no user interface. It is what is termed in database parlance the BackEnd (BE). You will need a FrontEnd (FE) to give you the ability to communicate with the data. You can use VB to create your own FE, but I use Access as my FE. It allows me to do quick-and-dirty data retrievals when experimenting, and when it's a finished product I move it to SQL.

Once you have SQL Server installed you can move your data tables to SQL all at once or one at a time, because SQL Server will be controlled by Access. You continue to use Access to design and implement your data queries even though your data is in SQL Server. Access also has a facility that translates the queries you design with the graphical interface into SQL, so you can use that to help learn SQL. Unfortunately it's not quite the same SQL as used in Server Express, but it's very close. As part of my SQL learning process I take the SQL produced by Access and translate it to Server SQL; there are only a few everyday code functions that differ. So now you are using Access to tell SQL Server to run QueryA or QueryB, and when it has the result from SQL Server, how to present it to you. Simple! :)
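
A minimal sketch of that dialect gap, using an invented HorseInRace table and columns purely for illustration: the logic is identical, and only a few everyday functions and the date delimiters change between Access SQL and the T-SQL used by SQL Server Express.

-- Access (JET/ACE) SQL — hypothetical table and column names
SELECT Course,
       IIf(IsNull(BSP), 0, BSP) AS StartPrice,
       Format(RaceDate, "yyyy-mm") AS RaceMonth
FROM HorseInRace
WHERE RaceDate >= #2020-01-01#;

-- The same query translated to T-SQL for SQL Server Express
SELECT Course,
       ISNULL(BSP, 0) AS StartPrice,
       FORMAT(RaceDate, 'yyyy-MM') AS RaceMonth
FROM HorseInRace
WHERE RaceDate >= '2020-01-01';

(FORMAT needs SQL Server 2012 or later; older versions would use CONVERT instead.)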
Anbell
Posts: 2126
Joined: Fri Apr 05, 2019 2:31 am

firlandsfarm wrote:
Thu Sep 10, 2020 6:57 am
If you are serious about your betting, and I guess you are, don't use Excel to archive data. By all means use it to harvest data, but then transfer it to a proper database facility for storage and analysis. You will soon see the shortcomings of Excel. I use Access from MS Office. You don't need to know anything about SQL to get started: it has a graphical interface … you drag and drop what you want, give it some filter parameters and use a few formulae to process the data. Access can handle a database file of up to 2 GB; as you grow the database you may find that restrictive. No problem. You can add MS SQL Server Express to your computer (I'm assuming you are using a Windows PC) and link it to the Access database. SQL Server Express is free and can handle a database of up to 10 GB.

OK, you are now moving into the world of real databases. One thing many don't realise when first embarking on SQL is that it is only a coding language; it has no user interface. It is what is termed in database parlance the BackEnd (BE). You will need a FrontEnd (FE) to give you the ability to communicate with the data. You can use VB to create your own FE, but I use Access as my FE. It allows me to do quick-and-dirty data retrievals when experimenting, and when it's a finished product I move it to SQL.

Once you have SQL Server installed you can move your data tables to SQL all at once or one at a time, because SQL Server will be controlled by Access. You continue to use Access to design and implement your data queries even though your data is in SQL Server. Access also has a facility that translates the queries you design with the graphical interface into SQL, so you can use that to help learn SQL. Unfortunately it's not quite the same SQL as used in Server Express, but it's very close. As part of my SQL learning process I take the SQL produced by Access and translate it to Server SQL; there are only a few everyday code functions that differ. So now you are using Access to tell SQL Server to run QueryA or QueryB, and when it has the result from SQL Server, how to present it to you. Simple! :)
This is very helpful. Thanks.

My spreadsheets are 160,000 rows deep with many SUMIFs and VLOOKUPs and whatnot, and are about to break my machine. You make it sound simple.
mcgoo
Posts: 898
Joined: Thu Jul 18, 2013 12:30 pm

Might have to look at Access and SQL again (I dabbled years ago). The thing that is breaking my brain today is whether to extend my 50% stake/profit take-out from 4 ticks or to extend the pull-back threshold to x ticks so that I don't get shaken out after taking profit by a short-term move. It hurts! :ugeek: :D
Edit: Or both … see, it hurts!
firlandsfarm
Posts: 2724
Joined: Sat May 03, 2014 8:20 am

Anbell wrote:
Thu Sep 10, 2020 7:41 am
This is very helpful. Thanks.

My spreadsheets are 160,000 rows deep with many SUMIFs and VLOOKUPs and whatnot, and are about to break my machine. You make it sound simple.
You're welcome; it's nice to have the opportunity to return the help I have been given by forumites.

My racing database is 2.2 million rows × 50 columns in the largest table, the Horse-in-Race table that stores all the race-specific data for each horse in a race. It's supported by tables for the Races data, Jockeys, Trainers, Going, (non-race-specific) Horse data, etc., but it can sort and display in a few seconds.

One thing you will learn with a real database is the use of relational tables and indexes. Indexes tell the software where a record is in a table so that it doesn't have to go through all the data row by row to find stuff. You don't index everything, just the columns you usually search on. Relational tables (referred to as 'normalisation' in the database world) are where you have a table for each field with common multiple entries. So in my example the Horse-in-Race table refers to the ID of the horse in the Horses table: for each Horse-in-Race entry I store the ID number of each runner from the Horses table, and that gives me the name, sex, DOB, Sire, Dam etc. of the horse for every race it runs in while only storing that data once. Relational tables may sound more complicated than Excel, but once you start thinking of data in categories it falls into place and allows you to apply filters more efficiently. You can of course still do an all-encompassing single table while you think about how to design the data! :)
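
As a rough sketch (in T-SQL, with invented table and column names rather than the actual schema), the Horses / Horse-in-Race split and the "index what you search on" advice look something like this:

-- Static horse details stored once
CREATE TABLE Horses (
    HorseID   INT IDENTITY(1,1) PRIMARY KEY,
    HorseName NVARCHAR(100) NOT NULL,
    Sex       CHAR(1),
    DOB       DATE,
    Sire      NVARCHAR(100),
    Dam       NVARCHAR(100)
);

-- One row per runner per race, pointing back to the Horses table
CREATE TABLE HorseInRace (
    HorseInRaceID INT IDENTITY(1,1) PRIMARY KEY,
    RaceID        INT NOT NULL,   -- would reference a Races table
    HorseID       INT NOT NULL REFERENCES Horses (HorseID),
    BSP           DECIMAL(8, 2),
    FinishPos     INT
);

-- Index only the columns you usually filter or join on
CREATE INDEX IX_HorseInRace_HorseID ON HorseInRace (HorseID);
CREATE INDEX IX_HorseInRace_RaceID  ON HorseInRace (RaceID);

-- The horse's details are joined in for every run without being re-stored
SELECT h.HorseName, h.DOB, r.RaceID, r.BSP, r.FinishPos
FROM HorseInRace AS r
INNER JOIN Horses AS h ON h.HorseID = r.HorseID
WHERE h.HorseName = N'Example Runner';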
abgespaced
Posts: 176
Joined: Sun Aug 23, 2020 2:25 am
Location: Australia

firlandsfarm wrote:
Thu Sep 10, 2020 6:57 am
Hi abgespaced, you sound like my kinda guy. I think you are talking about operating in a similar way to how I do.
Thanks my dude! Great points about Access. I'll look into it. Anything that can make the process easier.

Question - how do you move the data into Access in the first place? Does BA have an import feature like it does for Excel?
abgespaced
Posts: 176
Joined: Sun Aug 23, 2020 2:25 am
Location: Australia

mcgoo wrote:
Thu Sep 10, 2020 8:42 am
Might have to look at Access and SQL again (I dabbled years ago). The thing that is breaking my brain today is whether to extend my 50% stake/profit take-out from 4 ticks or to extend the pull-back threshold to x ticks so that I don't get shaken out after taking profit by a short-term move. It hurts! :ugeek: :D
Edit: Or both … see, it hurts!
Can't you run separate tests for both?
firlandsfarm
Posts: 2724
Joined: Sat May 03, 2014 8:20 am

abgespaced wrote:
Thu Sep 10, 2020 11:48 am
Question - how do you move the data into Access in the first place? Does BA have an import feature like it does for Excel?
Access can only be used as an archiving app; it does not replace Excel as a data-capture alternative. Some of my uses are to download the Betfair BSP files monthly, merge them into one file and then import them into Access for assessment. Likewise I use some of the data-capture worksheets available in the forum to capture market prices and then link the sheet to Access for import and assessment. Basically, any data you can get as a text or Excel file, or can copy from a website page and paste into Excel, you can then import into Access. I don't know if you use the football data at Football-Data.co.uk, but you can download the football data as an Excel file and import it.

The link between Access and Excel goes both ways, so I harvest data in Excel, import it to Access/SQL, query the data in Access, and sometimes link the extracted data back to Excel if it can process the extraction more efficiently. For example, Access does not have a graph facility. They can be linked such that they work in parallel, using the one most suited to your needs.
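
For anyone who wants to script the import rather than use Access's External Data wizard or a linked table, Access SQL can read an Excel sheet directly. A minimal sketch, with made-up file paths, sheet and table names:

-- Make-table query: pull a harvested worksheet into a new Access table
SELECT *
INTO BspPricesRaw
FROM [Excel 12.0 Xml;HDR=YES;Database=C:\Data\BSP_2020-09.xlsx].[Sheet1$];

-- Append query: add the next month's download to the same archive table
INSERT INTO BspPricesRaw
SELECT *
FROM [Excel 12.0 Xml;HDR=YES;Database=C:\Data\BSP_2020-10.xlsx].[Sheet1$];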
ShaunWhite
Posts: 9731
Joined: Sat Sep 03, 2016 3:42 am

Access and SQL are great, but quite a learning curve if you're not used to that world. But there's a perfectly adequate way to do it using just Excel.

Basically, it's never going to be necessary to access all of your data at one time, because that would just encourage backfitting. I.e. you'll always just be using in- and out-of-sample data.

With that in mind you could organise your raw data into quarterly chunks (or some other subdivision). Then, rather than your analysis sheet containing data, it just contains references to cells in a file called something like EnqData. So when you want to run your analysis on different sets of data, just rename the appropriate archive file to EnqData and it'll be pulled into the analysis sheet. (Or you could find/replace the formulas in Excel to point at a different source; either is fine.)

The premise is to separate your archive from your reporting database, because the reporting database is always going to be smaller than your entire dataset.

It's not ideal and adds a small extra process, but it's easy, familiar, and also speeds up your analysis because the sheet doing all your analysis only contains the minimum amount of data you want for any given job. Also you can easily swap, say, UK horses for Aus horses, or even dogs, and run the same tests.

A huge monolithic database isn't necessary, and even though I use SQL my individual physical data files only contain 1 sport each for just 1 day each. For analysis I just pull in the days that I need (maybe even- or odd-numbered days, or 6 random months out of 12) into my EnquiryData.mdb. The reason my archive is split is because I collect almost 2 GB of data a day; that very quickly becomes a ridiculously unwieldy file, especially for backups etc.
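
A sketch of that "pull only the days you need" step, assuming the daily archive files are Access databases and using invented file and table names; Access SQL's IN clause reads a table straight out of another .mdb, and each statement is run separately:

-- Make-table query: start the enquiry set from the first archive day
SELECT *
INTO PricesEnquiry
FROM Prices IN 'C:\Archive\Horses_2020-09-08.mdb';

-- Append query: add each further day you want in the sample
INSERT INTO PricesEnquiry
SELECT *
FROM Prices IN 'C:\Archive\Horses_2020-09-09.mdb';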
ShaunWhite
Posts: 9731
Joined: Sat Sep 03, 2016 3:42 am

... that all sounds harder than it actually is. Ask if anything doesn't make sense. Basically just 3 things: an archive, an enquiry DB, and an analysis sheet. All separate entities.
spreadbetting
Posts: 3140
Joined: Sun Jan 31, 2010 8:06 pm

2 GB of data a day
:o

That's a hell of a lot of data. I still collect the Betfair SP files pretty much out of habit as it's automated; I used to look at the IPMIN/IPMAX data like a lot of people, but that goes back to 2008 and is only 621 MB.

Do you find it useful or is it a case of getting everything in case one day you might need it?
ShaunWhite
Posts: 9731
Joined: Sat Sep 03, 2016 3:42 am

spreadbetting wrote:
Thu Sep 10, 2020 4:05 pm
Do you find it useful or is it a case of getting everything in case one day you might need it?
The thing about data is you can't predict what you'll need in the future, and it's sod's law that you haven't got it when you do. So I'm a hoarder.
It's tick data, so it gets stupidly big pretty quickly. What I don't have, and slightly regret, is supplementary data that's not from Betfair, such as race class and all the runner fundamentals.

I don't use data more than a year old very often. If something hasn't made a $ in the last 12 months it's not worth going back any further. Data degrades anyway, and I can't be bothered to get into weighting data significance by age and all that.
Derek27
Posts: 23974
Joined: Wed Aug 30, 2017 11:44 am
Location: UK

I'm the opposite of you, Shaun. I always keep hard-drive backups but am too quick to permanently delete data (especially emails) that I later need.
ShaunWhite
Posts: 9731
Joined: Sat Sep 03, 2016 3:42 am

Derek27 wrote:
Thu Sep 10, 2020 4:54 pm
I'm the opposite of you, Shaun. I always keep hard-drive backups but am too quick to permanently delete data (especially emails) that I later need.
Manual vs auto init. Your edge is in your head, mine is in an SSD.