How to discover profitable automation strategies?

Advanced automation available in Guardian - Chat with others and share files here.
azzadeath
Posts: 13
Joined: Sat Feb 18, 2017 12:04 pm

Anbell wrote:
Thu Sep 03, 2020 12:26 am
azzadeath wrote:
Wed Sep 02, 2020 2:54 pm
I run an automated strategy just through Bet Angel that is profitable long term. It has limits, though, and the drawdowns can last up to 2 months :o but it's generally a set-and-forget job
How do you load the markets every day?
Through Guardian, as you would normally.
Vladimir CC
Posts: 88
Joined: Wed Jan 22, 2020 1:13 pm

spreadbetting wrote:
Mon Aug 31, 2020 7:31 pm
Crazyskier wrote:
Mon Aug 31, 2020 5:30 pm
I see a lot of automated stuff on horses and dogs in particular that are surely very profitable long term, but impossible to replicate using standard software such as BA, etc.
Time to get out those coding books and write your own; there's plenty of help on the developers' site, and Liam even released his Python framework for free. At the end of the day it'll always be more efficient to use your own bespoke software rather than trying to fudge your strategies within someone else's framework. But it all boils down to exactly what you're doing and how much effort you want to put in. Off-the-shelf software may well do all you need; if not, it's time to get out the textbooks or the chequebook and get someone else to code it.
By "standard software such as BA" do you mean rules-file automation or Excel & VBA automation? I'm still trading manually and I'm looking to write some VBA for my automation. Do people have success with Excel & VBA bots? I mean, even though it's not Python/Java it's still coding. I was hoping I could work on coding something successful without paying 200 quid for API access right from the start.
spreadbetting
Posts: 3140
Joined: Sun Jan 31, 2010 8:06 pm

All depends on what you're trying to do. I have Excel-based bots that turn over decent profits, as I'm sure many others do.

It's just that sometimes, the more you bot, the more you realise you can probably do things faster and more efficiently, save data, get additional data etc. by writing your own rather than using someone else's access to the API.
abgespaced
Posts: 176
Joined: Sun Aug 23, 2020 2:25 am
Location: Australia

I guess it's my turn to weigh in on how I've been approaching the markets. Not that I'm entitled to much of an opinion, as I have only been here since breakfast! But there are some things I've been noticing.

Warning! I may have no idea what I'm talking about :lol:

1. Don't trust early results
I started with a basic supposition based on a manual strategy that I've had some limited success with. I ran it through practice mode first to make sure I wasn't losing a ton of money, then went live. I left it on most of the day and was up in profit by the end, though not by much. I understand variance so crossed my fingers and left it on overnight. In the morning it had deteriorated into a loss.

I knew this might happen, so I left my stakes very small. But I can imagine someone seeing the early profits and thinking this is great, increasing the stakes only to see a big loss the next day. Don't do this!

2. Only adjust one thing at a time
I always feel tempted to bring in 3-4 changes when something isn't working. It's hard to resist but it's better in the end. If I go about changing more than one thing at a time I might be introducing or omitting a second or third factor that just confounds the measurements. Better to go slow and play things out to a definite conclusion, as others have said.

3. Use small stakes and get real data
I started out using Guardian in practice mode but turned to the real market once I could see my automation working as intended. With that switch my stakes dropped to about a 100th of what they were in practice mode. I'm taking 5c a trade and turning it into hard data that I can analyse and pivot on.

Those are the 3 main things I'm focusing on. I guess the next area of improvement is the database management side of things. The market reports folder is getting messy and I have no database management experience besides copy-pasting things in and out of Excel. So perhaps some kind of SQL crash course would be beneficial. I just don't want to lose track of the results I've got so far and get stuck going around in circles.
Crazyskier
Posts: 1157
Joined: Sat Feb 06, 2016 6:36 pm

Vladimir CC wrote:
Thu Sep 03, 2020 9:56 pm
spreadbetting wrote:
Mon Aug 31, 2020 7:31 pm
Crazyskier wrote:
Mon Aug 31, 2020 5:30 pm
I see a lot of automated stuff on horses and dogs in particular that are surely very profitable long term, but impossible to replicate using standard software such as BA, etc.
Time to get out those coding books and write your own; there's plenty of help on the developers' site, and Liam even released his Python framework for free. At the end of the day it'll always be more efficient to use your own bespoke software rather than trying to fudge your strategies within someone else's framework. But it all boils down to exactly what you're doing and how much effort you want to put in. Off-the-shelf software may well do all you need; if not, it's time to get out the textbooks or the chequebook and get someone else to code it.
By "standard software such as BA" do you mean rules-file automation or Excel & VBA automation? I'm still trading manually and I'm looking to write some VBA for my automation. Do people have success with Excel & VBA bots? I mean, even though it's not Python/Java it's still coding. I was hoping I could work on coding something successful without paying 200 quid for API access right from the start.
Though I use Excel to record results and data, I'm not conversant with VBA and use only off-the-shelf automation. The new 5k hourly transaction limit (up from 1k) has allowed far more experimentation with high-frequency 1-tick stuff than before, and this is where I'm seeing far superior systems, mostly based on WOM, that re-bet only when their current bet isn't at the best price. For me, very short fill-and-kills are the workaround, but the main issue there is that you are always at the back of the queue each time, while those using code remain in the queue until the price moves. That gives a huge advantage on certain events, like the horses the other day with a 1.09 shot called 'Enable'. There were massive volumes traded, and often are on GRP1/GRP2 races, but I'm starting from a point of disadvantage each time I use standard software, no matter how sophisticated the settings.

CS
firlandsfarm
Posts: 2688
Joined: Sat May 03, 2014 8:20 am

Anbell wrote:
Thu Sep 03, 2020 12:26 am
azzadeath wrote:
Wed Sep 02, 2020 2:54 pm
I run an automated strategy just through Bet Angel that is profitable long term. It has limits, though, and the drawdowns can last up to 2 months :o but it's generally a set-and-forget job
How do you load the markets every day?
Hi Anbell, you may find something in this thread that is of interest.
Anbell
Posts: 2005
Joined: Fri Apr 05, 2019 2:31 am

firlandsfarm wrote:
Thu Sep 10, 2020 6:08 am
Anbell wrote:
Thu Sep 03, 2020 12:26 am
azzadeath wrote:
Wed Sep 02, 2020 2:54 pm
I run an automated strategy just through Bet Angel that is profitable long term. It has limits, though, and the drawdowns can last up to 2 months :o but it's generally a set-and-forget job
How do you load the markets every day?
Hi Anbell, you may find something in this thread that is of interest.
That's it. Thanks. I installed AutoHotkey a while back but never gave it a spin.
firlandsfarm
Posts: 2688
Joined: Sat May 03, 2014 8:20 am

Hi abgespaced, you sound like my kinda guy; I think you're talking about operating in a similar way to how I do.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
1. Don't trust early results
I started with a basic supposition based on a manual strategy that I've had some limited success with. I ran it through practice mode first to make sure I wasn't losing a ton of money, then went live. I left it on most of the day and was up in profit by the end, though not by much. I understand variance so crossed my fingers and left it on overnight. In the morning it had deteriorated into a loss.
Agreed. In fact I don't trust 3-month results! I only increase the stakes after I have made sufficient profit to cover the increase; I don't use new money to cover increasing stakes.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
2. Only adjust one thing at a time
I always feel tempted to bring in 3-4 changes when something isn't working. It's hard to resist but it's better in the end. If I go about changing more than one thing at a time I might be introducing or omitting a second or third factor that just confounds the measurements. Better to go slow and play things out to a definite conclusion, as others have said.
Agreed. How can you tell if one change increased the profits and another reduced them? Building a system takes a lot of time … rushing usually results in the poor house.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
3. Use small stakes and get real data
I started out using Guardian in practice mode but turned to the real market once I could see my automation working as intended. With that switch my stakes dropped to about a 100th of what they were in practice mode. I'm taking 5c a trade and turning it into hard data that I can analyse and pivot on.
Agreed. And practice mode can be a pain, having to add the winners in order to evaluate the results.
abgespaced wrote:
Sat Sep 05, 2020 10:44 am
Those are the 3 main things I'm focusing on. I guess the next area of improvement is the database management side of things. The market reports folder is getting messy and I have no database management experience besides copy-pasting things in and out of Excel. So perhaps some kind of SQL crash course would be beneficial. I just don't want to lose track of the results I've got so far and get stuck going around in circles.
If you are serious about your betting, and I guess you are, don't use Excel to archive data. By all means use it to harvest data, but then transfer it to a proper database for storage and analysis. You will soon see the shortcomings of Excel. I use Access from MS Office. You don't need to know any SQL to get started; it has a graphical interface … you drag and drop what you want, give it some filter parameters and use a few formulae to process the data. Access can handle a database file of up to 2GB; as you grow the database you may find that restrictive. No problem: you can add MS SQL Server Express to your computer (I'm assuming you are using a Windows PC) and link it to the Access database. SQL Server Express is free and can handle a database of up to 10GB.

OK, you are now moving into the world of real databases. One thing many don't realise when first embarking on SQL is that it is only a query language; it has no user interface. It is what is termed, in database parlance, the BackEnd (BE). You will need a FrontEnd (FE) to communicate with the data. You can use VB to create your own FE, but I use Access as mine. It lets me do quick-and-dirty data retrievals when experimenting, and when it's a finished product I move it to SQL.

Once you have SQL Server installed you can move your data tables to it all at once or one at a time, because SQL Server will be controlled by Access: you continue to use Access to design and run your data queries even though the data lives in SQL Server. Access also has a facility that translates queries designed with the graphical interface into SQL, so you can use that to help learn SQL. Unfortunately it's not quite the same SQL as used in Server Express, but very close. As part of my SQL learning process I take the SQL produced by Access and translate it to Server SQL; only a few everyday functions differ. So now you are using Access to tell SQL Server to run QueryA or QueryB, and when Access has the result from SQL Server, how to present it to you. Simple! :)
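The BackEnd/FrontEnd split described above can be seen without installing Access or SQL Server at all. A minimal sketch using Python's built-in sqlite3 module as a stand-in back end (the table and column names are invented for illustration):

```python
import sqlite3

# The "back end" (BE): a store that speaks only SQL and has no user interface.
con = sqlite3.connect(":memory:")  # in-memory stand-in for a real database file
con.execute("CREATE TABLE results (race_id INTEGER, profit REAL)")
con.executemany("INSERT INTO results VALUES (?, ?)",
                [(1, 2.50), (2, -1.00), (3, 0.75)])

# The "front end" (FE): whatever presents the data to you. Access plays this
# role in the post above; here it is two lines of Python.
total, = con.execute("SELECT SUM(profit) FROM results").fetchone()
print(f"Total profit: {total:.2f}")  # → Total profit: 2.25
```

The same division holds whatever tools you pick: the SQL stays almost identical, only the front end changes.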
Anbell
Posts: 2005
Joined: Fri Apr 05, 2019 2:31 am

firlandsfarm wrote:
Thu Sep 10, 2020 6:57 am
If you are serious about your betting, and I guess you are, don't use Excel to archive data. By all means use it to harvest data, but then transfer it to a proper database for storage and analysis. You will soon see the shortcomings of Excel. I use Access from MS Office. You don't need to know any SQL to get started; it has a graphical interface … you drag and drop what you want, give it some filter parameters and use a few formulae to process the data. Access can handle a database file of up to 2GB; as you grow the database you may find that restrictive. No problem: you can add MS SQL Server Express to your computer (I'm assuming you are using a Windows PC) and link it to the Access database. SQL Server Express is free and can handle a database of up to 10GB.

OK, you are now moving into the world of real databases. One thing many don't realise when first embarking on SQL is that it is only a query language; it has no user interface. It is what is termed, in database parlance, the BackEnd (BE). You will need a FrontEnd (FE) to communicate with the data. You can use VB to create your own FE, but I use Access as mine. It lets me do quick-and-dirty data retrievals when experimenting, and when it's a finished product I move it to SQL.

Once you have SQL Server installed you can move your data tables to it all at once or one at a time, because SQL Server will be controlled by Access: you continue to use Access to design and run your data queries even though the data lives in SQL Server. Access also has a facility that translates queries designed with the graphical interface into SQL, so you can use that to help learn SQL. Unfortunately it's not quite the same SQL as used in Server Express, but very close. As part of my SQL learning process I take the SQL produced by Access and translate it to Server SQL; only a few everyday functions differ. So now you are using Access to tell SQL Server to run QueryA or QueryB, and when Access has the result from SQL Server, how to present it to you. Simple! :)
This is very helpful. Thanks.

My spreadsheets are 160,000 rows deep with many SUMIFs and VLOOKUPs and whatnot, and are about to break my machine. You make it sound simple.
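For anyone making the same move away from big SUMIF/VLOOKUP sheets, those Excel idioms map fairly directly onto SQL. A sketch using Python's built-in sqlite3 (the table and column names are hypothetical, not from this thread):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bets (course TEXT, profit REAL)")
con.executemany("INSERT INTO bets VALUES (?, ?)",
                [("Ascot", 1.5), ("York", -0.5), ("Ascot", 2.0)])

# SUMIF(course_range, "Ascot", profit_range) becomes a filtered aggregate:
ascot, = con.execute(
    "SELECT SUM(profit) FROM bets WHERE course = 'Ascot'").fetchone()

# A VLOOKUP against a second sheet becomes a JOIN against a second table:
con.execute("CREATE TABLE courses (course TEXT, country TEXT)")
con.execute("INSERT INTO courses VALUES ('Ascot', 'UK')")
rows = con.execute("""
    SELECT b.course, b.profit, c.country
    FROM bets b JOIN courses c ON b.course = c.course
""").fetchall()  # only the two Ascot bets match the lookup table
```

Unlike a sheet full of volatile formulas, the database only does this work when you ask for it, which is why 160,000 rows stops being a problem.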
mcgoo
Posts: 898
Joined: Thu Jul 18, 2013 12:30 pm

Might have to look at Access and SQL again (dabbled years ago). The thing that is breaking my brain today is whether to extend my 50% stake/profit take-out from 4 ticks, or to extend the pull-back threshold to x ticks so that I don't get shaken out by a short-term move after taking profit. It hurts! :ugeek: :D
Edit: Or both.. see, it hurts!
firlandsfarm
Posts: 2688
Joined: Sat May 03, 2014 8:20 am

Anbell wrote:
Thu Sep 10, 2020 7:41 am
This is very helpful. Thanks.

My spreadsheets are 160,000 rows deep with many SUMIFs and VLOOKUPs and whatnot, and are about to break my machine. You make it sound simple.
You're welcome, it's nice to have the opportunity to return the help I have been given by forumites.

My racing database is 2.2 million rows × 50 columns in the largest table, the Horse-in-Race table, which stores all the race-specific data for each horse in a race. It's supported by tables for the Races data, Jockeys, Trainers, Going, (non-race-specific) Horse data, etc., but it can sort and display in a few seconds.

One thing you will learn with a real database is the use of relational tables and indexes. Indexes tell the software where a record is in a table, so that it doesn't have to go through the data row by row to find things. You don't index everything, just the columns you usually search on. Relational tables (referred to as 'normalisation' in the database world) are where you have a separate table for each field with commonly repeated entries. So in my example, each Horse-in-Race entry stores the ID number of the runner from the Horses table, and that gives me the name, sex, DOB, sire, dam etc. of the horse for every race it runs in, while storing that data only once. Relational tables may sound more complicated than Excel, but once you start thinking of data in categories it falls into place and lets you apply filters more efficiently. You can of course still use an all-encompassing single table while you think about how to design the data! :)
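The Horses / Horse-in-Race layout described above can be sketched in a few lines. This uses Python's sqlite3 as an illustrative stand-in for Access/SQL Server; the column names are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Static horse data is stored once...
con.execute("CREATE TABLE horses (id INTEGER PRIMARY KEY, name TEXT, sire TEXT)")
# ...and every race entry refers to it by ID (the Horse-in-Race idea).
con.execute("""CREATE TABLE horse_in_race
               (race_id INTEGER, horse_id INTEGER, finish_pos INTEGER)""")
# Index only the column you usually search on, so lookups avoid a
# row-by-row scan of the whole table.
con.execute("CREATE INDEX idx_hir_horse ON horse_in_race (horse_id)")

con.execute("INSERT INTO horses VALUES (1, 'Enable', 'Nathaniel')")
con.executemany("INSERT INTO horse_in_race VALUES (?, ?, ?)",
                [(101, 1, 1), (102, 1, 2)])

# One join recovers the horse's details for every run without duplicating them:
runs = con.execute("""
    SELECT h.name, h.sire, r.race_id, r.finish_pos
    FROM horse_in_race r JOIN horses h ON h.id = r.horse_id
""").fetchall()
```

Change the horse's details once in `horses` and every historical run picks up the correction — the main payoff of normalisation over a flat sheet.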
abgespaced
Posts: 176
Joined: Sun Aug 23, 2020 2:25 am
Location: Australia

firlandsfarm wrote:
Thu Sep 10, 2020 6:57 am
Hi abgespaced, you sound like my kinda guy; I think you're talking about operating in a similar way to how I do.
Thanks my dude! Great points about Access. I'll look into it. Anything that can make the process easier.

Question - how do you move the data into Access in the first place? Does BA have an import feature like it does for Excel?
abgespaced
Posts: 176
Joined: Sun Aug 23, 2020 2:25 am
Location: Australia

mcgoo wrote:
Thu Sep 10, 2020 8:42 am
Might have to look at Access and SQL again (dabbled years ago). The thing that is breaking my brain today is whether to extend my 50% stake/profit take-out from 4 ticks, or to extend the pull-back threshold to x ticks so that I don't get shaken out by a short-term move after taking profit. It hurts! :ugeek: :D
Edit: Or both.. see, it hurts!
Can't you run separate tests for both?
firlandsfarm
Posts: 2688
Joined: Sat May 03, 2014 8:20 am

abgespaced wrote:
Thu Sep 10, 2020 11:48 am
Question - how do you move the data into Access in the first place? Does BA have an import feature like it does for Excel?
Access is only an archiving app; it does not replace Excel as a data-capture tool. Some of my uses are to download the Betfair BSP files monthly, merge them into one file and then import them into Access for assessment. Likewise I use some of the data-capture worksheets available on the forum to capture market prices, and then link the sheet to Access for import and assessment. Basically, any data you can get as a text or Excel file, or copy from a website page and paste into Excel, you can then import into Access. I don't know if you use the football data at Football-Data.co.uk, but you can download that as an Excel file and import it too.

The link between Access and Excel goes both ways, so I harvest data in Excel, import it to Access/SQL, query the data in Access, and sometimes link the extracted data back to Excel if it can process the extraction more efficiently; for example, Access does not have a graphing facility. They can be linked so that they work in parallel, using whichever is most suited to your needs.
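As a rough illustration of the merge-then-import step (monthly BSP files combined into one file before loading into Access), here is a sketch in Python; the file names and two-column layout are assumptions, not the real BSP format:

```python
import csv, glob, os, tempfile

# Stand-ins for two monthly BSP downloads, each with the same header row.
folder = tempfile.mkdtemp()
for month, data in [("08", [["EventA", "2.5"]]), ("09", [["EventB", "3.1"]])]:
    with open(os.path.join(folder, f"bsp_2020_{month}.csv"), "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["event", "bsp"])  # header row in every monthly file
        w.writerows(data)

# Merge: keep the header from the first file only, append data rows from the rest.
merged = []
files = sorted(glob.glob(os.path.join(folder, "bsp_2020_*.csv")))
for i, path in enumerate(files):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
        merged.extend(rows if i == 0 else rows[1:])

# 'merged' is now one header plus all data rows, ready to import into Access.
```

Dropping the duplicate headers before the import is the main point; a stray header row mid-file is exactly the kind of thing that breaks a typed Access import.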
ShaunWhite
Posts: 9731
Joined: Sat Sep 03, 2016 3:42 am

Access and SQL are great, but quite a learning curve if you're not used to that world. There's a perfectly adequate way to do it using just Excel, though.

Basically, it's never going to be necessary to access all of your data at one time, because that would just encourage backfitting; i.e. you'll always be working with in-sample and out-of-sample data.

With that in mind, you could organise your raw data into quarterly chunks (or some other subdivision). Then, rather than containing data, your analysis sheet just contains references to cells in a file called something like EnqData. When you want to run your analysis on a different set of data, just rename the appropriate archive file to EnqData and it'll be pulled into the analysis sheet. (Or you could find/replace the formulas in Excel to point at a different source; either is fine.)

The premise is to separate your archive from your reporting database, because the reporting set is always going to be smaller than your entire dataset.

It's not ideal and adds a small extra step, but it's easy and familiar, and it also speeds up your analysis because the sheet doing the work only contains the minimum amount of data you need for any given job. You can also easily swap, say, UK horses for Aus horses, or even dogs, and run the same tests.

A huge monolithic database isn't necessary. Even though I use SQL, my individual physical data files each contain one sport for just one day. For analysis I pull in only the days I need (maybe even- or odd-numbered days, or 6 random months out of 12) into my EnquiryData.mdb. The reason my archive is split is that I collect almost 2GB of data a day; that very quickly becomes a ridiculously unwieldy file, especially for backups etc.
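The chunked-archive idea above (pull only the chunks you need into an enquiry dataset, leaving the archive untouched) can be sketched in a few lines of Python; the quarterly file names and layout are invented for illustration:

```python
import csv, os, tempfile

# Stand-in quarterly archive files, one per chunk of raw results.
archive = tempfile.mkdtemp()
for quarter, rows in [("2020Q1", [["race1", "1.0"]]),
                      ("2020Q2", [["race2", "-0.5"]]),
                      ("2020Q3", [["race3", "2.0"]])]:
    with open(os.path.join(archive, f"{quarter}.csv"), "w", newline="") as f:
        csv.writer(f).writerows(rows)

def load_chunks(quarters):
    """Pull only the requested chunks into the analysis dataset;
    the rest of the archive never gets loaded."""
    data = []
    for q in quarters:
        with open(os.path.join(archive, f"{q}.csv"), newline="") as f:
            data.extend(csv.reader(f))
    return data

# In-sample run on Q1 + Q3, holding Q2 back as out-of-sample data:
in_sample = load_chunks(["2020Q1", "2020Q3"])
```

Swapping the list of quarters plays the same role as renaming an archive file to EnqData: the analysis code never changes, only which chunks feed it.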