Category Archives: Business

How to Set Up Amazon S3 for Website Hosting

Amazon Web Services S3 is a great way to host static websites. Here’s how to set it up for website hosting.

If you want to run WordPress on Amazon S3, see Serverless WordPress.

This tutorial assumes you’ve already got an Amazon Web Services account.

  1. Go to Services and search for or select S3.

    Select the S3 service
  2. Click Create Bucket.

    Select create bucket
  3. Enter a name for your bucket; this must be globally unique. For instance, bobs-cool-hosting.

    Give your bucket a name and select a region
  4. Select a region to host your bucket in. This is where your files will be served from geographically, so it’s best to choose a location close to where your users visit from. Click Next.
  5. The options can be left at their defaults; just click Next.

    Options can be left default
  6. Permissions setup is important. By default, AWS S3 sets the bucket up to be secure and prevents it from being made publicly accessible. This is because so many people have set up buckets and accidentally or carelessly made them public, resulting in security breaches. We want our bucket to be public because we’re hosting a website, so uncheck all the Public access settings and click Next.

    Permissions settings
  7. On the Review page you may be warned that this bucket may become public. That’s OK, as it’s exactly what we intended, so click Create bucket.

    Review page
  8. We’ve now created our bucket; as you can see here, it’s marked “Objects can be public”. Click on the name of the bucket to open it.

    List of buckets
  9. Click the Properties tab, then click Static website hosting.

    Select static website hosting
  10. Click the option Use this bucket to host a website. Take note of the URL at the top; this will be used to access our website. Type in index.html as the index document and error.html as the error document. Click Save.

    Configure static website hosting
  11. If you now go to that URL, you’ll see it still returns 403 Forbidden. We now need to set up its permissions to enable public access.

    By default access is prevented
  12. Click on the Permissions tab, then Bucket Policy. Copy in the following policy, being sure to change the bucket name in the Resource field from “my-serverless-wp” to match the name of your bucket. Click Save.
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "PublicReadGetObject",
          "Effect": "Allow",
          "Principal": "*",
          "Action": "s3:GetObject",
          "Resource": "arn:aws:s3:::my-serverless-wp/*"
        }
      ]
    }

    Set up bucket policy
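
    If you prefer the command line, the same policy can be applied with the AWS CLI. This is just an optional alternative, assuming you’ve saved the policy above as policy.json:

    # Apply the bucket policy from a file (change the bucket name to match yours)
    aws s3api put-bucket-policy --bucket my-serverless-wp --policy file://policy.json
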
  13. Now create a test.html file with just a bit of text in it. On the Overview tab click Upload.

    Select Upload
    Select Upload
  14. Click Add File, select the file you created, and then click Next.
  15. Under Manage public permissions select Grant public read access to this object(s). Click Next.

    Set object to public
  16. On the Set properties page, the standard Storage Class is fine for this; click Next.

    Default properties are fine
  17. Click Upload; our file will then be displayed in the list.
  18. Go to the bucket URL from step 10, enter it in a browser and add “/test.html” at the end. You should see your test.html page displayed.

    Test page is now displayed
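
As an aside, the whole console process above can also be scripted. A rough AWS CLI equivalent, assuming the CLI is already configured and substituting your own bucket name and region, looks like this:

# Create the bucket in your chosen region
aws s3api create-bucket --bucket bobs-cool-hosting --region ap-southeast-2 \
    --create-bucket-configuration LocationConstraint=ap-southeast-2
# Allow the bucket to be made public (mirrors unchecking the console settings)
aws s3api put-public-access-block --bucket bobs-cool-hosting \
    --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false
# Enable static website hosting with the index and error documents
aws s3 website s3://bobs-cool-hosting/ --index-document index.html --error-document error.html
# Upload the test page with public read access
aws s3 cp test.html s3://bobs-cool-hosting/ --acl public-read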

Your S3 bucket is now ready to serve your website, but you’ll probably want to set up a DNS CNAME to give it a friendly domain name. I’ll explain how to do that in another article.

Serverless WordPress (sort of)

Here’s how I run my site (davidfindlay.com.au) in a sort-of-serverless way using Amazon S3. I say it’s only sort of serverless because you still need an Apache/MySQL/WordPress installation, but it doesn’t need to be running all the time and can just run on your local computer.

Why host your WordPress site on Amazon S3?

Firstly, S3 is very fast. WordPress hosted on LAMP has to bootstrap WordPress, talk to the database, process your request and generate a page before sending it to the browser. This all takes time. That overhead makes sense if you host dynamic content; if your content doesn’t change much, it doesn’t.

If you update your site maybe once a day, why have the HTML generated every time a visitor hits the site? With Static WordPress hosting you generate the HTML once when you make a change and the generated HTML is then served to each new visitor. This is much faster. 

As mentioned S3 is very fast, but it’s also scalable. If your site suddenly gets visited by 10000 people in an hour, S3 can handle it. Your typical WordPress installation on a LAMP hosting provider probably can’t. 

Secondly, static hosting is more secure. Because your WordPress installation is hidden behind a firewall on your local network, you don’t have to worry about security updates and zero-day exploits as much. Sure, you should still keep up to date, but because attackers don’t have any access to the PHP pages or database you’re kept much safer. Amazon has good security measures on S3 and as long as you use them, your bucket should stay safe.

Assumed Knowledge

  • Basic set up and installation of WordPress and WordPress plugins
  • Using hosting software such as MAMP or a LAMP server or Docker
  • Working at the command line

Step 1: WordPress installation

Firstly, install WordPress locally, perhaps using MAMP or an Apache/MySQL/PHP installation on a Linux box on your local network. How you do this part is up to you. I’ve actually got mine running on a small EC2 micro instance that I just turn on and off when I want to make changes to my site.

No one will actually visit this WordPress installation, so it can just be local on your machine, not accessible via the internet. For maximum safety, firewall it off so no one can reach it.

You’ll also need to install the AWS CLI. If you’re using an EC2 instance with an Amazon AMI, you’ll already have this. 

Step 2: Set up an S3 Bucket

You’ll need an Amazon Web Services account first; a free-tier account should be fine for most small sites for at least the first year. After that you may need to pay, but S3 is really cheap.

There are a lot of steps to setting up an S3 bucket for website hosting, so I’ve put them in a separate article here: How to Set Up Amazon S3 for Website Hosting.

Once you’ve got the S3 bucket set up, return here.

Step 3: Install Simply Static WordPress Plugin

This is pretty much a standard WordPress plugin install, so I won’t explain it in too much detail.

The Simply Static plugin automatically generates a plain HTML version of your site and exports it to a directory on your WordPress host.

Static means that it’s plain HTML, no PHP. It can run on any sort of hosting without needing a PHP or MySQL installation. 

Once Simply Static is installed, activate it.

  1. Select Simply Static, then Settings from the left hand menu.
  2. Set Destination URLs to Use Relative URLs.

    Simply Static settings
  3. Set Delivery Method to Local Directory.

    Simply Static settings, continued
  4. Set Local Directory to a suitable location, for instance on my linux installation “/var/www/html_static”. Take note of this path as you may need to modify the script in Step 4 to match.

Step 4: Configure AWS IAM user and AWS CLI

You’ll need an AWS IAM account set up to use the AWS CLI.

  1. Click Services at the top of the screen and in the search box type IAM. Click on the IAM option that appears in the drop down.
  2. Click Add User.

    Add IAM user
  3. Enter a user name such as “s3hosting”. Under Access Type, select Programmatic access. This is required so that the AWS CLI can use the user credentials. Click Next.

    Set up programmatic access
  4. Under Set Permissions, select Attach existing policies directly, then search for s3. Select the AmazonS3FullAccess policy. Click Next. Note that this policy means that using this AWS Access Key ID and Secret Key, someone could access any file in any bucket on your AWS account. This can be dangerous! 

    Select existing policy
  5. Continue through to the review page with default settings. The review page should look like this. Click Create User.

    Review and create user
  6. You’ve now created the AWS CLI user. You’ll need the Access key ID and Secret access key displayed on this page for the next part of the process.

    Note access key id and secret key
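
If you’d rather script the user creation, roughly equivalent AWS CLI commands (run as an account that already has IAM permissions; the user name is just my example from above) would be:

# Create the user, attach the S3 policy and generate programmatic credentials
aws iam create-user --user-name s3hosting
aws iam attach-user-policy --user-name s3hosting \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name s3hosting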

Next, move back to the terminal where you’ve installed your WordPress.

Run the aws configure command. You’ll need to supply your IAM user’s Access Key ID and Secret Key, as well as the default region, which should be the region your S3 bucket is in:

aws configure
Configure AWS CLI
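
The prompts look something like this (keys truncated; set the region to match your bucket):

$ aws configure
AWS Access Key ID [None]: AKIA...
AWS Secret Access Key [None]: ...
Default region name [None]: ap-southeast-2
Default output format [None]: json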

Create the following bash script and call it syncStatic.sh:

#!/bin/bash
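# Sync the locally generated static site up to the S3 bucket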
aws s3 sync /var/www/html_static s3://my-serverless-wp/

Change ‘my-serverless-wp’ to match the name of your bucket and you may need to change ‘/var/www/html_static’ to match the local directory you set in Step 3.
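
Make the script executable before its first run:

chmod +x syncStatic.sh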

Step 5: Generate Static HTML

In the WordPress Admin pages, select Simply Static from the side menu. Click Generate.

The log will show progress as the static HTML pages are generated. When the log shows “Done!” move on to the next step.

Step 6: Sync to S3

In your terminal, run the syncStatic.sh script to transfer the files to S3. If you’re running on a small EC2 instance this will be super quick; it’ll be a bit slower otherwise.

Step 7: Test the site

Go to your S3 public endpoint URL in your browser. For instance: http://my-serverless-wp.s3-website-ap-southeast-2.amazonaws.com/

You can get your URL from the S3 bucket configuration by going to Services->S3->Select your bucket->Properties->Static Web Hosting

Static website hosting url
Static website hosting url

After clicking on that URL or pasting it in your browser, you should be able to see your WordPress site and browse it.
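
You can also sanity-check the endpoint from the command line; an HTTP 200 response means the site is being served:

curl -I http://my-serverless-wp.s3-website-ap-southeast-2.amazonaws.com/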

Step 8: Set up DNS CNAME

Your site is now on the web, but it’s on an ugly Amazon AWS S3 URL. You don’t want to direct people to that.

The next step depends on how you want to host your site. You’ll need to set up a CNAME (canonical name) record which points your website domain to the AWS S3 bucket address.

I’ll show how to do this for Amazon Route 53 DNS hosting in another article.

Notes on Entry and Result Management at the Pan Pacific Masters Games Swimming 2018

Last week I attended my 3rd Pan Pacific Masters Games (PPMG) Swimming competition in the role of Chief Recorder. This is a role that I have created and developed over my time as Director of Recording for Masters Swimming Queensland, and one I believe is critically important to the running of successful large swimming meets.

Preparing to run the meet

In 2018 we had 564 competitors in the PPMG Swimming event. Many of these entrants came from outside Australia, with a large contingent from New Zealand, New Caledonia and China. This presents a significant challenge. Manual input of entries into the sports event management software, Hy-Tek Meet Manager, would take many days of work for volunteers, and in the past it has been very prone to error. Masters Swimming Queensland has its own online entry system which interoperates with Hy-Tek Meet Manager, but it is usually open only to members.

To handle this, since 2014 I’ve developed tools which allow the data from the PPMG Administration’s entry system to be imported into the Masters Swimming Queensland system. The system uses a multistep approach which allows errors to be detected and dealt with. The PPMG Administration has had a different data format for entries each time, so changes have had to be made to the system for each of the biennial events.

In Step 1, the CSV of the entries is uploaded to the MSQ Entry Manager system and a list of entries is created. In Step 2, matches between PPMG entrants and masters swimming members known by MSQ Entry Manager are flagged and linked. In Step 3, temporary event memberships are created in the MSQ Entry Manager system for non-members and international entrants. Then in Step 4, individual event entries are created for all entrants in the MSQ Entry Manager system.

Individual event entries include what is known as a seed time. This is the entrant’s estimation of what time they expect to swim in the event. This time is used to put entrants into heats with other entrants of similar capabilities. 

As part of Step 4 mentioned above, I’ve developed Natural Language Processing technology which takes a wide variety of time formats and converts them into the internally used quantity of seconds. For instance, the correct format for “2 minutes, 34.23 seconds” is “2:34.23”, but this may be entered by users as “2:34:23” or “2.34.23”. Or it may be spelled out as “2 min 34.23 sec”. I’ve had an automatic time normalisation system in place for some time, but a newly upgraded version is now able to handle all such formats and correctly understand the intention of the user when they typed in the time. I’ll be publishing a paper on this technique along with a reference implementation in the future.
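
To give a flavour of the problem (this is only a minimal sketch under my own simplifying assumptions, not the MSQ implementation), a naive normaliser can ignore the separators entirely and work from the runs of digits:

#!/bin/bash
# Illustrative sketch only: assumes the last two digit runs are seconds and
# hundredths, with optional minutes before them.
normalise_time() {
    local -a d
    mapfile -t d < <(grep -oE '[0-9]+' <<< "$1")
    local n=${#d[@]} minutes=0 seconds=0 hundredths=0
    (( n >= 1 )) && hundredths=${d[n-1]}
    (( n >= 2 )) && seconds=${d[n-2]}
    (( n >= 3 )) && minutes=${d[n-3]}
    awk -v m="$minutes" -v s="$seconds" -v h="$hundredths" \
        'BEGIN { printf "%.2f\n", m * 60 + s + h / 100 }'
}

normalise_time "2:34.23"          # 154.23
normalise_time "2 min 34.23 sec"  # 154.23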

From this point onwards the entry data can be handled in the MSQ system in the same way as we handle any swimming meet. Standard checks that I’ve developed were run against all entry times, looking to flag times that appeared to be too short (less than 20 seconds per 50 metres) or too long (greater than 2 minutes 30 seconds per 50 metres). I have plans to add automated checks against national and world record times, as well as against individual competitor personal bests, but there was not enough time to get these prepared for the PPMG2018 meet.
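
The check itself boils down to a pace bound. As a hypothetical sketch of the rule (not the production code):

# Illustrative sketch of the seed time sanity rule: flag any entry whose
# pace per 50 metres falls outside 20 seconds to 2 minutes 30 seconds.
check_seed_time() {
    local seconds=$1 distance=$2
    awk -v t="$seconds" -v d="$distance" 'BEGIN {
        pace = t / (d / 50)
        if (pace < 20 || pace > 150) print "FLAG"; else print "OK"
    }'
}

check_seed_time 154.23 200   # about 38.6s per 50m, so OK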

The ultimate result of this was that we had one of the cleanest sets of entry data we’ve ever had for a Masters Games. All errors found in the draft entry lists were due to user error by the entrants. Quite simply, they were caused by people typing in the wrong entry time, selecting the wrong events, or not knowing how long it would take them to swim a particular event.

There were some issues that carried over from the PPMG entry system. Where entrants had edited their entries on the PPMG entry system, the edits were not reflected in the exported data provided to sports organisers by PPMG. However this was easily rectified because I was able to publish draft lists and we had the time and capacity to make changes to entries before the start of the event. We were able to accept several late entries and late changes because our entry management systems were so efficient and refined.

In the final days before the meet, I produced meet programmes for printing and extracted statistics about competitors for use in the handouts to competitors. PPMG Administration required full updates on any changes to the entries for the swimming competition, so I used Trello to manage my workflow. I created lists for To Do, Doing, Waiting, Done, PPMG Informed and PPMG Information Not Required. When a new change request came in via any channel (email, phone, etc.), I immediately created a card for it in To Do. Where changes could not be actioned because further information was needed, they were put into Waiting, with notes about the next action required. Once complete, each card was moved into Done. From there I made a decision on whether or not PPMG Administration needed to be informed. If so, I emailed it to them in the next batch and then moved the card to PPMG Informed. Otherwise I’d put the card into PPMG Information Not Required, for changes that PPMG Administration didn’t need to know about. This allowed me to keep PPMG Administration fully informed on all changes they needed.

Unfortunately, there were some data corruption issues in the import this time. Some entrants who weren’t masters members were incorrectly imported into the system as female. This was quickly corrected before the day the meet started. It was isolated to just a small subset of the entries, which were able to be manually checked. The few that were missed were fixed when entrants checked the draft entry list. Others had club information that didn’t import correctly, partly because international masters were inconsistent about how they provided their club details. This would have to be resolved as the meet proceeded.

During the Competition

During the competition I oversaw all matters related to event entries and results. Actual operation of the timing system (Quantum Automated Officiating Equipment, or AOE) and the meet software (Hy-Tek Meet Manager 7) was handled by two highly skilled contracted staff members who work with the venue on a regular basis.

My role was to act as an interface between Masters Swimming Queensland and the recording staff to ensure that MSQ’s needs were met. I was responsible for changes to the programme, entries and the integrity of the results. 

Where changes were to be made to the programme on future days, I would handle these each night after competition. Where a change was to be made to a future event on the same competition day, this was handled by the recording operator. Changes to the currently running event were delegated to the Marshalling team, who would then inform recording. This approach ensures that entrants are able to change their entries flexibly as needed. If a competitor arrives late for a heat, marshalling is able to put them into an empty lane in another heat. Provided the information is given to recording in a timely fashion, the scoreboard and result information can be immediately updated to reflect the change and to ensure that the correct results go to the correct person.

I’ve always taken the approach that if I can accommodate an entrant’s request for a change, I will. I want the competitors to enjoy the event as much as possible, so they’ll want to return again in the future. Arbitrary rules based on perceived data management limitations prevent this. With the right team and the right procedures in place, result data management doesn’t limit changes to sporting event entries. In sporting events where individuals are competing directly on their own performance there is no good reason not to allow changes to programmes right up to the last minute.

Daily Routine

During a large swim meet my start of day routine is as follows:

  1. Check overnight scratchings and programme change requests. Action where possible.
  2. Produce a Meet Manager backup file for start of day, provide to Meet Manager operator.
  3. Produce Marshalling Sheets and provide to Marshalling, so they can get started with organising events and heats for the day. I also provide Marshalling with two copies of the programme.
  4. Produce Lane Sheets and provide to Chief Timekeeper, so they can be distributed to Lane Timekeepers.
  5. Produce programmes for the refereeing officials as necessary.

This order of processing ensures that the other teams working on the meet get what they need in order of priority. Recording takes the highest priority, followed by marshalling. Marshalling needs to have heat swimmers organised 5-10 minutes ahead of their actual heat, so they need their information before other officials. After that, the lane timekeepers need their paperwork so they can record whether or not there was a swimmer in their lane and any changes to the expected swimmer’s identity. Finally, the referees need programmes to know who they have in different lanes. They have the lowest priority, however, as if they need to they can work simply from heat number and lane number, referring to recording to find out the identity of the infracting swimmer.

By following this start of day process, even when there are technical delays, I can help ensure the meet can get underway on time. 

Throughout the meet, I ensure that any recording problems are quickly resolved. 

Each afternoon at the end of the meet I did the following:

  1. Get a copy of the backup from the main recording computer.
  2. Produce a report of all the day’s results with splits to be sent to the PPMG Administration and MSQ for posting on their respective websites. 
  3. Export interim results for upload to the MSA Results Portal.
  4. Action updates and changes known for subsequent days.

Relays

The other big task for me in my role as Chief Recorder is overseeing the organisation of relay teams. Normally this has been done entirely on the day at the PPMG. This year PPMG Administration allowed entrants to nominate and pay for relay entries when they entered the PPMG. This presented some challenges.

The MSQ Entry Manager system previously only tracked the overall cost and overall payment of an entrant’s entry to the entire swimming meet. This would not easily allow us to track relay nomination payments.

I had to make some decisions about system design and business rules to enable tracking of these nominations and payments:

  • Nominating for a relay event does not automatically put you in a relay team. 
  • If you’re a member of a club, that club can see your nomination to know that you want to be in that relay event.
  • If your club put you in a relay team in an event you’d nominated for, your nomination payment would be applied to your position in the team. The club would only need to pay the remainder for those members who had not already paid.
  • Your club may choose not to put you in their relay team for the event you nominated in. In this case you may be a member of an unattached team, and your nomination payment would be applied to your position in the unattached team.
  • If you’re not a member of a club, you can nominate on the day to be a member of a team, pay the nomination fee and we would attempt to put you in a random team.
  • Anyone can register a team of four people and pay the nomination fee for those members of the team who had not already nominated and paid online.
  • If you had nominated for a relay event, but not been in a team for that event, your nomination fee could be applied to your position in a team in another event. 

I upgraded the MSQ Entry Manager system to track the cost of event nominations and payments for those nominations. I created an interface to track those payments. I had planned to also allow new nominations and payments to be recorded, but this was not completed in the end due to time constraints and competing priorities. 

An existing interface from previous MSQ meets was used to show the cost of each relay team, and the payments made online for those relay entries. Now that the meet has been completed, I will be exporting these details to Excel spreadsheets so that the total amount owed by clubs for relay entries can be calculated and invoiced via PayPal.

Non-club relay team payments on the day were noted in a receipt book for future reconciliation. It would have been good to have this handled in the MSQ Entry Manager system, but again due to time constraints this wasn’t possible. 

In future events I’ll have this interface prepared and volunteers trained in advance to operate the relay tasks. 

The other part of relay nominations at PPMG meets is actually getting the team information into the Meet Manager system. Relay nominations can be entered directly into Meet Manager, but this is not a user friendly process and requires a second computer linked to the live Meet Manager recording computer.

At my first PPMG, I spent many hours entering paper relay team forms into Meet Manager. This process was laborious and difficult. Some people’s writing was unreadable. Forms were not completely filled out. Entrant names could not be found in the entrant list, or entrants had been entered into more than one relay team in the event. After this debacle, I built a new jQuery-based relay entry system for PPMG16.

At PPMG16 the new system meant that the volunteers at the Relay Desk directly entered entries into Entry Manager’s Relay Entries module. It prevented people from being in more than one team, and allowed search and selection of relay team members from the competitor list. It enforced relay team rules; for instance, club relays were only able to have members from that club, whereas unattached relays could have any entrant in them. The system was very successful at that meet and cut relay entry workload considerably. In the end it proved to be easier for the Relay Desk volunteers to take a paper form and then enter it into the computer later than to process it in the computer at the time of presentation. However other rules I enforced, such as fully filling out relay forms before they could be accepted and requiring relay team contact phone numbers, meant that the desk was easily able to get all relay teams organised with limited involvement by me.

The new MSQ Entry Manager Club Relay Teams module

Once relay teams were created in MSQ Entry Manager, they were able to be downloaded as a hy3 file for direct import into Meet Manager. This meant no double handling of the already checked relay entry data and minimal errors. 

This time, there were fewer volunteers available for the relay desk, so on the first day of relays I needed to spend most of the morning at the relay desk. This lack of volunteers and the early relay events on Day 2 made the day a bit of a struggle. However the system still performed well. Some international masters member club data corruption issues originating in the import of PPMG entrant data did require a small amount of remediation after import into Meet Manager, but the workload was still considerably less than if we’d done it the old way.

As previously mentioned, we intended to put people who had nominated online for a relay event into random teams if they did not find their own team. We did this on the first day of relay events. However many of the people we put into teams never turned up at marshalling. On the remaining days we only put people who had presented to the relay desk into teams. There were no complaints about this change and it meant fewer stranded relay team members.

The club data corruption also seemed to cause some problems with the scoreboard when relays were imported into Meet Manager. Entries are usually imported into Meet Manager using a hy3 file. Checking the hy3 files showed no differences between a hy3 file exported by Hy-Tek Team Manager and a hy3 file exported from MSQ Entry Manager. Yet after importing relays the scoreboard’s country field showed the club name, instead of country of origin. The issue had not appeared when the same system was used for PPMG16 and the MSA National Championships in 2017 at the same venue. Further analysis and testing will be required to remediate the problem for future events. 

This year I developed and deployed a new online relay entry module for Masters Swimming clubs to use when registering their relay teams for the PPMG event. Instead of having to go to the relay desk with forms, Masters clubs were able to register a club captain who was then able to use an online interface to register their teams. The module was built using a frontend based on Bootstrap 4. As this had to be implemented in our legacy Joomla CMS, the functionality was built using jQuery. Implementing advanced functionality such as two-way data binding was more difficult in jQuery, but ultimately it was possible to provide a very modern, accessible and easy to use user experience. Over half the relay teams in the meet were registered via the tool and feedback from clubs was very positive.

I will be reimplementing the new relay system in Angular as part of the new MSQ Quick Entry system under development for future meets. This will allow us to retire the old Joomla CMS based entry system and give me the ability to implement new functionality more easily.

Other Recording Functions

Another function I provide during swim meets is the delivery of statistics and meet information to the announcer. Records broken are provided where possible to the announcer to inform the competitors and spectators. This is secondary to my role of ensuring the meet recording runs smoothly. In this particular meet, due to various time constraints and lack of volunteers, I was only able to provide limited updates to the announcer. In future I’d like to organise a dedicated person in the recording team to provide such information to the announcer, PPMG Administration and media as applicable. This would mean that these functions continue even if I’m busy troubleshooting other higher priority issues. 

This meet was the second major event where MSQ has included Multi-class competition. Competitors with disabilities are able to compete in the same heats and events as able-bodied athletes and are scored in their own age group categories. This is something quite new for Masters Swimming in Australia and we still lack sophistication in this area. By and large the multi-class part of the event functioned well, but there were issues in registration and results publishing. Primarily these relate to us just not having a comprehensive understanding of how Meet Manager handles multi-class results, and not yet having a fully developed set of procedures. Through the lessons learnt out of PPMG18, I intend to develop a full set of procedures to be adopted at state and club levels, which will make our operation of future multi-class events easier and trouble-free. 

I’ve made contact with Victorian clubs who are also involved with multi-class and intend to use those connections to work towards an effective nation-wide approach for multi-class recording in Masters Swimming.

In Conclusion

Since the end of the event I’ve received a lot of praise for the way the swimming event was run at the Pan Pacific Masters Games 2018. This was a major team effort with huge contributions from Meet Director Shane Knight, MSQ Administrator Christina Scolaro, Susanne Milenkevich, Martin Banks and many, many others. I’d especially like to thank Liala Davighi for her help with relays. 

Over coming months I’m planning to consolidate the lessons learned and start building our systems for the next large MSQ events, starting with State Championships in 2019 and the Great Barrier Reef Masters Games. I hope to build an ongoing team in the recording space to ensure we can have world class data systems that allow MSQ to lead innovation in community sports events. 

Not many people actually realise all the work that goes into running a major swimming meet. There were months of work in the lead-up, and there are still weeks’ worth of work left for me. I still have to provide official results to international Masters Swimming governing bodies and finalise relay reconciliation information to provide to our finance auditors. That’s at least a couple more weeks of work in evenings and weekends outside my full-time job and family responsibilities. Hopefully this helps people understand what goes into running such an event.

 

Getting Started with Google AdSense

Here are the companion resource links to go with my presentation on Network Live Virtually (30/05/2017) on earning passive income via Google AdSense.

I also mentioned there are better tools you can use for Keyword Research if you want to go to the next level. Here’s the tool I was referring to:

  • Long Tail Pro (affiliate link) – I have used a trial version and it seemed pretty good. I don’t use it right now as I’m not actively building new sites.

I’ll link to the presentation replay once it’s posted. 

JSON Feed – The New Way to Syndicate Posts

There’s a new way to provide syndication feeds for websites. JSONFeed does basically the same thing as the traditional RSS or Atom feed, but instead of using XML it uses JSON. The big advantage of this is that the file size of the feed is much smaller and processing of the feeds can be much more efficient. 
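
For a sense of the format, a minimal feed (per the version 1 spec at jsonfeed.org; the values here are just illustrative) looks something like this:

{
    "version": "https://jsonfeed.org/version/1",
    "title": "davidfindlay.com.au",
    "home_page_url": "https://www.davidfindlay.com.au/",
    "items": [
        {
            "id": "1",
            "url": "https://www.davidfindlay.com.au/example-post/",
            "title": "Example Post",
            "content_html": "<p>Hello, world.</p>",
            "date_published": "2017-05-30T09:00:00+10:00"
        }
    ]
}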

You can enable it now on WordPress using the JSONFeed Plugin for WordPress. You can find it by searching for JSONFeed via the Plugins->Add New option in the WordPress Administration Dashboard. 

Enabling the plugin will make your feed accessible via JSONFeed. You can see an example by viewing the davidfindlay.com.au JSONFeed. I’m not aware of any feed readers yet that use this, but you can be sure they’ll come soon. It’s a bit of a chicken and egg problem. I’m always keen on being an early adopter on things like this. 

My Site Moves to Amazon Web Services

I’ve finally moved my site onto Amazon Web Services. It’s now running on a t2.micro EC2 instance in Amazon’s Asia Pacific (Sydney) region. It’s using Amazon’s Linux AMI, with Apache httpd. MySQL is served via an Amazon RDS MySQL instance. The domain is delegated to Route 53 for DNS.

So far it seems to be faster than my traditional shared hosting and that’s without even looking at any particular optimisations yet. I’m going to try to get some metrics soon to prove it. I also plan to transition all my other sites across to this type of hosting. 

The only thing still running on the old hosting is email. Amazon still has a particular gap here. I could run my own email server, but I’d prefer not to. I’m going to look into some options though. 

How I Made $1000 From Google AdSense on My Authority Site

AdSense mobile app – $1000 lifetime earnings

On Thursday 5th of January 2017, I hit the $1000 mark on Lifetime earnings on Google AdSense. All but about $30 of that was earned in the immediately preceding 12 months. This is a tale of how you can have some minor success with time and effort. 

You can see a presentation on this in the presentation section at the bottom of the page.

I started my site, Digital TV Help, in February 2014 on the topic of do-it-yourself TV antenna work and TV setup. It was something I had previously done as a self-employed technician.

Back then I had basically zero traffic and zero income from the site. In the month of December 2016, according to Google Analytics, I had 9775 sessions with 8775 users. Last month I had estimated earnings of $102.42 Australian, and just under a week later I reached $1000.11 lifetime estimated earnings.

How I got there

I started the site after discovering Pat Flynn’s Smart Passive Income podcast in December 2013. The site was created in January 2014.

Initially my site was very unfocused and I had the idea of educating both the DIYer and people entering the industry. I didn’t have a very good design, just a typical blog design, and I only had a few articles of poor quality. I didn’t have that many pictures either, only the ones I’d taken while running my business.

In August 2014 I went to Pat Flynn and Chris Ducker’s 1 Day Business Breakthrough seminar at Problogger on the Gold Coast. The feedback I received there helped me to change to a narrower focus just on the do-it-yourselfer audience. Unfortunately it took me a long time to implement the things suggested.

By December 2014 I had written 65 posts, ranging from articles about how things worked, to tip-of-the-day articles and only a few how-to articles. I’ve always taken a publish first, improve later approach. In December 2014 I had 494 sessions and 437 users, with $2.19 of estimated earnings. In the whole year of 2014 I had estimated earnings of $11.18.

A bit demotivated and busy with full-time work, study, the birth of our first child and other commitments, I didn’t do much over the following year. After March 2015 I didn’t do anything to my site until March 2016. 

Monthly Sessions on Google Analytics

In January 2016 I discovered my site had received a huge increase of users and was earning me a lot more money suddenly. In that month I had estimated earnings of $7.56 from 4506 sessions and 4008 users. From January 2014 to December 2015 my estimated earnings were $33.70. 

Between January and March 2016 I made some changes to my AdSense advertising based on research on various SEO sites. After checking my stats and doing some experimentation I determined that none of my AdSense income came from header or sidebar ads. All of it came from the ads in my articles. I eventually discovered the best income came from 2 or 3 horizontal banners in my articles: one after the second paragraph, one before the last paragraph and maybe one in the middle.

Earnings steadily increased over the following months until, between March and July, I earned between $70 and $85 each month. I started receiving a bank transfer from Google every second month. Regaining motivation in March, I started reviewing and improving some of my content and adding some new stuff. I now have 88 posts and about 6 videos. I still need to do a lot of work. (See SPI: 200 on Content Audits)

Eventually I started getting $100 or more a month in estimated earnings. From July to October I got a regular monthly payment of just over $100 from Google. A slight drop in earnings in November meant I didn’t get a December cheque. Since then the income seems to have recovered.

Month by Month Google AdSense Estimated Earnings

On January 5th 2017 I hit the $1000 lifetime estimated earnings mark. Finally! 

What Made the Difference?

The vast majority of my income has come from one post, aimed at do-it-yourselfers. Over time it has just become my number one landing page, with its acquisitions coming from Google. It seems I had just done a really good job of making that article. Over the whole time since I wrote it, the article has gradually increased in Google Search hits and income.

Over the lifetime of the site, it had these stats:

  • 40655 pageviews (24.91% of site pageviews)
  • $215.71 Australian (30.19% of site income)

It is titled “DIY Antenna Alignment Methods for Digital TV”. This title immediately explains its usefulness to people as does the blurb. It matches pretty well to a search of “How to align a Digital TV Antenna” and the alternate searches people might use for that topic. It contains 756 words and is quite comprehensive on the topic. It has pictures, multiple headings and an embedded YouTube video. 

I believe it’s simply that as more Google search traffic clicked that post, the higher its PageRank rose. The quality of the article got it into the search results initially, but clicks brought higher PageRank and more clicks.

One thing I did do to the article in December 2014 was to add a strong first sentence. A member of the mastermind group I meet with pointed out that the blurb that appeared for the article on Google Search was “In a previous article blah blah blah…” It didn’t say anything about this article. I changed it to add a new first paragraph (which is still there). It now says exactly what the user will get from the article. I think it helped increase clicks, which, as I say, bring more clicks.

Other articles on the site do get hits and income. The 2nd best article on the site is another how-to article. However it gets only half the landings and income of the best article. The rest of the landings are spread across the site.

I don’t get many landings on my front page (only 6%). Almost all my traffic comes from organic Google Search. I don’t do much social media promotion and no paid promotion.

How to do it yourself

These are my recommendations. I’m not an expert in SEO or writing. I’ve just followed what I’ve read from experts I follow and done some experiments. I certainly haven’t tried everything I could have tried and still have a lot of work to do to increase the quality of my site and increase my income. 

This is what I think works for me: 

  • Use the official Google AdSense WordPress plugin – it automatically makes ads responsive for mobile
  • Use only 2 or 3 horizontal banner ads, set between paragraphs of your posts
  • Allow graphics or text
  • Write lots of deep, quality content, including embedded video and photos
  • Write for people not search engines
  • Write useful articles, like do it yourself or how to content
  • Strong first sentence and paragraph – tell the user what they’ll get from your article – e.g. “Here are four methods to do blah blah blah….”
  • Wait

I think I could have reached where I am more quickly had I written a lot more high quality do-it-yourself how-to articles earlier. More videos and social media promotion may also have got me here quicker.

My aim now is to follow a process of reviewing and improving my existing content. I’m looking to significantly increase the number of high quality how to articles and embedded videos I have. Hopefully this will get me to a level of income where it can become my day job. 

Presentations

I’ve recently presented my story and Google AdSense advice on Network Live Virtually with Sabrina Watson. You can see a replay here:

http://networklivevirtually.com/how-i-made-1k-with-google-adsense/

Resources

Sites with useful authority site information:

Here’s what I’ve used:

Against Anti-Doping

It’s time to end anti-doping efforts. They have failed. More than that, they are immoral.

Anti-doping regulations apply to all levels of sport, from junior, to elite professional, to masters. Yet we only test a small group of our elite athletes. Elite sport is only a tiny proportion of the actual sporting community. Who is checking that a junior soccer player isn’t being given a performance enhancing substance? Who is checking that a local B grade rugby league player isn’t taking performance enhancing substances?

I am an administrator of swimming at the masters level. We are an amateur sporting organisation, providing age group competition from ages 18 to 100 and higher. We are also covered by anti-doping rules. Just recently Lance Armstrong was prevented from competing as a masters swimmer because of his lifetime drugs ban.

Yet it’s likely that many masters athletes are taking prescription medications from the banned list. The banned list includes many medications such as vasodilators, stimulants, asthma treatments and growth factors.

Technically they’re supposed to provide us evidence of a medical need, but rarely do we get such evidence. Athletes could be taking a performance enhancing substance for a medical reason or not, but we have no way of knowing.

Why should we anyway? The medical conditions of amateur sports people should be their own business, nothing to do with us.

Advancing medical technology is one of the biggest arguments against our current anti-doping efforts. Several billion dollar companies are now working on anti-aging technologies and early indications are promising. In the past year Google has started a subsidiary named Calico Labs with the sole purpose of extending the human life span.

Already several medications are in testing that slow the effects of aging. By definition these will be performance enhancing, allowing athletes to continue to perform like a young person as they age.

For instance, currently there is research going on into anti-aging effects that may be attributed to the diabetes drug Metformin. Belgian researchers found that mice treated with it lived 40% longer. The studies showed reduction in age related illness.(http://www.telegraph.co.uk/science/2016/03/12/worlds-first-anti-ageing-drug-could-see-humans-live-to-120/)

How could we possibly deny our athletes access to this technology? Are we going to say that if you want to be an athlete, sorry you have to age naturally? It would be absurd to even ask athletes to deny themselves access to anti-aging treatments that the rest of the community can access.

How is this any different to athletes using performance enhancing drugs available now?

One of the arguments often cited is the dangers to the athlete, particularly in relation to drugs that are not highly tested or administered in ways they’re not listed for. This is a case where the prohibition creates additional danger. If such drugs were not being administered secretively, proper clinical research could be undertaken and everyone would benefit.

It’s time to end this expensive, failing and misguided war on doping. We can replace it with scientific advancement of performance and anti-aging science that can benefit everyone.

 

Why Australians Don’t Use Pay at Pump

Lots of Service Stations in Australia now have Pay at Pump systems. I can’t say I’ve ever seen anyone use them though. Yesterday I decided to give it a go at my local Caltex Woolworths Petrol station, and I found out why no one uses them.

Pay at Pump is just too hard

Firstly you have to pay before pumping. I know this is the norm in some places, but generally in Australia we don’t think about paying until after we’ve already filled up. By that time you are locked into going inside. If you can pay after pumping by going inside, why can’t you pay after pumping outside?

As if pre-paying isn’t bad enough, they pre-authorise your card for a payment of $100. It warns you it can take up to 24 hours to release the unused funds. I usually fill my tank for about $50-60. Why should I have an extra $40-50 locked up for 24 hours just for a little convenience?

Usability is another major problem. I had to go through 5 screens, pressing buttons to select options before I could actually start pumping petrol. This took a couple of minutes. After finishing pumping I again had to go through a few screens with options before I finally got confirmation I had completed and paid.

I scanned my Woolworths Rewards card, although it took about 10 tries with the reader to make it work. You’d think you’d just present it flat to the camera, but no, that didn’t work. I had to try a number of different angles to finally get it to work. That was a waste of time anyway, because it didn’t apply the discount offer I had on the card.

Overall it was just a bad experience and I don’t intend to ever use it again. It seemed like a good idea: as the parent of a young baby, it’d be easier not to have to get her out of the car to go in to pay. After having used it, I think it’s easier and more convenient to go inside. I have to wonder if it’s a conspiracy to try to get people to buy stuff inside the store.

I don’t know why they’re even bothering with Pay at Pump if they’re going to provide such a bad customer experience. It should just be a matter of pumping your petrol, swiping your Woolworths Rewards card, tapping your bank card and off you go. Ten seconds max. Not ten screens and two and a half minutes of button pressing either side of the pumping.

 

Trade a Park No One Uses for a Useful Carpark

The Woody Point Village has rapidly become more and more popular in recent years. On any given weekend you will find that all the on and off street parking is completely full. With more high-rise developments, shops and cafes planned for the future, this problem is only going to get worse.

Google Earth image showing the proposed carpark location

Just behind the Woody Point Village shops and the Belvedere hotel there is a small park called Lions Park. I have never once seen a person in that park. It’s really just a space for an open air storm water drain to flow through.

Instead of being wasted space, this space could be used for a multi-level carpark. Two covered levels and 1 open air level would provide a huge boost to the parking capacity of the area. I know carparks aren’t popular, but they are a necessary evil. The Lions Park location is close enough to Woody Point Village to significantly help the area out, with no significant loss of amenity.