
The Man Who Sold the Web Blog | Category Archive | Autoscale



My venture into the world of MLM. Isagenix.

26 Nov

Okay, I haven’t tended to this blog in a while.  So, why not re-ignite things with a new live case study?

This will be a completely new venture for me. I am going to give Multi-level Marketing (MLM) a try. This is my first time playing in the MLM space. For those unfamiliar with the concept, give the Wikipedia article a read: http://en.wikipedia.org/wiki/Multi-level_marketing. Many large, successful companies use this business model; the best-known examples include Avon and Tupperware.

The specific program I decided to join is called Isagenix. Isagenix is an ecommerce site that sells health supplements, specifically supplements promoting body cleansing, anti-aging, and weight loss. Here is their official slogan: “Isagenix: World Leader in Whole-Body Nutritional Cleansing, Cellular Replenishing & Youthful Aging.”

So, why Isagenix?

There are 4 primary reasons why I decided to join Isagenix. […]

Here’s a great data source for food and nutrition facts.

24 Aug

I stumbled across this food nutrient data set earlier this week: http://ashleyw.co.uk/project/food-nutrient-database. The author, Ashley, took food nutrient data from the US Department of Agriculture and made some sense of it.

It’s available as a free download on his site. The data set comes as a single JSON file, so it may take a bit of work to parse.

In this data set, there are…

  • 6,636 unique foods
  • 375,176 nutrient figures
  • 94 unique nutrients across all foods

What you can do with this data is easily create a food nutrients site of 6,000-7,000 pages. It would act as a great, value-added companion to a health site (e.g. a food blog, fitness blog, or recipe site). […]
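To give a rough sense of how you might work with the download, here is a minimal Python sketch that loads the JSON file and groups foods into potential pages by food group. The field names used below (description, group, nutrients) are assumptions about the file's structure, so check them against the actual download.

```python
import json
from collections import defaultdict

# Load the food nutrient data set (a single JSON array of food records).
# Field names ("description", "group", "nutrients") are assumptions based on
# the USDA-derived structure; verify against the actual file before using.
with open("foods.json", "r") as f:
    foods = json.load(f)

pages_by_group = defaultdict(list)
for food in foods:
    name = food.get("description", "Unknown food")
    group = food.get("group", "Uncategorized")
    nutrients = food.get("nutrients", [])
    # One potential page per food: its name plus a nutrient table.
    pages_by_group[group].append({
        "title": name,
        "nutrient_count": len(nutrients),
    })

print(f"{len(foods)} foods across {len(pages_by_group)} food groups")
```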

Here’s a live case study with some eye candy. Case of the supermodels.

4 Aug

About a week ago, I stumbled across what may very well be the holy grail of search keywords.  These are keywords that, according to Google AdWords Keyword Tool, have the lowest possible competition (i.e. competition score of 0) and receive tens, if not hundreds, of thousands of local monthly searches.

I am referring to this as the holy grail because we’re not talking about a small handful of search keywords. I’m not talking 5, nor 10, nor even 50 search terms. I’m referring to hundreds upon hundreds of these keywords that you can dig up in a matter of 30 minutes.

Now, don’t worry.  This is not an eBook sales page and I’m not selling you anything.  You won’t need to spend $17 to discover what this big “secret” is all about.

So what is this holy grail? […]

Here’s a tip. Create a 40,000 page site instantly with a zip code dataset.

9 Jul

Not too long ago, I introduced the idea of creating a megasite using publicly available datasets.  In this article, we’ll discuss how we can use US zip codes to create a 40,000 page megasite instantly.

(You can pick up your copy of the zip codes data set here: http://federalgovernmentzipcodes.us/.)

So, how does it work?  […]
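To make the idea concrete, here is a minimal Python sketch that reads a zip code file and turns each row into a page definition. The column names used below (Zipcode, City, State) are assumptions; adjust them to whatever headers the file you download actually uses.

```python
import csv

# Read a zip code data set and emit one page definition per zip code.
# Column names ("Zipcode", "City", "State") are assumptions; match them
# to the headers in the file you actually download.
pages = []
with open("zipcodes.csv", newline="") as f:
    for row in csv.DictReader(f):
        zip_code = row["Zipcode"]
        city = row["City"].title()
        state = row["State"]
        pages.append({
            "slug": f"/{state.lower()}/{city.lower().replace(' ', '-')}/{zip_code}/",
            "title": f"{city}, {state} {zip_code}",
        })

print(f"Generated {len(pages)} page definitions")
```

With roughly 40,000 zip codes in the file, that loop alone sketches out a page for every one of them.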

APIs, APIs, and more APIs.

8 Jun

I get a lot of emails these days.  One recurring area of interest is the subject of APIs.

I first introduced the concept in my blog article on autoscaling with APIs, where I listed the big names out there. Then, after rummaging around online, I wrote about Freebase and DBpedia, and both my live SEO case study and 300K in a Box solution speak to the power of Indeed’s API.

But, do you want more APIs? […]

Fresh out of the box – 300K Page Job Search In-a-Box!

20 May

I am very excited to announce the release of 300K Page Job Search In-a-Box. This is a turnkey solution based on the Case of 300,000 and Counting.

Just like WordPress, all you need to do is download, upload, and configure. The difference: with WordPress, out of the box, you have maybe ~10 pages and will need to keep adding content yourself. With my solution, out of the box, you have 300,000 pages, and the site autoscales with fresh content on its own. In other words, you have an autoscaling, autopilot megasite. Big difference!

To my blog readers, here’s a link with a 25% discount.  This coupon will also expire after a set number of uses.

25% Discount for 300K Page Job Search In-a-Box

Questions?  Just ask.

dave

Power your site’s information with this hidden gem, Freebase.

15 May

So, I’ve written an article about Wikipedia, because it’s a great source of information. I’ve written about APIs, because they’re a way to autoscale.

What if you could combine those two concepts, or, even better for the database-savvy folks, query the data much like you would with SQL? After some mindful web surfing, I came across quite a powerful source of data. It’s called Freebase (formerly known as Metaweb). Check out this video first.

[…]

21K in 21 Days (my latest free eBook).

8 May

Last week, I took the results from the first 21 days of my live SEO case study, analyzed them, and compiled everything (data + analysis) into a book, 21K in 21 Days. This is another free guide, which you can download here.

Here is some of what you will find in the book:

  • Day-by-day update log — so you can see indexing growth rates and plateaus
  • Detailed analytics — screenshots taken from Google Analytics
  • Key insights
  • Site architecture/design
  • SEO strategy

I think many people may find the “Key insights” section the most interesting part of the book, because I highlight insights that counter conventional “wisdom” about Internet Marketing. So, I’ve reproduced most of that section below (for those too lazy to download the book). […]

Learn to autoscale with the best source of information, Wikipedia.

7 May

Wikipedia is such a great source of information. Though originally controversial as a source of legitimate information, it has become increasingly accepted as a reliable source. In fact, in my day job of business consulting for companies (including Fortune 50 organizations), Wikipedia is one of the first places I check when conducting research.

However, if you have a website, and would like to automate the process of pulling data from Wikipedia, it’s not a simple task.  You will need a very sophisticated scraper.  Wouldn’t it be convenient if you could just query Wikipedia like a database?

Well, it seems like you actually can… with the help of DBpedia. […]
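As a small illustration of what “querying Wikipedia like a database” can look like, here is a Python sketch that asks DBpedia’s public SPARQL endpoint for the English abstract of a single article. The article and properties chosen are just examples, not anything specific to this case study.

```python
import requests

# Query DBpedia's public SPARQL endpoint for the English abstract of a
# Wikipedia article (here, "Tupperware") -- structured Wikipedia data
# without writing a scraper.
query = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>
SELECT ?abstract WHERE {
  dbr:Tupperware dbo:abstract ?abstract .
  FILTER (lang(?abstract) = "en")
}
"""

resp = requests.get(
    "https://dbpedia.org/sparql",
    params={"query": query, "format": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Standard SPARQL JSON results: results -> bindings -> variable -> value.
for binding in resp.json()["results"]["bindings"]:
    print(binding["abstract"]["value"][:200])
```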

Live Case Study: The case of 300,000 pages and counting.

11 Apr

This is my first live case study. In this case study, I will build and grow an autoscaling, autopilot, value-add site from scratch. The purpose of this case study is to demonstrate techniques in real time. With the exception of coding the initial site (those activities are tabulated under day 0), everything is done in real time, including domain registration.

The subject of this case study will be a niche job search site, built off the Indeed API. Indeed.com is an established search engine for US-based job opportunities. It has a publisher program with an API that allows our site to pull job results. When job seekers click through to Indeed, we also get paid as a publisher. A nice added bonus, eh?

Our niche job search site will focus on clerical jobs. Within clerical jobs, we have 3 sections: one for accounting jobs, one for bookkeeping jobs, and one for auditing jobs.
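For a rough sense of what pulling jobs from the publisher API might look like, here is a Python sketch for the accounting niche. The endpoint, parameter names, and response fields below are assumptions based on Indeed’s publisher documentation and should be verified against the current docs; the publisher ID is a placeholder.

```python
import requests

# Rough sketch of querying Indeed's publisher job search API for one of
# our niches ("accounting"). Endpoint, parameters, and response fields
# are assumptions -- verify against the publisher docs before using.
PUBLISHER_ID = "YOUR_PUBLISHER_ID"  # placeholder, issued by the publisher program

params = {
    "publisher": PUBLISHER_ID,
    "q": "accounting",          # niche keyword
    "l": "Chicago, IL",         # location (one page per keyword/location pair)
    "format": "json",
    "v": "2",
    "limit": 25,
    "userip": "1.2.3.4",
    "useragent": "Mozilla/5.0",
}

resp = requests.get("http://api.indeed.com/ads/apisearch", params=params, timeout=30)
resp.raise_for_status()

for job in resp.json().get("results", []):
    print(job.get("jobtitle"), "-", job.get("company"))
```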

Now, how does the autoscale work?

First, upon launch, the site will have 300,000 pages.  Note, this does not mean all 300K pages will be indexed by Google.  It only means a Googlebot will find 300K unique pages across our site.  Here’s how I came up with that estimate. […]


© Copyright 2011-2024.   TheManWhoSoldtheWeb.com