
Simple C# Delegate Example

Simple example of passing a delegate to a method to extend the functionality of the receiving method. This allows us to reuse the code in the receiving method, since it can be combined with the method passed via the delegate to produce new functionality.

using System;
using System.Collections.Generic;
using System.Text;

// Delegates are used here to give a method containing common functionality
// a way of calling a method passed in (via a delegate) to extend that
// functionality. This is a form of code reuse: the common method is
// extended by delegates that supply specialist behaviour.


    public class UserClass
    {
        // Method that takes a delegate as a parameter and uses it to complete its work
        public void ReturnText(ProcessDelegate newDelegate)
        {
            string stubString = "Stub ";

            string getDelegateResponse = newDelegate();

            stubString += getDelegateResponse;

            Console.WriteLine( stubString );
        }
    }

    public delegate string ProcessDelegate();

    public class MainClass
    {
        // Instance method to be passed to delegate
        string AppendText()
        {
            Random random = new Random();

            string txt = "";

            txt += random.Next(0, 100000).ToString();

            return txt;
        }

        // Static method to be passed to delegate
        static string StaticAppendText()
        {
            Random random = new Random();

            string txt = "";

            txt += random.Next(0, 100000).ToString();

            txt += (" (Static)");

            return txt;
        }

        static void Main(string[] args)
        {
            // Create an instance of the class which contains a method that uses a delegate
            UserClass newUserClass = new UserClass();

            // Call the user class's method, passing a method of an instance object
            MainClass newMainClass = new MainClass();
            newUserClass.ReturnText(new ProcessDelegate(newMainClass.AppendText));

            // Call the user class's method passing a static method
            newUserClass.ReturnText(new ProcessDelegate(StaticAppendText));

            Console.ReadLine();
        }
    }
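As an aside, the same call can be written more compactly with anonymous methods (C# 2.0) or lambdas (C# 3.0), so no explicit `new ProcessDelegate(...)` is needed. A minimal sketch, repeating the types above so it compiles on its own:

```csharp
using System;

public delegate string ProcessDelegate();

public class UserClass
{
    // Same method as above: prefixes "Stub " to whatever the delegate returns
    public void ReturnText(ProcessDelegate newDelegate)
    {
        Console.WriteLine("Stub " + newDelegate());
    }
}

public class LambdaDemo
{
    static void Main()
    {
        UserClass newUserClass = new UserClass();

        // C# 2.0 anonymous method
        newUserClass.ReturnText(delegate { return "from an anonymous method"; });

        // C# 3.0 lambda
        newUserClass.ReturnText(() => "from a lambda");
    }
}
```

Both forms rely on the compiler's implicit conversion of an anonymous function to the delegate type, so they work with `ReturnText` exactly as written above.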

May 5, 2008 at 4:23 pm Leave a comment

Simple JavaScript Image Preloader

Add this code at the bottom of the page, calling the function once for each image you wish to be preloaded.

function ImagePreloader( imgSrc )
{
    var preloadedImage = new Image();

    preloadedImage.src = imgSrc;
}
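A usage sketch: call the function once per image at the bottom of the page, keeping a reference to each preloaded image. The image paths below are made-up examples, and the `Image` fallback exists only so the sketch also runs outside a browser:

```javascript
// Fallback so the sketch can run outside a browser; in a real page the
// built-in Image constructor is used (this shim is an assumption, not
// part of the original post).
if (typeof Image === "undefined") {
    var Image = function () {};
}

// Preloader from the post above, returning the image so a reference is kept
function ImagePreloader(imgSrc) {
    var preloadedImage = new Image();
    preloadedImage.src = imgSrc;
    return preloadedImage;
}

// Preload each image the page will need later (example paths)
var preloaded = [];
var sources = ["images/logo.png", "images/banner.png", "images/footer.png"];
for (var i = 0; i < sources.length; i++) {
    preloaded.push(ImagePreloader(sources[i]));
}
```

Keeping the returned images in an array stops some browsers from garbage-collecting them before they are ever displayed.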


May 5, 2008 at 4:18 pm Leave a comment

Search Engine Optimization Easy Guide

This guide will see you through the search engine optimization (SEO) process from code to upload, with a no-nonsense explanation of each technique, and direct links to the resources you need.

First off, here are two myths that need to be dispelled:

1st Myth: There are ‘tricks’ that can get you to the top of major search engine rankings.

Whilst I’m going to show you how to optimise your site to give you the best possible chance of appearing high up in the search rankings, you need to be aware that Google and Yahoo have been successful by making an art form out of suppressing spam techniques, thereby producing the most relevant results for their users.

In short, they are very practiced at spotting deliberate attempts to deceive the web crawlers, and may even penalise your site if it severely breaks the rules.

2nd Myth: You need to pay an ‘expert’ to help you achieve the highest possible placement in the search engine rankings.

Of course there are plenty of best practices to follow, and a good understanding of how the search engines work will be an asset to you, but this is no harder than getting a decent grasp of usability best practices, or of how to design an accessible site. Follow this guide and you’ll be covering pretty much everything an SEO expert would suggest.

Rule Numero Uno

After all is said and done, the most important thing you can do to boost your site’s rankings is to get other search-engine-indexed web pages to link to your site.

Do not, however, use special services that offer to link to your site from loads of other websites. They’re scams at the end of the day, and the search engines are wise to them. You might even hurt your rankings by using these services.

Now that’s out of the way, let’s fine-tune your site, because every little helps.

Code to Include In Your Web Pages

Before we run off scouring the web for bit part search engines to suggest our lovely new site to, let’s get our web pages sorted first. We’re going to make them search engine friendly.

1. Start with the file name, and make it meaningful. Rather than call your page ‘cogSEO2.html’, use proper words separated by hyphens or underscores, and try something like ‘cognize_seo_optimization_guide.html’.

2. Give every page a meaningful <title> tag. Put your site name in by all means, but add a little information after it, up to about 70 characters.

3. Add in relevant <meta> information. Two very common meta tags are the ones that provide the description of the page and keywords.

Whilst you may think that these two tags might be just the thing to send you shooting to the top of Google, the historic abuse of keywords has limited the attention that the search engine bots will pay to them. If you do use the keywords meta tag, be careful not to spam; this may be considered bad practice by the search bots and harm your ranking.

The description meta tag is also of limited use for ranking purposes. What it will do, however, is dictate the way that your page is described in the search engine results. If this suits you, then you may want to use this meta tag.

<meta name="Description" content="Cognize - a web developers resource.">
<meta name="Keywords" content="html, css, javascript, web design, article,
tutorials, resource, programs, internet, web, cognize, coding, code, script,
form, validate, w3c, w3, mozilla, mdc, lint, debugger, search">

Other meta tags you can use are shown below. Meta tags are not essential, however; the search engines will still make sense of your site without them. Use them at your own discretion.

<meta name="author" content="me">
<meta name="subject" content="Web Development, Design, Consulting">
<meta name="Classification" content="Cognize - a web developers resource.">
<meta name="Geography" content="Your Full Address">
<meta name="Language" content="English">
<meta http-equiv="Expires" content="never">
<meta name="Copyright" content="Cognize.co.uk">
<meta name="Designer" content="me">
<meta name="Publisher" content="http://Cognize.co.uk">
<meta name="Revisit-After" content="1">
<meta name="distribution" content="Global">
<meta name="city" content="My City">
<meta name="country" content="England, UK, United Kingdom, Britain, Great Britain">

A good automatic meta tag generator can be found at SEO Tools.

4. Create a site-map.html page and link to it from every page.

You should create and maintain a file named site-map.html, place it in the root directory, and link to it from every page on your site. Make the links to the page, and the links on it, basic <a> tags to ensure that the bots do not have any problems accessing the file. This is a good way of ensuring that the whole of your site is crawlable.
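As a sketch, a site-map.html page needs nothing more than plain anchor tags; something along these lines (the page names are made-up examples):

```html
<!-- site-map.html: plain <a> links only, so any bot can follow them -->
<html>
<head><title>Site Map</title></head>
<body>
<h1>Site Map</h1>
<ul>
<li><a href="index.html">Home</a></li>
<li><a href="resources.html">Resources</a></li>
<li><a href="articles.html">Articles</a></li>
<li><a href="contact.html">Contact</a></li>
</ul>
</body>
</html>
```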

5. Follow Usability and Accessibility best practices, they also help search engines.

These are basic rules really: all the little things that make your website more accessible to special browsers, such as using the alt attribute on images, or using plain HTML links rather than dynamically created ones.

Providing HTML alternatives where other technologies are used all helps the search bots crawl your site more easily. If you are able to check your website in Lynx, that may give you a good idea of how a search bot sees your site. If you use images, make the alt text descriptive, using around seven words if you can.

6. Add multiple basic html links to the main areas of your site on all pages.

7. Make your page content rich. Some search engines test the ratio of code to text on your page: the more text, the more content-rich the page is considered to be, thus improving your ranking. Write pages that clearly and accurately describe your content.

8. When adding text and content to your page, think about keywords. Some of the SEO tools suggested later will help you to work out which are the best keywords. The keywords in the meta data are often ignored, but those in your content will not be.

9. Check the page for broken links. If the links to a page are broken, a search bot might not be able to reach it. The W3C link checker can check online sites automatically.
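The code-to-text ratio mentioned in point 7 can be estimated with a few lines of script. This is a rough sketch, not how any search engine actually measures it, and the regex-based tag stripping is a simplification:

```javascript
// Rough estimate of a page's text-to-markup ratio: strip the tags and
// compare the length of the remaining text to the length of the full source.
function textRatio(html) {
    var text = html.replace(/<[^>]*>/g, "");
    return text.length / html.length;
}
```

A page made almost entirely of markup scores near 0; a text-heavy page scores closer to 1.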

In your web space

There are two main types of file you can include in your web space that can help to guide search bots through your site.

1. Add a ‘sitemap.xml’ file.

This is a site map in a special recognised format (XML) especially for search engines. It should be placed at the highest level in your website (i.e. in your root directory).

Here is an example:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.cognize.co.uk/</loc>
<changefreq>daily</changefreq>
<priority>0.7</priority>
</url>
<url>
<loc>http://www.cognize.co.uk/resources.html</loc>
<changefreq>daily</changefreq>
<priority>0.5</priority>
</url>
</urlset>

You can copy this code exactly, change the URLs in the <loc> tags and adjust the <changefreq> and <priority> values, and this will be enough to help the search bot check all of the pages you specify in your site, even if it struggles with the links for some reason.

As a quick guide, <loc> refers to the URL of the page.

<changefreq> tells the search bot how often the page changes. Example options are ‘daily’ ‘weekly’ or ‘monthly’.

<priority> tells the search bot how important this page is in the context of your site. Values range from 0.0 to 1.0. Putting 1 for every page will not increase the ranking of your site; the value is only relative to the other pages on your own site.

There are other tags and options you can use, although they are beyond the scope of this guide. For complete information on site maps, visit Google Site Maps Protocol.

If you want a short cut, a nice little xml sitemap generator can be found here xml-sitemaps.com.

2. Add in a file named robots.txt

A robots.txt file placed in your root directory will give a search bot a little more info. If you would like the bot not to crawl certain pages, maybe because they aren’t particularly relevant to incoming searches, then this is the way to do it.

Again, a full blown explanation of exactly how these files work is beyond the scope of this guide, but full documentation can be found on Google’s robots.txt guide

For now, here is an example of a robots.txt file that you can place in your web space. It’s very simple: the following is the contents of a (very short) complete file asking the bot not to check the directory named ‘boring’ because, well, it’s boring!

User-Agent: *
Disallow: /boring/

Putting it Out There

To make things easy for you, here is a list of the main search engines that you will want to submit your site to. You may have to satisfy certain conditions before they will index your pages. For example, you will need to sign up for a Google account if you want to use their stats system to check how your site is doing.

Please note that even with submission, it can take the search engines a little while to get around to your site; we’re talking weeks. If, however, you’ve followed the rules so far, you should find it’s probably quicker than that, especially if you can get links to your site into web pages that are already indexed.

Google https://www.google.com/webmasters/tools/siteoverview

Yahoo https://siteexplorer.search.yahoo.com/submit

MSN http://search.msn.com.sg/docs/submit.aspx

LIVE http://search.live.com/docs/submit.aspx

Open Director Project http://www.dmoz.org/

AltaVista uses the Yahoo engine, so there is no need to submit to this one.

Ask Jeeves is included in Teoma, which you cannot manually submit to.

Checking Your Position and More Fine Tuning

Once you’ve completed the above, you can start using more tools to check how your site’s doing.

Some great SEO resources for webmasters can be found here:

SEO Chat

Self SEO

Google Web Masters

The Final Word

We started with it and so we’ll finish with it, because it can’t be emphasised enough:

Get as many incoming links to your site as you can. If you maintain other sites, have them link to one another.

It all helps!

March 26, 2008 at 3:00 pm Leave a comment

Lotto Number Generator Form Class (C# .NET)

This code demonstrates another way to generate unique random numbers, using an if statement in nested for loops to check for numbers that have already been selected and reassign them.

Not the most graceful way of going about this, but it works nonetheless. There is another random number generator class in this blog which is a bit neater.

Download Code File: LottoForm.cs

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace Lotto
{
    public partial class LottoForm : Form
    {
        public LottoForm()
        {
            InitializeComponent();
        }

        Random m_random = new Random();

        int[] selectedNumbers = new int[7];

        private void goButton_Click(object sender, EventArgs e)
        {
            // For each array member, assign a random number
            for (int numberCount = 0; numberCount < selectedNumbers.Length; numberCount++)
            {
                int newNumber = m_random.Next(1, 50);

                for (int i = 0; i < numberCount; i++)
                {
                    if (newNumber == selectedNumbers[i])
                    {
                        // MessageBox.Show("A duplicate (" + newNumber + ") has occurred");
                        newNumber = m_random.Next(1, 50); // Assign a new number
                        // MessageBox.Show("Replacing with " + newNumber);
                        i = -1; // Restart the scan so the replacement is checked against every earlier number
                    }
                }

                selectedNumbers[numberCount] = newNumber;
            }

            // Place the bonus number in the 7th space before sorting
            textBox7.Text = selectedNumbers[6].ToString();

            // Create and populate an intermediary array from which we can exclude the bonus number
            int[] nonBonusNumbersArray = new int[6];

            for (int x = 0; x < nonBonusNumbersArray.Length; x++)
            {
                nonBonusNumbersArray[x] = selectedNumbers[x];
            }

            // Sort the intermediary array
            Array.Sort(nonBonusNumbersArray);

            // Place the sorted numbers in the non-bonus text boxes
            textBox1.Text = nonBonusNumbersArray[0].ToString();
            textBox2.Text = nonBonusNumbersArray[1].ToString();
            textBox3.Text = nonBonusNumbersArray[2].ToString();
            textBox4.Text = nonBonusNumbersArray[3].ToString();
            textBox5.Text = nonBonusNumbersArray[4].ToString();
            textBox6.Text = nonBonusNumbersArray[5].ToString();

            label9.Text = "";
            label9.Text += "Your new numbers are: \n\n";
            label9.Text += nonBonusNumbersArray[0].ToString();
            label9.Text += ", ";
            label9.Text += nonBonusNumbersArray[1].ToString();
            label9.Text += ", ";
            label9.Text += nonBonusNumbersArray[2].ToString();
            label9.Text += ", ";
            label9.Text += nonBonusNumbersArray[3].ToString();
            label9.Text += ", ";
            label9.Text += nonBonusNumbersArray[4].ToString();
            label9.Text += ", ";
            label9.Text += nonBonusNumbersArray[5].ToString();
            label9.Text += " (Bonus: ";
            label9.Text += selectedNumbers[6].ToString();
            label9.Text += ")";
        }
    }
}

March 26, 2008 at 12:17 pm Leave a comment

Unique Random Number Generator Class (C# .NET)

This is a class for generating unique random numbers. The class works by removing numbers from a list once they’ve been selected, so it is impossible for them to be selected again, making it possible to simulate real-life random number selection in lottery-type scenarios.

Pass the maximum and minimum numbers that you would like returned from the class to the constructor when creating the object.

Download Code File: GenerateUniqueRandomNumbers.cs

using System;
using System.Collections.Generic;
using System.Text;

/// <summary>
/// Class for generating unique random numbers within a given range. Removes numbers from list
/// once chosen from an instance of an object so that it is impossible for them to be selected again.
/// </summary>
public class GenerateUniqueRandomNumber
{
    Random m_random = new Random();

    int m_newRandomNumber = 0;

    List<int> RemainingNumbers;

    // Constructor
    public GenerateUniqueRandomNumber(int requiredRangeLow, int requiredRangeHi)
    {
        // Get the range
        int range = (requiredRangeHi - requiredRangeLow);

        // Initialise the array that will hold the numbers within the range
        int[] rangeNumbersArr = new int[range + 1];

        // Assign array element values within the range
        for (int count = 0; count < rangeNumbersArr.Length; count++)
        {
            rangeNumbersArr[count] = requiredRangeLow + count;
        }

        // Initialise the List and populate it with the values from rangeNumbersArr
        RemainingNumbers = new List<int>();
        RemainingNumbers.AddRange(rangeNumbersArr);
    }

    /// <summary>
    /// Returns a random integer within the given range. Each call produces a new unique number.
    /// </summary>
    /// <returns></returns>
    public int NewRandomNumber()
    {
        if (RemainingNumbers.Count != 0)
        {
            // Select a random number from the list
            int index = m_random.Next(0, RemainingNumbers.Count);
            m_newRandomNumber = RemainingNumbers[index];

            // Remove the selected number from the remaining numbers list
            RemainingNumbers.RemoveAt(index);

            return m_newRandomNumber;
        }
        else
        {
            throw new System.InvalidOperationException("All numbers in the range have now been used. Cannot continue selecting random numbers from a list with no members.");
        }
    }
}
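A minimal usage sketch for the class (the six-from-49 lottery draw below is an illustrative assumption, not part of the original post). The class is repeated in compact form so the sketch compiles on its own:

```csharp
using System;
using System.Collections.Generic;

// Compact copy of the GenerateUniqueRandomNumber class above, repeated
// here only so this usage sketch compiles on its own.
public class GenerateUniqueRandomNumber
{
    Random m_random = new Random();
    List<int> RemainingNumbers = new List<int>();

    public GenerateUniqueRandomNumber(int requiredRangeLow, int requiredRangeHi)
    {
        for (int n = requiredRangeLow; n <= requiredRangeHi; n++)
            RemainingNumbers.Add(n);
    }

    public int NewRandomNumber()
    {
        if (RemainingNumbers.Count == 0)
            throw new InvalidOperationException("All numbers in the range have now been used.");
        int index = m_random.Next(0, RemainingNumbers.Count);
        int chosen = RemainingNumbers[index];
        RemainingNumbers.RemoveAt(index);
        return chosen;
    }
}

public class LottoDraw
{
    static void Main()
    {
        // Draw six unique numbers between 1 and 49 (a typical lottery range)
        GenerateUniqueRandomNumber generator = new GenerateUniqueRandomNumber(1, 49);

        int[] draw = new int[6];
        for (int i = 0; i < draw.Length; i++)
            draw[i] = generator.NewRandomNumber();

        Array.Sort(draw);
        Console.WriteLine(string.Join(", ", Array.ConvertAll(draw, n => n.ToString())));
    }
}
```

Because each chosen number is removed from the list, no duplicate-checking loop is needed, which is what makes this neater than the LottoForm approach above.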

March 26, 2008 at 12:00 pm Leave a comment

A Couple of Quick Tips on Setting Up IIS 5 (XP)

It’s not always obvious what’s up with IIS when things aren’t going according to plan, and the error messages are not always as clear as they could be. None of it’s that tricky once you’ve set up a couple of sites; however, there are some things to watch out for:

 1. IIS 5 must be installed before you install the .NET Framework. Lots of problems arise if the two components are not installed in that order. It’s probably advisable to reinstall any instances of SQL Server that you’ll be using too.

 2. Make sure that anonymous access is enabled. Default Website > Properties > Directory Security.

 3. If you want to put your application in a directory below wwwroot (e.g. wwwroot/mysite/default.aspx), then you need to set up a virtual directory, even if it just takes the same name. This is because it is an error to place a web.config file at a level higher than the application root.

 This page explains how to set up a virtual directory in IIS 4 or 5:

 http://www.microsoft.com/technet/security/prodtech/IIS.mspx

 With these simple rules adhered to, you should be able to start testing your C# or VB web pages.

March 25, 2008 at 2:32 am Leave a comment
