Sunday, August 3, 2008

Clean Your Mouse

If your mouse works sluggishly or not at all, don't buy another cheap mouse just yet. It is probably only dirty. A thorough cleaning can solve the problem.

First, determine which type of mouse you have. Turn the mouse over: if part of what you see is a rubber ball, you have a ball mouse. If you see a lens, you have an optical or laser mouse. Each type requires a different kind of cleaning.

An optical or laser mouse does not need to be cleaned as often as a ball mouse, but its underside can still get dirty. If the glide strips on the bottom are not cleaned, they will turn black. So about once a month or so, wipe off the bottom with a damp cloth. The lens will probably never get dirty, but if it does, take a soft cloth or cotton swab moistened with window cleaner or rubbing alcohol and clean the lens.

A ball mouse may need cleaning quite often, so you will quickly become familiar with the procedure. The dirtier a ball mouse gets, the harder it is to move the cursor smoothly across the screen. If you have to move the mouse across the mouse pad a few times just to move the cursor halfway across the screen, it needs to be cleaned.

The computer does not need to be turned off to clean the mouse, but you should close all running programs so that you do not accidentally click on something and mess it up. If you choose to unplug the mouse, remember to turn off the computer first. A mouse should never be plugged in or unplugged while the computer is running; doing so could ruin the motherboard.

Turn the mouse over so the ball side faces up and find the cover that holds the ball in. Look for arrows on the cover showing which way it turns. Place two fingers on the cover and push it in the direction of the arrows. Once the cover has rotated to its stop, hold it in place with your hand and turn the mouse back over with the ball side facing down. The cover and ball should fall into your hand. If they do not, gently shake the mouse.

Clean the ball with a damp cloth.

Now look into the ball socket and find the three rollers. Begin by cutting across the buildup on a roller with your fingernail (a dull knife or a dentist's pick can also be used, carefully), then turn the roller and peel the buildup off as you go. If you have done it right, you will end up with one curled strip of buildup from each roller. Be sure to remove all of the buildup. If some of it falls inside the mouse, shake it and blow into it until the debris comes out.

Take a damp cloth and clean each roller by wiping it, turning it, and wiping again. Continue until the whole roller is clean. Put the ball back into the socket and lock the cover back in place.

If the mouse still has problems after being reassembled, try cleaning it again in case some dirt remains. If that does not work, you may need to buy a new mouse.

It is a good idea to regularly clean the surface the mouse moves on, because the cleaner that surface is, the less dirt will get inside the mouse and the less often you will have to clean it.

If your mouse is shared by many people (especially if one of them is sick), you may want to disinfect the top of the mouse between users.

Follow these instructions and your mouse will be up and darting around in no time.

Saturday, August 2, 2008

Clean Your Keyboard - Part Two

When it comes to cleaning your keyboard, there are many methods you can use, some more difficult and more effective than others.

The simplest is the shake method. It is easy, so you can do it right now. Turn the keyboard upside down, taking care not to press any of the keys, and shake it. See all that stuff falling out? It was dirtier than you thought, right? You can use one of the following methods to clean it.

The compressed air method - You can buy cans of compressed air at a computer store, some made specifically for cleaning computers. They usually have a hose, nozzle, or extension tube to direct the jet of air. Hold the keyboard vertically (meaning one end of the keyboard up and the other end down), aim the nozzle at the keys, and press the button. Work at any residue until it comes loose. Be sure to spray around and between all the keys. This can be done with the computer on, but it is better if it is off, so you do not have to worry about pressing keys and ending up with a screenful of aaaaaaaaaaaaaaaaaaaaa's.

The vacuum method - This is just like the compressed air method, except that a vacuum is used instead of a can of compressed air. It is very simple. Turn on the vacuum and drag the hose nozzle across the keys. Before you start, make sure your keyboard does not have loose pop-off keys that could be sucked into the vacuum.

The cotton swab method - This may be done in addition to the methods above or instead of the methods below. Take a cotton swab or soft cloth and moisten it with rubbing alcohol. It should not be so wet that alcohol seeps into the cracks of the keyboard. Clean the tops and sides of the keys.

The dishwasher method - I hesitate to mention this method because there is a possibility that it could fry your keyboard. When I had less experience with computers, I dunked my keyboard in a sink full of water to clean it. It worked afterwards, so I cannot doubt those who say this method will not mess up a keyboard, but if it does, don't complain to me. I warned you. If your keyboard is not the standard membrane type, or if it is part of a laptop, do not even think of trying this.

Here is how it is done. Unplug the keyboard and place it face down in an empty dishwasher. Do not disassemble the keyboard, and do not run it through the dishwasher with dirty dishes in it. Some say to use soap, some say not to. Run the dishwasher through a regular cycle. Take the keyboard out, shake the water out of it, and stand it on end until it is completely dry (this may take several days). If it does not work afterwards, it may not be fully dry yet. Let it sit another week and try again. If it still does not work, I warned you.

The disassembly method - This is the most thorough method, but it should not be attempted on laptop keyboards or non-standard, non-membrane keyboards.

Turn off the computer and unplug the keyboard. Turn the keyboard upside down. Place a book, or maybe two short stacks of books, under the ends of the keyboard where it is raised off the surface. The keys should dangle freely and not touch the books or the surface below; otherwise a key could be pressed out of position.

Take a screwdriver and remove all the screws on the back of the keyboard. Lay the keyboard back down on the books and carefully remove the back panel.

Take everything apart and clean it thoroughly. It is best not to remove the keys while cleaning, since you might put them back in the wrong places. Clean with a damp cloth and then with a dry cloth. Any key that is hard to reach can be cleaned without being removed. Clean around the keys and the sockets they sit in with a light touch (use compressed air or a vacuum to get around the keys). If you are brave, you can pull off all the keys at once and give the frame underneath a good cleaning.

Go over everything more than once and make sure it is all clean. Then put it all back together.

Don't forget the keyboard cable. Wrap a damp cloth around it and wipe it down. It too collects grime and should be cleaned. And if the letters on one or more keys have rubbed off, you can redraw them on the key with a fine-point permanent marker.

Using these cleaning methods on your keyboard takes some time, but the result will be something you can be proud of.

Friday, August 1, 2008

Clean Your Keyboard - Part One

I know most of you will not heed this advice, but keyboards can pose a health risk (surprised?). Germs live on your hands and fingers. When you type, many of them transfer directly onto the keyboard. If someone else types on that keyboard, their fingers leave germs on it too. And when you later use the same keyboard, those germs end up on your fingers. Regularly disinfecting the keyboard can prevent this.

To disinfect the keyboard, turn off your computer. Then spray a disinfectant onto a cloth. Make sure it is a disinfectant and not just any kind of cleaner, because not all cleaners disinfect. Also, do not spray the disinfectant directly onto the keys; spray it onto the cloth. Wipe the tops and sides of the keys. Give the keys a few minutes to dry before using the computer.

Now that you know how to do it, make it a regular practice to disinfect your keyboard. It is one step on the road to a healthier you.

So you have disinfected your keyboard, and life is great. Then your child spills Kool-Aid all over it. What do you do? There are certain steps you should take if anything such as pop, beer, wine, coffee, milk, or Kool-Aid is spilled on the keyboard.

The first thing to do is to immediately unplug the keyboard from the back of the computer and turn the keyboard upside down so the keys face downward. This lets the liquid drain out. You will probably want to put a cloth under the keyboard, or at least make sure the surface beneath it is washable.

Then use the mouse to shut down Windows and turn off the computer (this is important, because later you will plug the keyboard back into the computer, and you should never plug a device into a computer while it is on).

While the keyboard is upside down, wipe it with a dry towel, soaking up as much of the liquid as you can. If you have a can of compressed air or a vacuum, spray or vacuum the keyboard while it is still upside down. Then leave the keyboard face down at least overnight so that it can dry out.

If the spilled liquid is sticky, you may need to follow the more extensive cleaning procedures covered in the next article, Clean Your Keyboard - Part Two.

Liquid spilled on a laptop keyboard can easily reach the hard disk, so turn the laptop upside down immediately and leave it in that position until it dries.

Keyboards are very robust, and yours should work once it is dry again. But if it does not, another nice attribute of keyboards is that they are cheap, so it will not cost too much to buy another one.

With these pointers in mind and a quick reaction, you may just save your keyboard from total destruction the next time it meets your coffee.

Tuesday, July 15, 2008

Search Engine Rankings for Beginners

Is search engine optimization better left to the gurus of internet marketing? Are good search engine rankings difficult to achieve? Does search engine marketing take years of study, mastered only by people familiar with secrets and hidden knowledge? Wrong. Pure rubbish, in fact.

Ranking well in the search engines is supposed to be difficult. In fact, ranking well in the search engines is relatively easy. What stops most people from ranking well is misinformation. Each week there is another "crucial" element of search engine ranking, and people go from one quick fix to another, hoping for a top-10 spot and never achieving it.

There are six main areas of search engine optimization you need to know about. These are the basics. Getting them right will help you achieve the rankings you have always wanted.

Domain name - the jury is still out on how relevant a keyword-rich domain name is. What I mean by a keyword-rich domain name is this: if, say, your site is about money-making ideas, the domain name could be either www.moneymakingideas.com or www.money-making-ideas.com. My personal preference is the hyphenated approach, such as www.money-making-ideas.com, simply because I think search engines can read it more easily. To register a domain, I would recommend godaddy.com for inexpensive .com domains.

Content - make your content useful. No keyword-stuffed, spammy pages. Write something useful for your visitors. If you cannot write, then you need to learn. There are great resources for learning article and content writing, including services that will walk you through the process from A to Z. And if you do not want to learn how to write, you can always have site content developed for you to help speed things up.

Keywords - keywords are the words or phrases that you expect people to search with to find your site, service, or content. There are two different factors to consider when selecting keywords for your site.

First, you need to focus on specific keywords. For example, if you have a website about history, you will most likely find it extremely difficult to achieve a high search engine ranking for that single term. However, if you narrow the focus of your site, say to gay and lesbian history, your targeting becomes more specific. You can also localize keywords or phrases, such as "gay & lesbian history walk in Toronto". This refining process is often referred to as niche (pronounced neech) marketing.

Second, you need to choose the keywords people will actually use to find your website. Put yourself in your visitors' shoes - think of what they would type into a search engine to find you. What keywords or phrases would you enter into a search engine to find your site? What keywords or phrases would a potential visitor use to search for you? Always, always think like a typical websurfer.

If you are interested in knowing what people are searching for, and how profitable a website on a given topic could be for you, I would recommend checking out the free Good Keywords tool. It costs nothing and works great for helping you find good keywords :-)

I would also suggest you try Wordtracker. It is a great service for tracking down specific, popular keywords.

Keyword density - this is, in simple terms, how often you use your keywords on a page. For example, if you have 100 words of text on a page and you mention your keyword 5 times, you have a keyword density of 5%.

Opinions differ on this, but anywhere from 1% to 7% is considered ideal. If that does not make sense to you, that's OK. The rule is simple: do not overdo it. Do not work your keywords into the text of your website again and again, because keyword... and keyword... and more keyword... looks like nonsense, instills no confidence in prospective customers, and may even get you penalized or removed from the search engines.
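For the curious, the 100-words-with-5-mentions arithmetic above can be sketched in a few lines of Python. This is a rough illustration only: search engines publish no such metric, and the word-splitting rule here is my own assumption.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# 100 words of text mentioning the keyword 5 times -> 5% density
page_text = "ideas " * 5 + "filler " * 95
print(keyword_density(page_text, "ideas"))  # 5.0
```

Running it against your own page copy is a quick sanity check that you have not strayed far past the 7% mark.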

Title tag - the title tag of your site is crucial to ranking well. Stuffing it with keywords in the hope of a better ranking is worse than unnecessary. Get your title tag right, though, and it can quickly move you toward the top of the search engine rankings. Keep this simple fact in mind when creating your website, and it will serve you well.

Links - good outbound links from your site, and good inbound links to your site (an often overlooked aspect of search engine optimization), are important if you are after good search engine rankings. Linking out to other sites shows what your own site is about, and inbound links matter even more to the search engines: if the content on your site is valuable enough that owners of other sites want to link to it, then it must be important.

Meta tags - meta tags are dead, baby. Well, not completely dead, but they are now used only as a reference point by the search engines. It is sad to see websites whose meta tags are still stuffed with keywords. It is not necessary. Put your primary keywords in the keyword meta tag of your website, but beyond that, don't lose any sleep or waste any energy on them.

There is not enough space in this article to explain all of the above in the detail I would like. Also, I am not a search engine expert myself; some of this is best left to those I learned it from (Sean Burns and Jay Stockwell, take a bow, please :-)

I learned the basics of search engine optimization and have achieved top-10 rankings for all of my pages. It is not difficult to do. It only takes work, patience, and a little common sense.

How to Clean Your Motherboard

by: Ray Geide

If you have not done the inspection mentioned in the previous article - How to Clean your Case, now is the time to do so. Look at the blades of the fan in the back of the computer. Also look at any vents. Are there clusters of dust there? Is there grime caked onto them? If so, the inside needs to be cleaned. If the fan blades are clean but it has been several years since you have cleaned the motherboard, or if the computer is around cigarette smoke, it probably should be cleaned anyway. Dust and particles in the air (like cigarette smoke) can build up on the circuitry of the motherboard and cause it to heat up and/or corrode.

The first thing that you need to do is unplug your computer. Then open up the case to get access to the motherboard. Cases open differently. If you don't know how to open your case, look on the back of your computer along the edge for some screws. These screws may hold on side panels or an upside down U shaped panel that covers the sides and top. Removing the screws will allow you to take off the cover. Other cases have the screws on the front of the computer. To get access to these screws, you must first remove the front panel by pressing a hidden latch. The cover is there to give easy access to the inside of your computer, so if you look hard enough, you should be able to figure out how to remove it.

Remember that if you touch anything on the motherboard, you should be grounded by either touching the metal frame of the computer with your other hand or by wearing a special grounding device.

The goal of cleaning the motherboard is to remove all dust and debris from the motherboard and all components inside of the case. This can be done using one of three methods.

The preferred method is to use a can of compressed air to blow it out. Always hold the can in an upright position to prevent the propellant chemicals, which can damage or corrode components, from coming out. Dust and dirt should be blown away from the motherboard and out of the case.

Another way to remove dust is to use a vacuum. The common advice is to only use a battery operated vacuum because an AC powered vacuum causes static and static can ruin the motherboard. I have used an AC powered vacuum (before I knew that it was not recommended) to clean my motherboard many times and it has never caused any problems, but I may have just been lucky. When using the vacuum, keep the nozzle a couple of inches away from the motherboard or any other components so that it does not come in contact with them and so that any small parts are not sucked into the vacuum.

If you do not have a can of compressed air or a vacuum, you can use a dry cloth and brush to clean the motherboard. Be careful not to dislodge or break anything using this method.

While cleaning the motherboard, be careful not to unplug any cables or connections or to dislodge any loose components, such as jumpers.

Methodically clean the whole inside of the case going over all of the motherboard from one end to the other and all other components. Don't forget to clean the fans and heat sinks. Do not open up the power supply box or stick anything in it beyond the fan. If you do, you could get a shocking surprise and ruin your computer.

If your computer does not work when you put it back together, something was obviously dislodged during the cleaning. Open the case back up and push all connections and cards into their slots. Look for anything that may have become disconnected.

Cleaning the motherboard is probably the most dangerous form of cleaning but it is necessary to prevent an early death of your computer.

About The Author

Ray Geide writes a free weekly newsletter called Ray's Computer Tips and moderates a discussion board answering computer questions called Computer Q&A.

He is an experienced computer programmer who has been writing top-rated software for over a decade. Though he has written for some big-name companies, he prefers to write for his own company, Super Win Software, Inc. http://www.superwin.com/

source: Articlecity

Friday, July 11, 2008

Web 2.0 And Why You Shouldn't Fake Reviews

by: Simon Dance

The latest offering from Ramsay's Kitchen Nightmares, aired on Channel 4 last night, followed the somewhat disastrous adventures of ex-boxer Mike and his wife Caron Ciminera as they struggled to run the Fish & Anchor, a restaurant in Lampeter, West Wales. Whilst the couple's arguing appeared to outdo the food they were originally sending out (a mix of jarred sauces and home cookbook trophy dishes), they did let slip a fantastically poor bit of black hat optimisation, which I hope made all white hat SEOs laugh out loud.

If there was one lesson to take away from the show, it would be - Don't fake reviews!

In order to gauge the feeling of the local community towards the failing restaurant-cum-sports bar, Ramsay conducted a search on Google for the Fish & Anchor, and was presented with a range of reviews, two of which were rather suspiciously from a character calling himself Michael or Mike Burns.

On the Wales portal of the BBC website Burns had posted "Well i don't get excited about food too often, and having dined in Rick Stein's, and Gordon Ramsay's,I think i have found a better restaurant in West Wales". On the SugarVine website he also posted "what a fantastic restaurant for couples, and families. it seems to have everything, the food has to be the best i have eaten (home or abroad) this place will go far". Other online reviews echoed what had already been said, but given the dire state of the restaurant, its food, its reputation and its perception by both the local community and Ramsay himself, would it not be right to question who was telling the truth?

The restaurateur confessed to posting the reviews, his rationale pointing to stimulating custom; however, as with any reactive strategy, it requires a degree of foresight - and I am not sure he really thought through the wider ramifications of posting these "inaccurate" reviews.

Firstly, a warning must be expressed. For example, if someone finds your restaurant or hotel via a positive (fake) review and they have a bad experience, there is a chance that they will post a true review to assist fellow users and generally have a rant. The initial seeding of this true review has the potential to lead to an onslaught of further reviews from other visitors who might not have otherwise posted. Don't forget the saying "people don't lead... they follow".

But how can you manage your reviews and ultimately what your customers are saying about you? Well first and foremost, address the problem(s)!

You wouldn't put a sticking plaster on a gun shot wound, so why think that a positive review about the quality of your food or the softest of your sheets is going to counteract the adversities of your customer service?

The customer is king, a point stressed by Ramsay, and one that should ring true for any business, after all, without them, where would we be?

Rectifying, or at least making plans to manage, any failings within your business, regardless of its size, will be the first step in managing your online reputation, but this is an area I will not go into in comprehensive detail in this post. Instead, I will offer some simple pointers as to how to harness online reviews for good.

Sites like Trip Advisor, which boasts over 10,000,000 user-generated reviews of various hotels, holidays and restaurants, are gaining increasing weight as a resource for honest and unbiased reviews, and via their systems of community recommendation they really have the power to drive custom - and, in many instances, to divert it - the key factor being positive, consistent reviews.

But if you do run a successful hotel or restaurant and wish to harness these social spaces - in a more ethical way than that demonstrated in Kitchen Nightmares - then why not encourage your diners or hotel guests to post a review after their visit?

When the customer is paying their bill or even booking their hotel room why not take their email address, or even ask them to submit their business card in return for entry into a monthly prize draw for a free meal in the restaurant?

In addition to building up a client database by collecting this data - for use in promotional mailings including notifying customers of events, promotional and the launch of a new menu - you can also harness it to stimulate online reviews by dropping your customers a short email after their stay / meal, which might look something like the following example...

"Good afternoon Simon, and thank you very much for your booking at the Leapfrogg Restaurant, we hope you had an enjoyable meal.

We pride ourselves on the quality of our food and our attentive staff; however, we're always striving to enhance and improve what we do, and as such we would appreciate you taking two minutes of your time to write a review for us at Trip Advisor (http://www.tripadvisor.com), a free travel guide and research website that allows users to post reviews and ratings.

Your comments are important to us, and will be used to improve the Leapfrogg restaurant.

Thank you very much for your time and we look forward to welcoming you again to the Leapfrogg restaurant in the near future.

Sincerely,

A Restaurateur
Leapfrogg restaurant
Brighton
Tel: 01273 669 450"

Of course, many of your requests will be ignored, but providing you are personal in your emails (a point we at Leapfrogg have mentioned previously in this blog) then you are more likely to get a response, and even if you only have a 5% success rate, this is still 5% of valuable customer feedback.

The point with which I will conclude this article is one which has stuck with me from London's SMX, and one that I will most certainly be repeating from here on out: "Yesterday's news no longer wraps today's fish and chips". Online news and online content, including user-generated reviews, do not simply get binned like a newspaper at the end of the day; they remain live, and can even appear within the search results for a brand keyword search... so isn't it worth paying attention to what your customers are saying?

Thursday, July 10, 2008

A Guide to RSS Aggregators

by: Terry Leslie

One of the most popular features of Internet portals, websites, pages and even emails is a frame that features an organized list of news headlines and periodic updates from other web sources. Really Simple Syndication, formerly “Rich Site Summary” or simply, RSS makes this possible.

Most users visit a lot of websites whose content continually change, such as news sites, community organization or professional association information pages, medical websites, product support pages, and blogs. As Internet surfing became an intrinsic part of business and leisure, it became important to get rid of the very tedious task of repeatedly returning to each website to see updated content.

RSS easily distributes information from different websites to a wider number of Internet users. RSS aggregators are programs that use RSS to source these updates, and then organize those lists of headlines, content and notices for easy reading. It allows computers to automatically retrieve and read the content that users want, then track changes and personalize lists of headlines that interest them.

The specially made computer programs called “RSS aggregators” were created to automatically find and retrieve the RSS feeds of pre-selected internet sites on behalf of the user and organize the results accordingly. (RSS feeds and aggregators are also sometimes referred to as "RSS Channels" and "RSS Readers".)

The RSS aggregator is like a web browser for RSS content. HTML presents information directly to users, while RSS automatically lets computers communicate with one another. Where users use browsers to surf the web and load and view each page of interest, RSS aggregators keep track of changes to many websites. The titles or descriptions are links themselves and can be used to load the web page the user wants.

RSS starts with an original Web site that has content made available by the administrator. The website creates an RSS document and registers this content with an RSS publisher that will allow other websites to syndicate the documents. The Web site also produces an RSS feed, or channel, which is available together with all other resources or documents on the particular Web server. The website will register the feed as an RSS document, with a listed directory of appropriate RSS publishers.

An RSS feed is composed of website content listed from newest to oldest. Each item usually consists of a simple title describing the item along with a more complete description and a link to a web page with the actual content being described. In some instances, the short description or title line is all the updated information that a user wants to read (for example, final game scores in sports, weblog posts, or stock updates). Therefore, it is not even necessary to have a web page associated with the content or update items listed -- sometimes all the information that users need is in the titles and short summaries themselves.

The RSS content is located in a single file on a webpage in a manner not very different from typical web pages. The difference is that the information is written in the XML computer code for use by an RSS aggregator and not by a web user like a normal HTML page.
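As a concrete illustration of that XML structure, here is a minimal RSS 2.0 document read with Python's standard library. The feed title, links and items are invented for the example; a real feed would carry more fields (dates, GUIDs, and so on):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document: one <channel> holding <item>s, newest first.
RSS_XML = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>http://example.com/</link>
    <description>An invented feed for illustration</description>
    <item>
      <title>Second headline</title>
      <link>http://example.com/2</link>
      <description>The newest item</description>
    </item>
    <item>
      <title>First headline</title>
      <link>http://example.com/1</link>
      <description>An older item</description>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(RSS_XML)
# Each <item> carries a short title, a link, and a description.
items = [(item.findtext("title"), item.findtext("link"))
         for item in root.iter("item")]
for title, link in items:
    print(title, "->", link)
```

This is essentially what an aggregator does on the client side: pull the title/link pairs out of the XML and render them as a clickable headline list.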

There are 2 main parts that are involved in RSS syndication, namely: the source end and the client end.

The client end of RSS publishing makes up part of the system that gathers and uses the RSS feed. For example, Mozilla FireFox browser is typically at the client end of the RSS transaction. A user’s desktop RSS aggregator program also belongs to the client end.

Once the URL of an RSS feed is known, a user can give that address to an RSS aggregator program and have the aggregator monitor the RSS feed for changes. Numerous RSS aggregators are already preconfigured with a ready list of RSS feed URLs for popular news or information websites that a user can simply choose from.
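The "monitor the feed for changes" step can be sketched as follows, under the simplifying assumption that each item's <link> identifies it (real aggregators usually prefer the feed's guid or date fields); the feed snippets below are invented:

```python
import xml.etree.ElementTree as ET

def item_links(rss_text: str) -> set:
    """Collect every <item>'s <link> as a fingerprint of the feed."""
    root = ET.fromstring(rss_text)
    return {item.findtext("link") for item in root.iter("item")}

def new_items(snapshot: set, rss_text: str) -> set:
    """Links present in the fresh feed but absent from the last snapshot."""
    return item_links(rss_text) - snapshot

# Two invented polls of the same feed: a new item appears in the second.
POLL_1 = ("<rss version='2.0'><channel>"
          "<item><link>http://example.com/1</link></item>"
          "</channel></rss>")
POLL_2 = ("<rss version='2.0'><channel>"
          "<item><link>http://example.com/2</link></item>"
          "<item><link>http://example.com/1</link></item>"
          "</channel></rss>")

snapshot = item_links(POLL_1)
print(new_items(snapshot, POLL_2))  # only the newly published link
```

An aggregator simply repeats this compare-and-report cycle on a timer for every subscribed feed URL.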

There are many RSS aggregators that can be used by all Internet users. Some can be accessed through the Internet, some are already incorporated into email applications, and others run as a standalone program inside the personal computer.

RSS feeds have evolved into many uses. Some uses gaining popularity are:

•For online store or retail establishments: Notification of new product arrivals
•For organization or association newsletters: title listings and notification of new issues, including email newsletters
•Weather Updates and other alerts of changing geographic conditions
•Database management: Notification of new items added, or new registered members to a club or interest group.

The uses of feeds will continue to grow, because RSS aggregators make access to any information that individual users like more convenient and fun.

In the meantime, Good Luck on your journey to success…

OR if you would like to succeed immediately to create financial freedom working only 4 hours a week, check out http://www.Secrets2InternetFortunes.com.

AND for a Limited Time, you will also receive a FREE copy of a limited number of the amazing 60 page eBook “52 Highly Profitable Instant Online Business Ideas That You Can Steal As Your Own And Start Today On A Very Tight Budget!”, which is jam packed with so many ideas you can use to instantly create an automated income for life! That’s my GIFT to You as a way of saying thank you for reading my articles.

Wednesday, July 9, 2008

A Guide on RSS Tool

by: Terry Leslie

RSS is an abbreviation that has evolved into the following, depending on their versions:

• RDF Site Summary (also known as RSS 0.9; the first version of RSS)
• Rich Site Summary (also known as RSS 0.91; a prototype)
• Really Simple Syndication (also known as RSS 2.0)

Today, RSS stands for 'Really Simple Syndication', and it has the following 7 existing formats or versions:

• 0.90
• 0.91
• 0.92
• 0.93
• 0.94
• 1.0
• 2.0

RSS tools refer to a group of file formats that are designed to share headlines and other web content (this may be a summary or simply 1 to 2 lines of the article), links to the full versions of the content (the full article or post), and even file attachments such as multimedia files. All of this data is delivered in the form of an XML file (XML stands for eXtensible Markup Language), which goes by the following common names:

• RSS feed
• Webfeed
• RSS stream
• RSS channel


They are typically shown on web pages as an orange rectangle that usually has the letters XML or RSS in it.
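For readers curious what such an XML file looks like under the hood, here is a minimal sketch that builds an RSS 2.0 feed using Python's standard library. The feed title, links, and item are made-up placeholders:

```python
import xml.etree.ElementTree as ET

def build_feed():
    # Root <rss> element with the mandatory version attribute (RSS 2.0)
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")

    # Required channel metadata: title, link, and description
    ET.SubElement(channel, "title").text = "Example Site News"
    ET.SubElement(channel, "link").text = "http://www.example.com/"
    ET.SubElement(channel, "description").text = "Latest updates from Example Site"

    # One feed item: headline, link to the full article, short summary
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = "New product arrivals"
    ET.SubElement(item, "link").text = "http://www.example.com/arrivals"
    ET.SubElement(item, "description").text = "A one-line summary of the article."

    return ET.tostring(rss, encoding="unicode")

print(build_feed())
```

The result is exactly the kind of small XML document an aggregator fetches and displays.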

RSS feeds can be used to deliver any kind of information. Some of these 'feeds' include:

• Blog feed - each blog entry is summarized as a feed item. This makes blog posts easier to scan, enabling 'visitors' to zoom in on the items that interest them.

• Article feed - this alerts readers whenever there are new articles and web contents available.

• Forum feed - this allows users to receive forum posts and latest discussion topics.

• Schedule feed - this allows users (such as schools, clubs, and other organizations) to broadcast events and announce schedule changes or meeting agendas.

• Discounts or Special feed - this is used to enable users (such as retail and online stores) to 'deliver' latest specials and discounted offers.

• Ego or News Monitoring - this enables users to receive 'filtered' headlines or news that are based on a specific phrase or keyword.

• Industry-specific feed - used by technical professionals in order to market, promote, or communicate with current (and prospective) customers and clients within their specific industries.

RSS feeds enable people to track numerous blogs and news sources at the same time. To produce an RSS feed, all you need is the content or the article that you want to publicize and a validated RSS text file. Once your text file is registered at various aggregators (or 'news readers'), any external site can then capture and display your RSS feed, automatically updating them whenever you update your RSS file.
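On the consuming side, an aggregator's core job is simply to parse that XML and pull out headline/link pairs. A minimal sketch, again using only Python's standard library and a made-up feed:

```python
import xml.etree.ElementTree as ET

# A tiny hard-coded RSS 2.0 document standing in for a downloaded feed
FEED = """<rss version="2.0"><channel>
  <title>Example Site News</title>
  <item><title>First headline</title><link>http://www.example.com/1</link></item>
  <item><title>Second headline</title><link>http://www.example.com/2</link></item>
</channel></rss>"""

def headlines(feed_xml):
    # An aggregator scans the <item> elements and extracts title/link pairs
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

print(headlines(FEED))
```

Re-running this each time the publisher updates the RSS file is what makes a feed "automatically updating" from the reader's point of view.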

RSS tools are useful for sites that add or modify their contents on a regular basis. They are especially used for 'web syndication' or activities that involve regular updates and/or publications, such as the following:

• News websites - as used by major news organizations such as Reuters, CNN, and the BBC.
• Marketing
• Bug reports
• Personal weblogs

There are many benefits to using RSS feeds. Aside from being a great supplemental communication method that streamlines the communication needs of various sectors, RSS tools and feeds can also have tremendous benefits in your business, particularly in the field of internet marketing.

RSS tools and feeds provide Internet users with a free (or cheap) and easy advertising or online marketing opportunity for their businesses. Below are some of the RSS features that can help make your internet marketing strategies more effective.

1. Ease in content distribution services. With RSS, content about your business can be captured and displayed by virtually any external site, giving you an easy way to 'spread out' and advertise it.

2. Ease in regular content updates. With RSS, web content concerning your business can now be updated automatically on a daily (and even hourly) basis. Internet users experience 'real time' updates: as information in your own file (such as new products and other business-related releases) is changed and modified, the RSS feeds that people are subscribed to update along with it.

3. Custom-made content services. With RSS, visitors can have personalized content services, allowing them total control of the flow and type of information that they receive. Depending on their interests and needs, visitors can subscribe to only those contents that they are looking for (such as real estate or job listings).

4. Increase in (and targeted) traffic. With RSS, traffic is directed to your site as readers of your content summary (or the 1 to 2 lines of your article) who find it interesting are 'forced' to click a link back to your site.

These are just several of the many things that you can do with RSS. The possibilities are endless, and they are all aimed at providing you with an effective internet marketing strategy for your business.

In the meantime, Good Luck on your journey to success…

OR if you would like to succeed immediately to create financial freedom working only 4 hours a week, check out www.secrets2internetfortunes.com.

AND for a Limited Time, you will also receive a FREE copy of a limited number of the amazing 60 page eBook “52 Highly Profitable Instant Online Business Ideas That You Can Steal As Your Own And Start Today On A Very Tight Budget!”, which is jam packed with so many ideas you can use to instantly create an automated income for life! That’s my GIFT to You as a way of saying thank you for reading my articles.

Tuesday, July 8, 2008

Symantec Norton Antibot The Latest In Norton Computer Protection Software

by: Lisa Carey

It seems like every other month a new “program” comes along to make our lives that much easier. For example, first we could bookmark favorites, then RSS feed them, then came widgets, and now “bots”: robots that do a lot of our computer work for us in the background. Examples of friendly bots are weather bots, game-playing bots, and instant messaging bots; my favorites are those on AOL Instant Messenger, which perform all kinds of functions for me, like shopping, finding movie times, and even giving me updates from the Wall Street Journal.

Unfortunately, not all bots were created “equal.” Some are friendly and some are not. The unfriendly ones can be a form of malware that hands control of your computer over to hackers, providing them with the opportunity to access your information and spread harmful bots to others. This type of computer virus can then be used to spread spam and commit various types of identity theft and other online fraud.

So with new threats to our computers and information come new methods of protection. One of the oldest and best-known makers of protection software has recently released a new program, Symantec Norton AntiBot. This is a software product designed to prevent the hijacking of one’s personal computer by bots, and it turns the bots’ own design against them to locate and destroy them.

Many people already employ some form of protection on their personal computer, such as raising the browser’s Internet security level to “high.” But these measures cannot detect some of the most recent bot programs and may not be the most efficient means of protecting information, especially with the Internet being used more and more frequently for online shopping, ticket purchases, travel and other “high risk” activities.

A more effective method of detecting and eliminating threats caused by bots is to install software designed specifically to detect, destroy and prevent bots from having access to your computer. With Symantec Norton AntiBot software, protection against bots is enhanced several times over and the threat of a bot attack is greatly diminished. The program protects against bots by blocking them from entering your computer through downloads and e-mail attachments (two of the common ways bots enter a personal computer), checking for any unusual behavior on your personal computer and eliminating it, and detecting malicious bot software at all levels, keeping your personal, financial and credit card information safe and stopping identity theft before it can occur.

Because bots operate in the background and are not detectable by antivirus or antispyware programs, many computer users are completely unaware that their personal computer has become infected. Many problems caused by bots go undetected until it is too late. Warning signs that your computer may have been accessed include slow computer speeds and unusual or irrelevant error messages. However, these symptoms are often sporadic, and computer users take little notice. Many people will continue to use their personal computer, unaware that bots have hijacked it and are slowly at work, looking for credit card numbers, passwords, and logon information which can be used for identity theft and other types of online crime. This program scans your personal computer on a continuous basis, closing the gaps that could allow bots to infect it and better ensuring that bots do not invade and gain control.

Symantec Norton AntiBot determines whether a bot is harmful or useful, allowing you to continue using those bots you love and have come to depend on for information and services. It can be used alongside several other antivirus and antispyware programs; its compatibility is not limited to Norton products.

The cost of this software is $29.95 for one year of service. It was awarded PC Magazine’s Editor’s Choice Award (2007) and underwent rigorous testing which included using AntiBot on computers with existing threats as well as allowing threats to try to access the computer after installation.

With the growing threat of identity theft and credit card fraud, Symantec Norton AntiBot offers an additional level of protection needed to combat bots and prevent them from turning one’s personal computer into an instrument of destruction to both your personal and financial well-being.

Monday, July 7, 2008

Record Your Products: Reap The Rewards of Recording And Getting Your Product Done Faster And Easier.

by: Patsy Bellah

Some of you will remember when we had to type on typewriters. Some of you, present company included, may even remember when we had to type on “standard” or manual typewriters. For those who aren’t in the know, that’s a typewriter without electricity.

Then we got electric typewriters. That was something new to learn, but all our work could be done faster, easier and with less mess.

Then came computers. There was more to learn but with this technology life was made even easier for secretaries, writers, or anyone having to convey information with the written word.

With each of these advances there were those who said they couldn’t do it. They didn’t like it, they didn’t like change. They could get along just fine, thank you very much, with a manual typewriter, or an electric one. They didn’t need computers. There was too much to learn. It was too different.

Don’t let that attitude keep you from learning the latest time saver for transferring words to paper and that is the digital recorder. As the manual typewriter has given way to more sophisticated electric typewriters, which have given way to the computer, so, too, has the digital recorder made it faster and easier to transfer the spoken word to the written word.

On average, a one-hour recording will yield about 20-30 typewritten pages. That means that with a one-hour “conversation,” speaking your story or information into a recording device and then getting it transcribed, you can transfer your spoken word to a document in about 25% of the time it would take you to type it yourself.

It may take a bit of practice to learn to dictate into a recorder, but once you have, you will find that you can save yourself a ton of time. Statistics prove that the longer it takes to complete a project, the less likely it is that you will finish it. Embrace this new technology.

Here are some guidelines you should consider when purchasing a digital recorder:

1. You must be able to download your recording to your computer. Some of the less expensive recorders are not “downloadable.” You need to be able to transfer your recording through the Internet in order to send it to a transcription service or even if you want to transcribe it yourself.

2. Although most recorders come with internal microphones, it is best to have the capability to attach an external microphone. External microphones work better to record presentations or to record from a distance. Additionally, you can elect to use a lavaliere microphone for yourself and not be hampered with holding the recorder. Or, if you are recording more than one person, such as if you are interviewing someone, you can get an attachment which allows you to hook up two microphones.

3. The recorder should have at least four hours of available recording time using the high quality recording setting. You want to make sure the recorder has enough time to record a full presentation before having to be downloaded to the computer.

The capabilities of recorders change all the time, and in my recent research I found that the prices, like anything else, are coming down drastically and we are getting more and more recording time.

I checked out the Olympus recorder on the Internet and found a very good quality recorder for around $100.00. I also found that you could buy this at Best Buy in the Los Angeles area at the same price. Other locations such as Samy’s Cameras for those in the Los Angeles area, Circuit City, Radio Shack and Frys may also have them.

For those of you who live in the Los Angeles area, I found an Olympus and a Marantz at Samy’s Cameras which uses a flash card and can get you as much as 4G-8G of storage space. Both of these sell for just under $400.00. The Sony or the Edirol are also good recorders, and have similar capabilities and prices.

Buying a recorder is much like buying a blender or a computer. Although it’s wise to buy as much as your pocketbook allows, at the same time you don’t need to buy more than you will use. Why spend the extra money?

A digital recorder is small and easy to use. On it you can record all of your information products, plus your presentations, blogs or articles.

Embrace this new technology. Using a digital recorder to record your information product, presentations or teleseminars, will allow you to finish your product in less than 25% of the time it would take you to type it yourself. If you get your audio transcribed, once you get it back, all you have to do is edit it and you can have your product completed in less than a week.

Sunday, July 6, 2008

Internet And Business Online – The Act Of Interdependence

by: Scott Lindsay

The best role of business online is that of interdependency. We’ve all heard the old saying, “No man is an island.” When it comes to online business this is especially true.

A business owner who takes their business into the online world determined to be self-reliant and never accept anyone’s help will not be in business long enough to change their mind.

It is an accepted fact that the greatest tool for long-term exposure of your website is Search Engine Optimization (SEO). Without it, potential customers can’t find you. It is unreasonable to expect that you can adequately develop a website without optimizing it for the best possible search engine ranking.

Search engines also place a high value on sites that have links placed on existing sites. These ‘backlinks’ demonstrate to search engines that others trust your site. By placing your link on their website these other businesses indicate a trust and recommendation for your site.

In effect the two strategies listed above rely exclusively on what others can do for you when it comes to your online business.

Shirley Temple once proclaimed in her movie Rebecca of Sunnybrook Farm, “I’m very self-reliant.” American westerns are filled with lines dealing with pulling yourself up by your bootstraps and holding down the fort. Many of us have grown up to believe if we want something done right we have to do it ourselves.

This thinking is in opposition to the rules associated with an online business.

The online world can only exist because people share. Individuals share technology, but they also share links, reviews, blogs, forums and a wide range of other marketing strategies that amount to a commingling of interdependency.

In online business you are as dependent on others as they may be on you. Unlike the word ‘dependent’, the term interdependent indicates a mutual dependency. In other words you are depending on others to help provide links back to your site while they are equally dependent on you (or others) for the success of their business.

Have you really taken a proactive approach to networking? It’s possible you are reading this today and you’ve never considered asking someone else to place a link to your site on his or her online business site.

It can feel awkward depending on others to achieve online success, especially if you’ve been led to believe that relying on others is a sign of imposing on their otherwise brilliant generosity.

I suppose it could be a deep-seated sense of pride that makes it hard to consider the need to ask others for help. However, the truth is that depending on others is really what has made the Internet possible. This online world has grown out of linked computers, networks and servers, connected in a way that provides the maximum benefit for all.

Building an online business can feel a bit like trying to build a house of cards. Without the ability to rely on the other ‘cards’ around you it is virtually impossible to build.

Interdependence. This is the essence of online business.

Saturday, July 5, 2008

Web Development And The Big Time Out

by: Scott Lindsay

One of the great debilitators in online business is simply the perceived (or real) lack of time. Business owners are used to moving forward. An online web presence can make them feel tied to an office chair learning skills they aren’t sure they want to know.

It’s not uncommon for those who deal in full time web design to have individuals contact them for a site design, but have absolutely no idea what they want. Furthermore when the designer questions them the response might be, “I don’t know, just make it look nice.”

Let’s not forget the core values or mission of the business. Many business owners have no idea how to answer those kinds of questions. They may stare blankly for a moment or two, but with no time for further deep thought they go back to action – without answers.

In many cases it is possible to answer some of the questions needed, but it may require taking time away from a familiar setting. It may also require more time than you think you want to give.

If you can get to a place of concentrated contemplation you are likely to find yourself stripping ideas to their core to find out what your business is trying to accomplish and what your ultimate goals might be.

As with almost any project you can turn frustration around if you will just take the time to come to terms with your vision.

Sometimes we spend so much time ‘doing’ we never stop to ask the question, “Why?”

This process can be a bit like taking a bus that drives around the park. You keep looking at the flowers and the park bench and long to sit in the quiet shade of a tree and just absorb the calming atmosphere. You know it would have a positive effect on you, but for some reason you just can’t seem to find the energy to get off the bus.

It seems to me there are some sites, misguided or rarely guided, that could benefit from a process of self-evaluation. These sites may look nice, but there is a sense of disconnection about them that may not be easy to identify yet is fairly obvious to visitors.

Creative energy is at a minimum while business owners simply tackle what seem to be the most urgent details.

As more people gravitate to online business there needs to be a shift in the thinking of how one goes about doing business online. In many ways it can’t be approached in the same way a traditional business is developed, yet that is typically the way many new web commerce ventures choose to tackle the subject.

You may discover your business will be more successful if you take some time for rigorous reflection. The time set aside can be a bit like an architect that takes the time to develop plans for a new building. You wouldn’t expect the architect to simply tell a construction crew to, “Go out there and build – something.”

Work at ‘building’ your online business in a comprehensive way. Your effort can develop a firm foundation for long-term success.

Friday, July 4, 2008

Back to Back User Agents for Telecommunications

by: Danny Loeb

Today’s telecommunications networks are a delicate blend of clients and servers that together offer virtually endless possibilities when it comes to services and applications. For every new client developed, there seems to be a score more on the way — from mobile handsets, PDAs, terminals, telephones, video phones, IP set-top-boxes, and so on.

There are essentially two types of servers that connect clients on large networks: Proxy servers and Back-to-Back User Agent (B2BUA) servers. The more prevalent Proxy servers feature predictable behavior, simply connecting clients to one another. B2BUA servers, by contrast, are much more powerful and intelligent entities that perform actions Proxy servers cannot. Moreover, B2BUA servers provide a flexible solution for a wide range of applications and services and are becoming the primary engine for more and more SIP servers in NGN and IMS networks.

The difference between Proxy servers and B2BUA servers is sometimes not fully understood. In this article, we will explore what makes B2BUA servers such an appealing alternative to standard Proxy servers. Better understanding of B2BUA servers can help managers understand the value, and the tradeoffs, of choosing a B2BUA server, as well as the frameworks needed to develop a wide range of SIP applications and SIP services using it.

Figure 1 - Architectural difference between Proxy servers and B2BUA servers

B2BUA Server Defined
B2BUA servers are used to provide value added features for point-to-point calls and manage multi-point calls. The power behind a B2BUA server is derived mostly from the fact that it has a very generic definition, which gives it almost unlimited power. However, this same characteristic is the root of the controversy surrounding it.

IETF standard (RFC 3261) defines a back-to-back user agent as “a logical entity that receives a request and processes it as a user agent server (UAS). In order to determine how the request should be answered, it acts as a user agent client (UAC) and generates requests. Unlike a Proxy server, it maintains a dialogue state and must participate in all requests sent on the dialogues it has established.”

B2BUA servers have capabilities that far exceed those of other types of SIP servers, and answer the need for developing sophisticated value added SIP applications that cannot be implemented as Proxy applications.

Some of these capabilities, which are unique to B2BUA servers, are outlined below:

3rd Party Call Control (3PCC) Features
3rd Party Call Control (3PCC) is the ability of an entity (usually a controller) to set up and manage communication between two or more parties. 3PCC is often used for operator services and conferencing.

3PCC actions are important capabilities, exclusive to B2BUA servers since “passive” non call-stateful elements, such as Proxy servers, cannot initiate these types of activities. Some examples of 3PCC services are online billing, QoS, resource prioritization, call transfer, click-to-dial, mid-call announcement and more.

3PCC actions can be initiated automatically by B2BUA server applications, like disconnecting a call following credit expiration in an online-billing system. Or they can be initiated by remote administrative control (OSS), e.g. invite parties to a multi-point conferencing session.
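As a rough illustration of the online-billing example (there is no real SIP stack here; the "messages" are simulated strings, and all class and method names are hypothetical), a 3PCC controller holds the state of both call legs and can tear the call down on its own initiative, something a call-state-less Proxy cannot do:

```python
class CallLeg:
    """One side of a B2BUA-mediated call (a SIP dialog, much simplified)."""
    def __init__(self, party):
        self.party = party
        self.active = True
        self.sent = []            # simulated messages sent to this party

    def send(self, message):
        self.sent.append(message)
        if message == "BYE":      # a BYE ends this leg of the call
            self.active = False

class BillingController:
    """3PCC controller: owns both call legs and can end the call itself."""
    def __init__(self, caller, callee, credit_seconds):
        self.legs = [CallLeg(caller), CallLeg(callee)]
        self.credit = credit_seconds

    def tick(self, seconds):
        # Invoked periodically by the billing system. A "passive" Proxy
        # server, which keeps no call state, could not initiate this.
        self.credit -= seconds
        if self.credit <= 0:
            for leg in self.legs:
                leg.send("BYE")   # controller-initiated disconnect

    def call_active(self):
        return all(leg.active for leg in self.legs)

ctrl = BillingController("alice", "bob", credit_seconds=60)
ctrl.tick(30)
print(ctrl.call_active())   # True: credit remains
ctrl.tick(45)
print(ctrl.call_active())   # False: credit expired, both legs got a BYE
```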

Figure 2 - Schematic outline of B2BUA server offering 3PCC functionality

Inter-working Function (IWF) for Interoperability

SIP was designed as a highly flexible and extendible protocol. The very strength of this flexibility is also an inherent weakness, since the vast array of client types in the market still need to connect.

B2BUA Inter-working Functions (IWF) define a wide range of powerful SIP servers that connect SIP clients that “speak” different protocol dialects or support different capabilities. This inter-working function is very important in enabling connectivity between clients with different capabilities and/or protocol dialects, or even between clients and networks, where the B2BUA server actually acts as an access device.

Examples of what IWF can do include:

• Connecting SIP clients to IMS networks by adding and removing IMS SIP protocol extensions (AKA P-Headers) that are essential for connecting to the IMS network
• Connecting clients with different Session Timers settings
• Connecting clients with different media capabilities and with distinct Session Description Protocol (SDP) messages by relaying between the two types of control sessions
• Connecting to different types of networks (e.g. IPv4, IPv6) and support for different transport types, such as TCP/UDP/SCTP/TLS

Figure 3 - Schematic outline of a B2BUA Inter-Working Function

Multi-point Call Management

B2BUA servers can also implement multi-point call scenarios, where multiple CPE devices connect to the B2BUA and the B2BUA provides services to all of them.

Due to these unique capabilities, B2BUA servers are widely used in the communications industry. A few examples are listed below:

• Online-billing/prepaid functions
• Servers supporting Resource Prioritization (RP) and/or Quality of Service (QoS) features
• Multi Point Conferencing servers
• IVR servers
• PBX Applications and Softswitches
• Application Layer Gateways (ALG)
• FW/NAT Traversal applications
• Privacy servers
• 3rd-Party Call Control Applications (3PCC)
• Service Creation Environment (SCE) runtime engines
• Session Border Controllers (SBC)
• IMS S-CSCF, P-CSCF, I-CSCF
• SIP Inter-work Function (IWF) Gateway
• Security Gateway (SEG)
• Voice Call Continuity (VCC) servers

In addition, B2BUA servers play an important role in emerging IMS networks. Recent releases of the 3GPP IMS specifications (3GPP TS 24.229 V8.0.0) indicate that an increasing number of IMS network element servers, such as the P-CSCF, IBCF, SBC, etc., are B2BUA servers. The reason for this is that value added services are usually session stateful and feature capabilities that go beyond basic call proxying. Applications written on top of B2BUA application servers fulfill several roles, such as SIP User Agents, SIP Proxy servers and SIP Registrars.

B2BUA Server Challenges

B2BUA application developers face many challenges, such as achieving rapid time-to-market, conformance and interoperability, offering customization for proprietary services and support for High Availability (HA) and redundancy. A comprehensive B2BUA framework can help developers overcome these challenges.

A solid B2BUA framework should have modular application building block architecture for increased flexibility, abstraction and short delivery time. Traditional architecture, which features a single configurable state machine, is not flexible enough. Also, a B2BUA framework should facilitate developing B2BUA applications by flexibly linking “pluggable” high-level Modular Application Building Blocks (MABB). Developers should have the ability to combine these MABBs and they should be designed in a way that allows developers to further customize their behavior if needed. This type of architecture complies with contemporary Service Oriented Architecture (SOA) concepts, and is suitable for powering flexible business communication platforms. This modular architecture can save months of work. With a set of MABBs in hand, developing the application is a matter of combining existing MABBs to produce the required business logic. In addition, this architecture enhances efficiency; development of new MABBs can be done concurrently.
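The MABB idea can be sketched in miniature: model each building block as a callable that transforms call state, and an application as a chain of blocks. All names here are hypothetical illustrations, not RADVISION APIs:

```python
# Each "building block" transforms call state; an application is just a
# chain of blocks combined to produce the required business logic.
def online_billing(call):
    call["billed"] = True                     # mark the call as billed
    return call

def announcement(call):
    # Queue a mid-call announcement for playback
    call.setdefault("played", []).append("welcome.wav")
    return call

def build_application(*blocks):
    """Combine existing, pluggable blocks into one application."""
    def application(call):
        for block in blocks:
            call = block(call)
        return call
    return application

app = build_application(online_billing, announcement)
print(app({"caller": "alice"}))
```

New blocks can be developed concurrently and dropped into the chain, which is the efficiency argument the paragraph above makes.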

A B2BUA framework should facilitate developing applications that fully conform to standards and are interoperable, without restricting developers from customizing protocol behavior for special cases. Moreover, it should be able to accommodate non-standard implementations, as well as mediate between two versions of the same standard. This type of framework allows developers to focus on their proprietary application with the confidence that the final application will be fully interoperable.

And finally, a B2BUA framework should provide the ability to configure, amend and replace application building blocks to create proprietary features. With this ability, developers can maximize existing code – significantly reducing development time, shortening testing cycles, and reducing overall time-to-market.

Figure 4 - Traditional architecture of a B2BUA framework

RADVISION’s B2BUA Application Framework http://www.radvision.com/Products/Developer/SIPServer delivers these capabilities and more. The B2BUA Application Framework module is a part of the RADVISION SIP server Platform, a software framework that offers the essential building blocks for the development of a wide variety of high performance SIP and IMS servers. The rich set of components and modules can be flexibly combined to match customers’ requirements for developing SIP servers that offer both standard and advanced SIP services.

Applications written on top of RADVISION’s B2BUA framework are developed by combining customizable modular application building blocks. These are effectively large chunks of functionality that can be strung together to form ad-hoc applications, enabling developers to focus on the high-level business logic while the building blocks hide the low-level details.

As one of the most popular IM applications, Yahoo! Messenger was the first large consumer player to adopt the B2BUA. Yahoo! Messenger combined its scalable backend platform with RADVISION’s B2BUA to serve millions of monthly unique messaging users around the world. Yahoo selected RADVISION’s B2BUA for its robust performance and scalability features.

Figure 5 - The architecture of RADVISION B2BUA Application Framework


RADVISION also offers automatic High Availability (HA) and Redundancy support. The B2BUA framework automatically replicates the run-time state of the different Services and B2BUA framework core. In the event of a server outage, a redundant server takes over seamlessly and provides uninterrupted service continuity.

B2BUA framework benefits in a nutshell

• Significantly reduces time to market developing proprietary B2B applications and services.
• Allows adding advanced services easily to retain competitive advantage and evolve to meet growing customer demands.
• Focuses on the business logic and hides low level operator communication intricacies.
• Delivers off-the-shelf conformance and interoperability.
• Enables rapid development of applications that can interoperate with different vendors.
• Enables adding high-availability features easily.

Click here http://www.radvision.com/Resources/WhitePapers/b2bua.htm for more extensive information on B2BUA Servers.

By Danny Loeb, RADVISION http://www.radvision.com Product Manager

Thursday, July 3, 2008

The Battle of the Browsers – The History and the Future of Internet Browsers

by: Nicholas C Smith

With Internet Explorer 8 now available, can Microsoft hope to retain market dominance over fierce open source rivals such as Mozilla's Firefox or the feature-packed Opera web browser? Can history give us a clue to what the future of web browsers/browsing might hold? How did Netscape Navigator go from a dominant 89.36% share of the web browser market in 1996 to only 3.76% by mid-1999?

Let us take a journey that will begin long before even the intellectual conception of Internet Explorer, that will glance at its long defeated rivals, examine the current browsers available and will end with a prediction of what the future of browsing will offer us – and which browser(s) will still be around to offer it.

People often think that Internet Explorer has been the dominant web browser since the golden age of the internet began. For a very long time now it has indeed been the most popular browser, and at times it has been almost totally unrivalled. This was mainly a result of it being packaged free with Microsoft Windows, in what some would later call a brutal monopolisation attempt by Microsoft. The last few years, however, have heralded the arrival of new, possibly superior browsers. Mozilla's Firefox has been particularly successful at chipping away at Explorer's market dominance. So where did it all begin, and why was Microsoft ever allowed to have a hundred percent market dominance?

Origins

The truth is they never did have total dominance, but at times they have come very close. Microsoft actually entered the Browser Battle quite late on. In fact, a man named Neil Larson is credited as one of the originators of internet browsers: in 1977 he created a program for the TRS-80 that allowed browsing between “sites” via hypertext jumps. This was a DOS program and the basis of much to come. Slowly, other DOS-based browsers inspired by it were developed, though they were often constrained by the limitations of the still fairly young internet itself.

In 1988, Peter Scott and Earle Fogel created a simple, fast browser called Hytelnet, which by 1990 offered users instant logon and access to the online catalogues of over five thousand libraries around the world – an exhilarating taste of what the internet, and web browsers, would soon be able to offer.

In 1989 the original World Wide Web was born. Using a NeXTcube computer, Tim Berners-Lee created a web browser that would change how people used the internet forever. He called his browser the WorldWideWeb, a name still likely to sound familiar to internet users today. It was a windowed browser capable of displaying simple style sheets, editing sites, and downloading and opening any file type supported by the NeXTcube.

In 1993 the first popular graphical browser was released. Its name was Mosaic and it was created by Marc Andreessen and Eric Bina. Mosaic could be run on Unix and, very importantly, on the highly popular Microsoft Windows operating system (incidentally, it could also be used on Amiga and Apple computers). It was the first browser on Windows that could display graphics/pictures on a page where there was also textual content. It is often cited as being responsible for triggering the internet boom by making the internet bearable for the masses. (It should be noted that the web browser Cello was the first browser used on Windows – but it was non-graphical and made very little impact compared to Mosaic.)

The Browser Wars - Netscape Navigator versus Internet Explorer

Mosaic's decline began almost as soon as Netscape Navigator was released (1994). Netscape Navigator was a browser created by Marc Andreessen, one of the men behind Mosaic and co-founder of Netscape Communications Corporation. Netscape was unrivalled in terms of features and usability at the time. For example, one major change from previous browsers was that it allowed surfers to see parts of a website before the whole site had downloaded. This meant that people did not have to wait for minutes simply to see whether the site they were loading was the one they were after, whilst also allowing them to read information on the site as the rest of it downloaded. By 1996 Netscape had almost 90% market dominance, as shown below.

Market Share Comparisons of Netscape Navigator and Internet Explorer from 1996 to 1998

....................Netscape.......IE
October 1998..........64%.........32.2%
April 1998............70%.........22.7%
October 1997..........59.67%......15.13%
April 1997............81.13%......12.13%
October 1996..........80.45%......12.18%
April 1996............89.36%.......3.76%

In these two years Netscape clearly dominated the internet browser market, but a new browser named Internet Explorer was quickly gaining ground on it.

Microsoft, clearly worried about Netscape's dominance, released their own browser (ironically based on the earlier Mosaic browser, which was created by one of the men now running Netscape). It was not so much the worry that Netscape would have a 100% share of internet browsers on the Windows operating system, but more the worry that browsers would soon be capable of running all types of programs. That would mean foregoing the need for a full operating system, or at most only a very basic one would be needed. This in turn would mean Netscape would soon be able to dictate terms to Microsoft, and Microsoft were not going to let that happen easily. Thus in August 1995, Internet Explorer was released.

By 1999 Internet Explorer had captured an 89.03% market share, whilst Netscape was down to 10.47%. How could Internet Explorer make up this much ground in just two years? It came down to two things. The first, and by far the most important, was that Microsoft bundled Internet Explorer with every new copy of Windows, and as Windows was used by about 90% of the computer-using population, this clearly gave them a huge advantage. Internet Explorer held one other ace over Netscape – it was much better. Netscape Navigator was stagnant and had been for some time. The only new features it ever seemed to introduce were often perceived by the public as benefiting Netscape's parent company rather than Netscape's user base (i.e., features that would help it monopolise the market). Explorer, on the other hand, was given much attention by Microsoft. Regular updates and excellent usability, plus a hundred-million-dollar investment, would prove too much for Netscape Navigator.

2000 – 2005

These years were fairly quiet in the Battle of the Browsers. It seemed as if Internet Explorer had won the war and that nobody could even hope to compete with it. In 2002/2003 it had attained about 95% of the market share – about the time of IE 5/6. With over 1000 people working on it and millions of dollars being poured in, few had the resources to compete. Then again, who wanted to compete? It was clearly a volatile market, and besides, everybody was content with Internet Explorer. Or were they? Some people saw faults with IE – security issues, incompatibility issues or simply bad programming. Not only that, it was being shoved down people's throats. There was almost no competition to keep it in line or to turn to as an alternative. Something had to change. The only people with the ability and the power to compete with Microsoft took matters into their own hands.

Netscape was now supported by AOL. A few years prior, just after losing the Browser Wars to Microsoft, they had released the Netscape source code to the public. This meant anybody could develop their own browser using the Netscape skeleton. And people did. Epiphany, Galeon and Camino, amongst others, were born out of Netscape's ashes. However, the two most popular newcomers were called Mozilla and Firefox.

Mozilla was originally an open source project aimed at improving the Netscape browser. Eventually it was released as Netscape Navigator 7 and then 8. Later it was released as Mozilla 1.0.

Mozilla was almost an early version of another open source browser, Firefox. Being open source, the public were able to contribute to it, adding the features it needed, the programming it required and the support it deserved. The problems people saw in Internet Explorer were being fixed by members of the open source browser community via Firefox. For instance, the many security issues IE 6 had were almost entirely fixed in the very first release of Firefox. Microsoft had another fight on their hands.

2005 – Present

Firefox was the browser that grew and grew in these years, capturing a larger market share percentage every year. Being more user-friendly than most of its rivals, along with high security levels and arguably more intelligent programming, helped its popularity. With such a large programming community behind it, updates have always been regular and add-on programs/features are often released. It prides itself on being the people's browser. It currently has a 28.38% market share.

Apple computers have had their own browser since the mid-1990s – Safari – complete with its own problems, such as (until recently) the inability to run Java scripts. However, most Apple users seemed happy with it, and a version capable of running on Windows has been released. It has had no major competitor on Apple Macs, and as such has largely stayed out of the Browser Wars. It currently holds a 2.54% market share and is slowly increasing.

Internet Explorer's market share has dropped from over 90% to around 75%, and is falling. It will be interesting to see what Microsoft will attempt in order to regain such a high market share.

Opera currently holds 1.07%.

Mozilla itself only has a 0.6% market share these days.

The Future of Web Browsing

Web browsers come and go. It is the nature of technology (if such a term can be used) to supplant inferior software in very short periods of time. It is almost impossible for a single company to stay ahead of the competition for long. Microsoft have the advantage of being able to ship IE with every Windows PC, which covers over 90% of the market. They also have the advantage of unprecedented resources. They can compete how they wish for as long as they wish. So there is no counting IE out of the future of web browsing.

Safari is in a similar position, being easily the most popular Mac web browser. Its long-term survival is dependent upon Apple and the sale of their computers.

These are the only two browsers that are almost guaranteed another five years of life, at least. Firefox may seem like another candidate, but the public is fickle: one bad release, or seriously lagging behind the new Internet Explorer 8 for long, could easily see its popularity descend into virtual oblivion.

However, it seems likely community driven browsers, such as Mozilla and Firefox, will be the only types of browser capable of competing with the wealthy internet arm of Microsoft in the near future.

As for web browsing itself, will it change any time soon? It already has for some online communities. For example, if you want to buy clothes you could enter an online 'world', creating a virtual You to go from 'shop to shop' with, looking at products and trying/buying what you see. Some 'worlds' allow you to recreate yourself accurately, including weight and height, and then try on apparel such as jeans to give you an idea of how you would look in that particular item.

Will 'worlds' like this destroy normal web browsers such as IE? It seems unlikely. Traditional web browsers provide such freedom and ease of access that it is hard to see any alternative taking over. However, they are part of the new, 'thinking out of the box' wave of alternatives that some people will find attractive – and really, who knows what the future will bring.

Wednesday, July 2, 2008

Can Data Breaches Be Expected From Bankrupt Mortgage Lenders?

by: Tim Maliyil

The stock market is in a tumult. Actually, it has been for about a year, ever since the subprime fiasco (anyone taken a look at Moody's performance over the past year?). Now that that particular issue has been beaten to death, other mortgage-related issues are cropping up. Most of the coverage in the media is financial in nature, but some of those mortgage-related issues do concern information security.

It's no secret that there are plenty of companies in the US that discard sensitive documents by dumping them unceremoniously: leaving them by the curb, driving them to a dumpster, heaving them over the walls of abandoned property, and other assorted mind-bogglingly insecure practices. In fact, MSNBC has an article on this issue, and it names numerous bankrupt mortgage companies whose borrowers' records were found in dumpsters and recycling centers. The information on those documents includes credit card numbers and SSNs, as well as addresses, names, and other information needed to secure a mortgage.

Since the companies have filed for bankruptcy and are no more, the potential victims involved have no legal recourse and are left to fend for themselves. In a way, it makes sense that companies that have filed for bankruptcy are behaving this way. (Not that I'm saying this is proper procedure.) For starters, if a company does wrong, one goes after the company; but once the company has filed for bankruptcy, it is no more, so there's no one to "go after." Given the company's status, the actual person remaining behind to dispose of things, be they desks or credit applications, can opt to do whatever he feels like. He could shred the applications. He could dump them nearby. He could walk away and let the building's owner take care of them. What does he care? It's not as if he's gonna get fired.

Also, proper disposal requires either time, money, or both. A bankrupt company doesn't have money. It may have time, assuming people are going to stick around, but chances are their shredder has been seized by creditors. People are not going to stick around to shred things by hand, literally.

Aren't there any laws regulating this? Apparently, such issues are covered by FACTA, the Fair and Accurate Credit Transactions Act, and although its guidelines require that "businesses to dispose of sensitive financial documents in a way that protects against 'unauthorized access to or use of the information'" [msnbc.com], it stops short of requiring the physical destruction of data. I'm not a lawyer, but perhaps there's enough leeway in the language for one to go around dropping sensitive documents in dumpsters?

Like I mentioned before, inappropriate disposal of sensitive documents has been going on forever; I'm pretty sure this has been a problem since the very first mortgage was issued. My personal belief is that most companies would act responsibly and try to properly dispose of such information. But, this may prove to be a point of concern as well because of widespread misconceptions of what it means to protect data against unauthorized access.

What happens if a company that files for bankruptcy decides to sell its computers to pay off creditors? Most people would delete the information found on the computer, and that's that, end of story. Except it's not. When files are deleted, the actual data still resides on the hard disk; the operating system simply no longer has a way to find it. Indeed, this is how retail data restoration applications such as Norton are able to recover accidentally deleted files.

Some may be aware of this and decide to format the entire drive before sending the computer off to its new owners. The problem with this approach is the same as with deleting files: data recovery is a cinch with the right software. Some of it retails for $30 or less; some of it is free. So the sensitive data that's supposed to be deleted can be recovered, if not easily, at least cheaply, perhaps by people with criminal interests.
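The gap between unlinking a file and actually destroying its contents can be sketched in a few lines of Python. This is a hedged illustration rather than a substitute for a proper disk-wiping tool: the file name and pass count are made up for the example, and on SSDs or journaling filesystems an in-place overwrite is not guaranteed to reach every physical copy of the data.

```python
import os
import secrets

def secure_delete(path, passes=3):
    """Overwrite a file's bytes with random data before unlinking it.

    A plain os.remove() only drops the directory entry; the file's
    contents stay on disk until the blocks happen to be reused, which
    is exactly what retail undelete tools exploit. Overwriting first
    makes that kind of recovery impractical.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite out of the OS cache
    os.remove(path)

# Example: create a throwaway "sensitive" file, then wipe it.
with open("loan_application.txt", "w") as f:
    f.write("SSN: 000-00-0000")
secure_delete("loan_application.txt")
print(os.path.exists("loan_application.txt"))  # False
```

The point of the sketch is the ordering: overwrite, sync, and only then unlink, so that no window remains in which the original bytes are both intact and unreferenced.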

Am I being paranoid? I don't think so. I've been tracking fraud for years now, and I can't help but conclude that the criminal underworld has plenty of people looking to be niche operators, not to mention that there are innumerable ways of defrauding people (look up "salad oil" and "American Express" for an example). An identity theft ring looking to collect sensitive information from bankrupt mortgage dealers wouldn't surprise me, especially in an environment where such companies are dropping left and right.

The economics make sense as well. A used computer will retail anywhere from $100 to $500. The information on it, if not wiped correctly, can be worth many times that, even after factoring in the purchase of data recovery software. Criminals have different ways of capitalizing on personal data, ranging from selling the information outright to engaging in schemes with better returns.

Is there a better way to protect oneself? Whole disk encryption is a way to ensure that such problems do not occur: one can simply reformat the encrypted drive to install a new OS, and because the original data remains encrypted, there is no way to extract it without the key. An added benefit is that the data is protected in the event that a computer gets lost or stolen. However, common sense dictates that encryption is something ongoing concerns sign up for, not businesses about to go bankrupt. My guess is that sooner or later we'll see data breaches traced back to equipment from bankrupt mortgage dealers.
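The principle behind whole disk encryption can be shown with a toy cipher. The sketch below is a deliberately simplified illustration (real disk encryption uses vetted constructions such as AES-XTS, not a hand-rolled SHA-256 keystream, and the "sector" contents here are invented), but it shows why a drive full of ciphertext is worthless to a buyer who lacks the key.

```python
import hashlib
import secrets

def keystream(key, length):
    """Toy counter-mode keystream built from SHA-256.

    Illustration only: each 32-byte chunk is the hash of the key plus
    a running counter, concatenated until we have enough bytes.
    """
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key, data):
    """XOR data against the keystream; running it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = secrets.token_bytes(32)          # stays with the owner, not the buyer
sector = b"SSN: 000-00-0000, addr: 1 Main St"
ciphertext = xor_encrypt(key, sector)  # what actually lands on the platter

# Without the key, the recovered bytes are indistinguishable from noise...
assert ciphertext != sector
# ...but the rightful owner can still read them (XOR is its own inverse).
assert xor_encrypt(key, ciphertext) == sector
```

This is why reformatting an encrypted drive is enough before resale: the recovery tools described above would only dig up ciphertext, and the key never left the original owner.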
