Thursday 30 September 2010

Compiler Optimizations

The task of every compiler is to translate high-level language source code into machine code that will run on the target processor. Whether this happens through an intermediate assembly language file or by linking object files, the ultimate goal of every compiler is to generate code for the target processor. In principle this is a simple task: every high-level statement can be translated into a series of target instructions. However, without some optimizations this code would be very inefficient. Unoptimized code still works, but it is slower and the files are bigger.


The nature of high-level statements is to operate on variables. Loading and storing of variables can happen in any order, but transfers from and to memory are slow compared to transfers between registers. And if some value is stored to a memory location and is needed again immediately afterwards in a different calculation, then it makes no sense to load it again, since it is already present in some register. With careful register allocation we can eliminate many redundant loads and stores. There are many optimization algorithms to make the code as efficient as possible. In fact, compiler optimization is a science in its own right, and many books have been written on the subject.

Most optimization algorithms are based on control-flow and data-flow analysis. There are many optimization approaches: reducing jumps, removing dead, redundant or common code, removing redundant loads and stores, using registers efficiently, replacing slow instructions with faster ones, replacing specific arithmetic calculations with shorter instructions, etc. Each optimization is based on some property of the code or data. Some of the common optimizations include: constant folding, integer arithmetic optimizations, dead code elimination, branch elimination, code-block reordering, loop-invariant code motion, loop inversion, induction variable elimination, instruction selection, instruction combining, register allocation, common subexpression elimination, peephole optimizations, etc. Two of them are illustrated below.
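As a small illustration, here is a hypothetical C fragment (not taken from any particular compiler) with comments showing what constant folding and common subexpression elimination typically do. The compiler performs these transformations on its intermediate representation, not on the source:

```c
#include <stdio.h>

/* A hypothetical example of what an optimizing compiler can do. */
int area(int w, int h)
{
    int border = 2 * 3;              /* constant folding: becomes 6 at compile time */
    int inner  = (w - border) * (h - border);
    int outer  = (w - border) * h;   /* (w - border) is a common subexpression:
                                        it is computed once and reused */
    return inner + outer;
}

int main(void)
{
    printf("%d\n", area(10, 8));     /* with inlining and constant propagation
                                        the whole call may be folded to 40 */
    return 0;
}
```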

The basic rule of every optimization is that the original functionality of the program must not be changed. All optimization algorithms are based on the assumption that the program under analysis is the only one that changes memory locations. In reality, interrupts or hardware ports can break this rule. Therefore, proper actions must be taken to prevent optimizations on memory locations which might be modified outside the code we are trying to optimize. Another programming technique that complicates optimization is the use of pointers. But with proper analysis it is possible to apply safe optimizations to most parts of the code.
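In C, for example, the standard way to mark such a location is the volatile qualifier. A minimal sketch (the register address below is made up for illustration):

```c
#include <stdint.h>

/* Hypothetical memory-mapped status register; the address is made up.
   'volatile' tells the compiler that every read must really access
   memory, so the loop below is not optimized away. */
#define STATUS_REG (*(volatile uint8_t *)0x4000)

void wait_until_ready(void)
{
    /* Without 'volatile' the compiler could load STATUS_REG once,
       treat it as loop-invariant, and turn this into an endless loop. */
    while ((STATUS_REG & 0x01) == 0)
        ;
}
```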

You can optimize the code for speed or for size, so that the program runs faster or occupies less memory. The latter is extremely important in embedded programming, where the memory of microcontrollers is limited. Compiler optimizations are an important part of every compiler. Unoptimized code is a waste of memory and time on any target system.

A practical example of compiler optimizations in action is the Pascal compiler for 8051 microcontrollers, which uses Turbo Pascal syntax and generates optimized code. If you are interested in practical compiler construction, you can examine the Turbo Pascal internals. This is a Turbo Pascal compiler written in Turbo Pascal, and it can serve as an excellent book on compiler design.

Monday 20 September 2010

Home Web Hosting - Access Your Home Server From the Internet

If you are a webmaster developing or maintaining websites, then you probably have a home web server for development or testing purposes. An old computer can easily be transformed into a Linux server. If you also install Apache, MySQL and PHP, you will be able to host websites at home. Of course, this is not the same hosting as offered by commercial providers; it is only a place where you can install, develop and test web applications. With some additional settings you can access your home computers from any place that has internet access. This is very convenient for getting files you forgot at home.

Assuming that you already have a local area network (LAN) including a router, which is the bridge between your computers and the internet, you need no additional equipment. In order to enable web access from the internet, you have to configure your router to forward outside web requests to your server. The default port for the web is 80, but you can choose any other port number to increase security.


The first prerequisite for remote access is a domain. This is the name of your home server on the internet. There are many services that offer free domains and free updating of IP addresses. This is especially important if you don't have a static IP address. Most internet providers assign IP addresses dynamically, and they change once a day or with every connection. Since knowing your external IP address is crucial to accessing your computers from the internet, you need a way to keep the domain's DNS records updated.

Many routers support popular dynamic DNS services like DynDns. If your router has no such function, you can still install a simple software utility on your computer which will update the address every time it changes. You should create an account, select a preferred domain name and choose a name to use as a sub-domain. Enter this account data into the router settings or into the external IP address update software.
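The update itself is usually just an authenticated HTTP request carrying the host name and the new address. A minimal sketch in C using POSIX sockets; the host, path and token below are placeholders, as every service defines its own update URL and authentication:

```c
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>

/* Hypothetical dynamic DNS update: one HTTP GET with the new address.
   Host, path and token are placeholders; real services define their own. */
int main(void)
{
    struct addrinfo hints = {0}, *res;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo("update.example-ddns.net", "80", &hints, &res) != 0)
        return 1;

    int s = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (s < 0 || connect(s, res->ai_addr, res->ai_addrlen) < 0)
        return 1;

    const char *req =
        "GET /update?hostname=myname.example-ddns.net&token=SECRET HTTP/1.0\r\n"
        "Host: update.example-ddns.net\r\n\r\n";
    write(s, req, strlen(req));

    char buf[512];
    ssize_t n = read(s, buf, sizeof(buf) - 1);  /* server reports the result */
    if (n > 0) { buf[n] = '\0'; printf("%s\n", buf); }

    close(s);
    freeaddrinfo(res);
    return 0;
}
```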

Now you have access to your router. Each time you enter your chosen name followed by the domain, you will reach your router. If you have enabled access to its user interface from the external port, you will be able to configure it just as you do from your home computer. To reach your server you need to configure port redirection. If you would like to use the standard port 80 for web access, simply create a rule to forward the external (public) port 80 to local port 80 at the IP address of your web server. This is everything that is needed to access your web server from anywhere.

For hosting real websites, especially if they are commercial in nature, you need reliable hosting like Hostgator, simply because the website needs to be available 24 hours a day and downtime in the case of hardware failure should be as short as possible. Commercial hosting companies usually provide professional service for a few dollars per month. This includes an unlimited number of domains, unlimited disk space and many other goodies.


Being able to access your home computers from any location is very useful when you need some files or data from home. Even if you have a dynamic IP address, you can access your development websites and show them to your friends or clients.

Saturday 18 September 2010

What to Look For in an LCD Computer Monitor?

In general, it is difficult to say which computer component is the most important. Probably there is no such component; every part is needed to perform some task, otherwise we wouldn't need it. But there is one difference: with some computer components we have direct contact. One such component is the monitor. The computer monitor communicates with us in a very special way. It displays all the information about the status of the computer as well as the windows of currently active applications. Therefore, it is essential that this component is carefully selected and that it displays a stable and clear picture.

Currently LCD is the predominant technology for computer monitors. There is no big difference between computer LCD monitors and LCD TV sets. The only major difference is the additional interface electronics that makes an LCD TV set look like a television and an LCD monitor look like a computer monitor.


Usually, when buying computers we look at the price tag. This is normal, since price ranges can vary significantly. But the price should not be the only parameter upon which we make a decision. When buying a computer monitor we should consider the following:

Main Purpose of Our Computer
We can use the computer as an office tool, a gaming machine, a designer's drawing board, or a combination of these and other purposes. Each purpose puts the emphasis on a different parameter. For example, designers working in desktop publishing need large desktops and realistic color reproduction, gamers need monitors with fast response times, etc. The first step in choosing a monitor is to define the main purpose of the computer.

Size
The bigger the better. This simple rule is valid for all computer purposes. With a larger working area you will easily work with many applications at the same time. Application windows will not be squeezed into small cluttered rectangles, the taskbar will look like a nice informative bar and not like a strip of many small buttons, and on the desktop you will be able to put more icons for frequently used programs. Because all LCD monitors are flat, there will be no problem with space on the table. So the only disadvantage of large monitors is perhaps the higher price.

Resolution
Again, the bigger the better. The display resolution tells us how many pixels or points the monitor can display. More pixels mean either a larger monitor or smaller image dots. A typical resolution for a multipurpose computer is 1920x1200 pixels. It allows comfortable work with office programs and web browsing, and also a good experience with CAD programs.

Interfaces
Almost all computer LCD monitors have a DVI input. This is the standard for computer graphics cards. If we intend to connect multimedia devices, we also need an HDMI input. Many monitors also have integrated USB hubs, which are very handy for connecting a keyboard, mouse and external disks. Some computer monitors also have a TV tuner which converts them into a real TV set.


Choosing a computer monitor is not an easy task even if you know what to look for. For more tips and hints you can visit the http://computerlcdmonitors.org/ website, which provides a few basic facts about computer LCD monitors.

Perhaps the most important fact to remember is that you will be staring at this flat panel for hours. Make sure it looks pretty.

Thursday 16 September 2010

Universal JTAG Cable

Every electronic device around has some microcontroller or other programmable device in it. Even a kitchen appliance or a toy has some kind of electronics which controls important functions. In some cases this electronics has predefined functionality which cannot be changed. But in most cases it contains a processor that uses flash memory to store code. And if a device is programmable, then you can change the program it runs at any time.


Of course, in many cases this programming is not needed, since the electronics performs what it is supposed to. But in some cases it makes sense to allow end users to update the device with the latest firmware. Therefore, it makes sense to have a universal interface that can be used in the factory and at home to program electronic devices with simple additional hardware. This universal interface is JTAG. It is a standard that defines a serial interface to transfer data between a device and a computer (or between devices). The JTAG interface needs only four basic signals (TCK, TMS, TDI and TDO). This makes it easy to use the PC parallel port as a JTAG interface. While this works, many modern JTAG programmers use the USB port to connect to the PC.
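To give a feel for how simple parallel-port bit-banging is, here is a hedged C sketch for Linux on x86 (the pin masks are hypothetical and depend on the actual cable wiring; outb() requires ioperm() and root privileges):

```c
#include <stdint.h>
#include <stdio.h>
#include <sys/io.h>   /* outb and ioperm, Linux x86 */

/* Hypothetical wiring of the JTAG signals to parallel port data bits;
   a real cable defines its own pin mapping. */
#define LPT_BASE 0x378
#define TCK 0x01
#define TMS 0x02
#define TDI 0x04

/* Clock one bit into the target: set TMS/TDI, then pulse TCK. */
static void jtag_clock(int tms, int tdi)
{
    uint8_t out = (tms ? TMS : 0) | (tdi ? TDI : 0);
    outb(out, LPT_BASE);          /* signals stable, TCK low */
    outb(out | TCK, LPT_BASE);    /* rising edge: target samples TMS/TDI */
    outb(out, LPT_BASE);          /* TCK low again */
}

int main(void)
{
    if (ioperm(LPT_BASE, 3, 1)) { perror("ioperm"); return 1; }
    /* Five clocks with TMS high reset the TAP state machine from
       any state -- a standard JTAG idiom. */
    for (int i = 0; i < 5; i++)
        jtag_clock(1, 0);
    return 0;
}
```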

Another beauty of the JTAG interface is the possibility to connect many devices in a chain. This way you can program all the JTAG-enabled devices on the board with only one interface. In any case you need a JTAG cable to connect to the device. Each device may have its own JTAG pinout, but this is not a problem since JTAG cables can be made (almost) universal--at least for a few device types. Unfortunately, a JTAG cable is not enough. You also need suitable software that will program the target device.


JTAG is not used only for commercial electronics. It is also a standard interface for all embedded systems including FPGAs, microcontrollers and various memories. Each manufacturer has some kind of JTAG cable to program their devices. Unfortunately, these cables use dedicated software and cannot be used for other purposes.

One of the very popular uses of the JTAG interface in consumer electronics is to reprogram some router models with new firmware. There are many cheap JTAG cables that can be used to bring new life to an old router. Changing the original firmware is very popular not just for routers but also for other highly popular electronic devices. The Xbox is one such example, where a simple JTAG hack can change some built-in functions.


There are many JTAG cable suppliers around. Before you decide on a specific cable, you need to check the JTAG pinout and software support. Usually, cheap cables are made only for specific devices and use the parallel port signals to program the device. More advanced JTAG cables use a USB interface or at least a buffer to boost the signals from the parallel port.

Tuesday 14 September 2010

Digital Television and Computers - Watch TV on Your Computer Monitor

Digital television broadcasting uses satellite and terrestrial platforms to deliver content to viewers. Digital television is also available via cable and IPTV networks. Most people use TV sets to watch their favorite channels, but there is an alternative way of watching television. If you have a computer, you can receive and decode all available channels, including premium services if you have a suitable subscription card.

There are at least two methods to upgrade your computer for television reception. You can either use an internal radio/TV card or a small USB stick. Both hardware solutions work well and are not expensive. A USB stick is more suitable for use with laptops, but you can also use it with a desktop computer as a permanent solution. There are many different models that support different technologies. For terrestrial broadcasting in Europe and many other countries your device should support DVB-T. DVB-S is needed for satellite television and DVB-C for cable reception. Many cards and sticks also support analog FM radio and analog TV. Before you purchase such a card or stick, you should check which standards are used in your country.

Many internal cards have on-board audio/video decoding support. This hardware decoder does work which is otherwise very time consuming and processor intensive. USB sticks have no hardware decoder; there is only a demodulator that supplies the DVB transport stream. While the main computer processor has to decode the sound and picture, this is also an advantage: it is much easier and cheaper to update decoding software than to replace decoding hardware.
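The transport stream itself has a simple fixed structure: 188-byte packets, each starting with the sync byte 0x47 and carrying a 13-bit packet identifier (PID). A minimal C sketch that pulls PIDs out of a captured stream (the file name is a placeholder; in practice the stick's driver delivers the packets over USB):

```c
#include <stdio.h>
#include <stdint.h>

#define TS_PACKET 188   /* DVB transport stream packet size in bytes */

int main(void)
{
    FILE *f = fopen("capture.ts", "rb");   /* placeholder file name */
    if (!f) { perror("capture.ts"); return 1; }

    uint8_t p[TS_PACKET];
    while (fread(p, 1, TS_PACKET, f) == TS_PACKET) {
        if (p[0] != 0x47)       /* every packet starts with sync byte 0x47 */
            break;              /* lost sync; a real demux would resync */
        int pid = ((p[1] & 0x1F) << 8) | p[2];  /* 13-bit packet identifier */
        printf("PID 0x%04X\n", pid);
    }
    fclose(f);
    return 0;
}
```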

Most TV cards and sticks come with a remote control. It is very similar to a real TV remote control and enables you to turn your computer monitor into an advanced TV set. Every television decoding software allows you to display the TV picture over the entire screen, so you can enjoy it as if in front of a standard TV set. The only disadvantage of this way of watching TV is that while you watch full-screen TV you cannot use the computer for other work.


Receiving television channels with a computer also has one big advantage: you can easily record any channel. You can even record one channel and watch a different one. This is all possible because the software is able to decode many different services simultaneously. Of course, this also depends on the speed of your processor, but in general recording a particular channel or the entire stream is not a problem.

Digital television is much closer to computers than it was in the old analog days. Digital broadcasting uses codecs which have their roots in computing. Watching television on the computer is nothing new. In fact, digital broadcasting is only a network of many dedicated computers and peripheral devices.

Sunday 12 September 2010

Computer Forensics - How Volatile Data is Analyzed

Computer forensics plays an important role in fighting terrorism and criminal activity. The fact is that bad guys use computers, the internet and other modern communication tools to communicate and to store their plans. We would be naive to think that they can barely open Word or Excel. They are aware of all the risks, and they protect themselves with modern encryption algorithms and general protective measures. Fighting criminal activities is very different from discovering occasional violations on company computers.


Many traces can be hidden if the software used for criminal or otherwise unwanted activity is not present on the computer's disk and runs only in the memory of the computer. It is very easy to start some process and then successfully cover all traces that were left behind. In such a case analyzing disk data makes no sense, because nothing suspicious can be discovered. The only solution to this problem are tools that can preserve volatile data like live memory.

The static analysis of computer data (i.e. the analysis of a hard disk removed from the computer) is usually not enough, because many advanced techniques can be used to erase all traces from file systems, so the only relevant data remains in memory. Theoretically, it would be possible to freeze computer memory with liquid nitrogen, and this would significantly increase the chances of recovering the data, but this approach is not practical. Analysis of live volatile data in a computer is essential for any serious forensic analysis.


There are many open-source and professional commercial forensic tools that can make a snapshot of crucial volatile data for later analysis. Such tools can discover open ports, virtual disk drives, VPN connections and other resources not visible to the normal user. In some cases the whole disk drive or an individual partition may be encrypted, so it is important to make an image of it before the system is shut down. Once all the data is safely stored, it can be analyzed regardless of the state of the computer.

A logical question would be, for example: what can be done to successfully hide some process running in the computer's memory? Theoretically, it would be possible to eliminate its traces from memory when the process is not active or while it waits for some input. But even for such approaches there are solutions. It is possible to create memory snapshots at periodic intervals, and sooner or later the secret process will show itself.
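A very crude illustration of the periodic-snapshot idea on Linux (a sketch only; real forensic tools snapshot full physical memory, not merely the process list) is to poll /proc and report process IDs that appear between snapshots:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <ctype.h>
#include <unistd.h>
#include <dirent.h>

#define MAX_PIDS 65536

/* One snapshot: mark every numeric /proc entry (a running process). */
static void snapshot(unsigned char *seen)
{
    memset(seen, 0, MAX_PIDS);
    DIR *d = opendir("/proc");
    struct dirent *e;
    while (d && (e = readdir(d)) != NULL)
        if (isdigit((unsigned char)e->d_name[0])) {
            int pid = atoi(e->d_name);
            if (pid > 0 && pid < MAX_PIDS)
                seen[pid] = 1;
        }
    if (d) closedir(d);
}

int main(void)
{
    static unsigned char prev[MAX_PIDS], cur[MAX_PIDS];
    snapshot(prev);
    for (;;) {                       /* poll once per second */
        sleep(1);
        snapshot(cur);
        for (int pid = 1; pid < MAX_PIDS; pid++)
            if (cur[pid] && !prev[pid])
                printf("new process: pid %d\n", pid);
        memcpy(prev, cur, MAX_PIDS);
    }
}
```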

For many computer users the most requested computer forensic service is recovering lost files. Check the http://digitaldatarecovery.org/ website for some practical tips on digital data recovery and practical advice on recovering important files from broken hard disks or flash memories.


Computer forensics is becoming an increasingly important part of the efforts to detect and prevent terrorist activities. But the game will never end. More advanced hiding techniques will lead to more advanced discovery techniques, which will lead to even more advanced hiding techniques, and so on.

Friday 10 September 2010

Popular Microcontrollers

Embedded systems are not just complex projects in electronic laboratories--they are present in everyday devices. Every mobile device, electric toy or kitchen appliance has some electronic board which usually includes a programmable device--a microcontroller. This is a special microprocessor with peripheral devices and I/O ports. Depending on the volume of the device, the manufacturer can decide whether to develop an ASIC--a dedicated integrated circuit which performs all functions of the device--or to make a standard board with discrete components. In both cases some microcontroller is used, either as a soft core in the ASIC or as a standard integrated circuit.

Popular microcontrollers are popular mainly because of the availability of affordable development tools and the low prices of the devices. When hobby engineers start using one family, they get used to it, and it is very likely that they will use it later in a professional project. While PIC and AVR microcontrollers are heavily used in hobby projects, ARM has prevailed in the professional embedded world. However, 8051 microcontrollers are still used in many hobby and professional projects.

There is a plethora of choices, from open-source projects to various IP cores with significant royalties for each device. Despite this choice, there are a few microcontroller families that are popular because of their flexibility, powerful development tools or historical reasons.

ARM


This is currently the hottest RISC core, used in almost all mobile phones, portable devices and many other applications. It has a powerful instruction set and low power consumption, offers easy integration, and there are many good tools for easy development and debugging. The ARM core is also used in many popular microcontroller families from Atmel, Luminary Micro (now Texas Instruments), NXP and many other manufacturers. These microcontrollers are very popular among embedded engineers and are used in various applications from the automotive industry to hobby projects.

AVR


This is one of the most popular microcontroller families from Atmel. It is also very popular among hobby engineers and is used in many projects, from simple LED controllers to complex communication devices. The RISC architecture offers fast execution and low power consumption. Development tools are available for free, which is a great bonus for electronics enthusiasts. AVR is a direct competitor to Microchip's PIC. Some favor AVR, others like PIC. There is no clear winner; both families work well. It is up to the developer/programmer which one he likes or prefers.

PIC


This is the leading microcontroller family from Microchip. PICs are available in very small packages with only a few pins and also as powerful 32-bit microcontrollers with many peripheral modules and I/O pins. They are very popular among hobby engineers--in hobby projects you will find either AVR or PIC.

8051


This is a very old 8-bit microcontroller architecture that has managed to survive for more than 30 years. Many excellent compilers, a lot of code examples and simple development have contributed to the popularity of this family. This core is still used in many modern microcontrollers from Silabs, NXP, Atmel and many other microcontroller manufacturers. It is very likely that the 8051 is the most widely used core in embedded applications. Of course, many new designs will probably use ARM or some other advanced architecture, but because of the popularity of the 8051 family in the past and the availability of development tools (C, assembler and Pascal), it is still used in many applications.
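As a taste of how little is needed to get an 8051 running, here is a minimal LED blinker in C for the free SDCC compiler; the port pin and the crude delay constant are assumptions that depend on the actual board and crystal:

```c
/* Minimal 8051 LED blinker for the free SDCC compiler.
   P1_0 and the delay constant are assumptions: the LED pin and
   the loop count depend on the actual board and clock. */
#include <8051.h>

static void delay(void)
{
    volatile unsigned int i;
    for (i = 0; i < 30000; i++)
        ;               /* crude busy-wait; a timer would be better */
}

void main(void)
{
    while (1) {
        P1_0 = !P1_0;   /* toggle the LED on port 1, bit 0 */
        delay();
    }
}
```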

Internet Service - Wireless Vs Wired

It is amazing that roughly 20 years after the first commercial use of the internet we are now totally dependent on it. This dependence in many cases evolves into addiction. There are many popular services that we use on a daily basis. Facebook, Gmail and Twitter are only a few popular names that wouldn't exist without the internet.

The first home access to the internet was via modems and the telephone infrastructure. At that time the speeds were adequate for the websites and services that were available. The next step was the replacement of modems with xDSL technology, which significantly increased the range of available bandwidths. The latest approach is using fiber cables to bring the fastest internet speeds into individual homes. 1 Gbit/s is now a reality for anybody with the possibility to connect to an optical network.

Another technology that evolved in parallel with wired access was, of course, wireless access to the internet. The most common technology in use today is Wi-Fi, which allows speeds over 100 Mbit/s. All you need for wireless access is an access point (usually a wireless router) and a WiFi-enabled laptop or any computer with a wireless network card.

For the average user there is probably no big difference between these two ways of accessing the internet. However, there are some very important aspects that need to be taken into account when deciding which technology to use.

Wired access provides constant bandwidth between the individual user and the internet service provider. It is mainly the bandwidth of the ISP's backbone which determines the actual speed we will be able to achieve. Wired access also enables a permanent connection. This is important when you need a reliable connection which should also be available from the outside. A static IP address is usually used for such purposes, and if the connection is important it is also powered by a UPS.

The main difference with wireless internet access is that it uses a radio channel to transfer data. This channel has limited capacity, which is usually shared among many users. Even if there are many channels available, the total bandwidth we can get also depends on the number of users and their transfers. In general, this is an additional bottleneck in our access to the internet. Another significant difference is that the connection is not permanent. When it is established, there is no guarantee that it will stay as it is. The situation in the radio frequency spectrum can change, other users may start using the same access point, a slight change of antenna position may significantly decrease connection speed, etc.

One typical example of a device that can successfully use a wireless connection is a wireless credit card reader. It enables credit card processing away from the office or counter. Wireless credit card terminals significantly simplify doing business and allow buyers to keep their credit cards in their hands.

In many cases you can use either wired or wireless access. But there are also cases which demand a reliable wired connection, or wireless access at any cost.

Thursday 9 September 2010

Life With Internet

The internet was born long before we noticed it. Today we cannot imagine life without it. There are many people who believe that the internet is the world wide web (WWW) and vice versa. Of course, the web is only one of the services that are available over the internet. The fact is that our lives depend on internet services even if we don't have a computer. It is difficult to imagine life without the internet, as it was difficult to imagine life without electricity thirty years ago. The internet is a worldwide network, and only a network. Services are what help us send a message or get something done.


Internet Service Providers
In order to connect to the internet you need a contract with some internet service provider (ISP). Access to the internet can be made available over an analog phone line, ISDN, various xDSL technologies, fiber to the home (FTTH), wireless networks, the cable network, etc. The speeds vary, but it is not uncommon to have 100 Mbit/s or even 1 Gbit/s via FTTH. Regardless of the connection speed, internet access is a must.

Email
Sending electronic messages is one of the most popular internet services. In order to send and receive a message you need at least two mail servers, one for sending and one for receiving the message. Since an email address is a prerequisite for any business, not just online business, each company has at least one mail server. Fortunately, there are many free email service providers. One of the most popular and famous is Gmail, the free email service from Google. If you don't have a Gmail account, you are missing a lot.


World Wide Web
WWW is probably the most widely used internet service. This is probably the reason that www means the same as internet to many people. Having a homepage is a must, not just for companies but also for communities, clubs, organizations and individuals. Writing a blog is very popular these days. Because of this popularity there is huge competition among web hosting providers. The prices are low, setting up a website is pretty simple, and just about anybody can have his own website.

Spam
Spam is not an internet service, but it is a direct consequence of cheap or even free internet services. It is very easy to create an email and send it to any address, and sending the message to millions of email addresses worldwide is no different. Email spam is a big problem, and there are many companies offering anti-spam products or services. One excellent anti-spam filter is used by Gmail; there you will rarely receive a spam email. But there is also spam on the web. There are millions of pages that offer no useful content, and when you are searching for something it is very likely that you will land on such a page.

Of course, there are also other services on the internet. Sending IP packets is simple, and every modern appliance or piece of consumer electronics has an ethernet connector. We can expect a huge expansion of communication over the internet.

Tuesday 7 September 2010

Convert Your Old Computer to a Linux Server

Linux is a very popular platform, not just because it is free but also because it is reliable and supports anything you can imagine. A popular setup is a Linux server without any graphical user interface. It can be used for web hosting, as a file server, as a database server, or for anything else you need. Most people comfortable with the Windows operating system are afraid to start thinking in a different way. In fact, installing and using Linux is pretty simple.


Once you decide to go for it, you have already made the first step. The next step is to get some basic information about installing Linux. There are many Linux distributions. One that is very popular is Ubuntu. Simply Google for "ubuntu server" and learn what you need to install Linux. In general, things are pretty simple. You can install Linux on almost any machine. Your old computer that was replaced some time ago is a perfect choice for Linux. You only need some space on the hard drive, a CD or DVD drive, a network card and a lot of patience.

The first step is to make a bootable CD with the latest Ubuntu server image. Download the image file and burn it to a CD. Then boot your computer with this CD and start installing Linux. It is a good idea to do this installation next to your main computer with internet access. This way you will be able to search for any problem you may encounter. The most important thing you should know is that for every question you may have, there is an answer on some web page waiting for you. You only have to find it.

The installation process is pretty straightforward. If you don't understand what the installer is asking you, simply select the default option. Of course, you can also ask Google and then choose the appropriate option. You should understand that the Linux principle is very different from the Windows one. But once you become familiar with the Linux shell and basic commands, it will be very easy to work with Linux and to install and configure new software. In most cases the server will be located in some remote place and will not need a monitor. You will access it via the network.


Having a Linux server is a great upgrade to your home network. This server will be your reliable storage for large peer-to-peer files, a web server for website development, or simply a computer to play with. And remember, sooner or later you will encounter a problem. Something will not work, or you will not know how to change some setting. All you have to do is search for the answer on the web. Web pages offer a giant encyclopedia on Linux.

Sunday 5 September 2010

Advantages of Joomla Content Management System

Most bloggers, internet marketers and individuals use the web to publish content. They usually don't care about HTML, CSS, JavaScript or PHP. A web content management system (CMS) is a complete platform to create a website from scratch. It provides the framework needed to communicate with the database, to store and retrieve data, to dynamically create pages, to deal with authentication and to do the other tasks needed for any website. The main advantage of using a CMS is that you can focus on content. You install the system, select a template and a few plugins, and you are ready to go. One of the very popular web systems is Joomla.

Joomla is a universal free open-source content management system. Universality means that you can adapt it for almost any purpose. Bloggers usually choose Wordpress as their blogging platform. This is a natural choice, but you can also make a blog with Joomla. The basic installation provides the backbone of a working website. The page layout can be set by installing an appropriate template. You can choose among thousands of free templates or pay a small fee for a professional customizable design. Additional features and functionality can be added with various extensions. Depending on their function, extensions are available as components, modules or plugins. Extensions make Joomla extremely universal.


The beauty of extensions is that there is one for every imaginable function or feature. Whenever you need some functionality for the website, you can go to the Joomla extensions page and search for the right module. Some extensions are available free of charge; for some you will have to pay. And if for some reason you cannot find a suitable extension, you can modify an existing one or create a new one from scratch according to your needs. For this you will need to know a little HTML and PHP, but this is not a problem since both languages can easily be learned from existing code examples.

Another nice feature of Joomla is customization of the core system. With code overrides you can change the default layout of most page types. If this is not enough, you can modify any Joomla file to achieve the desired functionality. For this kind of customization you need to take into account that any upgrade may overwrite the modified files. But if you are a skilled web programmer, this will not be a problem. Since Joomla is an open-source platform, you can modify it any way you like. Most customizations can be done without knowing any of the web languages, but if you do know them you will have a very easy and pleasant task.


A simple example of the Joomla CMS in practice is the http://fullspectrumbulbs.org/ website. This is a small site about full spectrum bulbs which uses the basic Joomla system with a lightweight template. One of the benefits of such a simple layout is fast page loading. Of course, there are a few extensions installed, but they don't provide any crucial functionality for visitors.

Joomla is a fantastic content management system. With a few extensions you can make a website for any purpose. But if your goal is a blog, you will probably use Wordpress--not because it is better than Joomla, but because it is easier to create a blog with it.

Friday 3 September 2010

Details About Typical Television Resolutions

Television resolution is information about the number of picture elements (points or pixels) that make up the TV picture. The higher the number, the better and sharper the picture. In general there are two groups of television resolutions. Standard definition television (SDTV) as we have known it for the last 50 years has a few different standards that haven't changed since they were defined. The only "upgrade" was the addition of color, which did not modify anything related to the picture resolution. High definition television (HDTV) means a picture with more lines and picture elements. There are even higher resolutions (beyond HDTV), but that technology is still in its infancy. Currently we are mainly talking about SD and HD television.


Standard Definition Television - SDTV

There are a few different standards that were defined in the age of black and white television, even before World War II. Those standards define parameters like the number of lines, the number of pictures per second and some other technical details. In general, the number of picture lines together with the frequency bandwidth reserved for the picture defines the television resolution. NTSC (the color television standard used in the USA and Japan for SDTV) has a resolution of 480 lines, each with 720 pixels. In Europe and many other countries where PAL was used, the resolution was 576 lines with 768 pixels in each line. Because this resolution was quite satisfactory and there was no easy way to enhance it, the old analog technology survived until recently, when the transition to digital broadcasting began.

High Definition Television - HDTV

Digital broadcasting allows us to use many advanced services, and one of them is a television picture with higher resolution--high definition television. HDTV means any TV resolution higher than SDTV. There are a few standard resolutions that are supported by professional and commercial equipment. Each resolution has two modes: interlaced and progressive. Interlaced means alternately displaying the odd lines in one frame (still picture) and the even lines in the next one, while progressive mode displays all lines in each frame. The basic HD resolution is 1280x720, which means 720 lines with 1280 pixels in each, and the highest HD resolution is 1920x1080. There are also a few intermediate resolutions, mainly used by smaller display devices. The highest mode, 1920x1080p, is supported by larger LCD and plasma displays, but currently no terrestrial television broadcasts this way. Most HD televisions use either 720p or 1080i mode.
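A quick back-of-the-envelope calculation shows why these resolutions demand compression. The numbers below are illustrative assumptions: 4:2:0 chroma subsampling (12 bits per pixel at 8-bit depth) and 25 frames per second, common for European broadcasts:

```c
#include <stdio.h>

/* Raw (uncompressed) bit rates for SD and HD pictures, assuming
   4:2:0 sampling (12 bits per pixel) at 25 frames per second. */
int main(void)
{
    const double bits_per_pixel = 12.0;  /* 8 luma + 4 chroma, 4:2:0 */
    const double fps = 25.0;
    const struct { const char *name; int w, h; } modes[] = {
        { "SDTV (PAL) 768x576", 768, 576 },
        { "HD 1280x720", 1280, 720 },
        { "HD 1920x1080", 1920, 1080 },
    };
    for (int i = 0; i < 3; i++) {
        double mbps = modes[i].w * modes[i].h * bits_per_pixel * fps / 1e6;
        printf("%s: %.0f Mbit/s raw\n", modes[i].name, mbps);
    }
    return 0;  /* a DVB channel carries only a few Mbit/s, hence MPEG coding */
}
```

The 1920x1080 case works out to roughly 622 Mbit/s raw, which is why MPEG compression is indispensable.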


Of course, HDTV is not the last step in television resolution. As the technology evolves, higher resolutions will come that will need higher bandwidths, new transmission equipment and new displays (TV sets). This is called progress.

Digital terrestrial television in Europe uses DVB-T as the transmission standard. Some countries use the MPEG-2 coding standard, while others use the even more efficient MPEG-4 standard. An additional improvement can be achieved by using statistical multiplexing, which dynamically allocates bandwidth to each service according to current needs.

Wednesday 1 September 2010

A PC Based Digital Storage Oscilloscope

A digital storage oscilloscope (DSO) is an indispensable tool for analyzing complex signals or debugging electronic devices. Because the signals are sampled and stored in memory, it is very easy to examine them and their relations. A DSO is in fact a special computer with many fast A/D converters and software to analyze and display signals. With a DSO, the problem of displaying short-lived signals and glitches is history. It only depends on your settings and trigger conditions to capture the right moment. Storage capacity is rarely a problem unless you would like to store longer intervals.


Because of the computer-based design of the DSO, it is pretty simple to convert an ordinary PC into a DSO. A PC-based digital storage oscilloscope is in fact software which runs on a PC with a suitable interface for analog signals. This interface is usually a small box with fast A/D converters connected to the parallel or USB port. PC-based oscilloscopes have many advantages over classical DSOs. Since the capabilities of the software are limited only by the interface and computer resources, it is possible to track many channels in real time, analyze the FFT spectrum or decode buses like SPI, I2C, JTAG or UART.

The physical limitations of a PC oscilloscope are mainly defined by the external hardware that is used to sample analog or digital signals. The main parameters that define PC DSO capabilities are the number of channels, bandwidth, sample rate and sample memory size. The sampled data is stored in a buffer from where it is transferred to the PC, where it is analyzed and displayed. Therefore, at least in principle, the buffer only needs to be large enough to hold the samples until they are transferred to the PC. Only in principle, because the parallel port is not fast enough to transfer huge amounts of data in real time.
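The buffer arithmetic is simple enough to sketch. All numbers below are assumptions chosen for illustration (a 100 MS/s 8-bit sampler, a 64 KiB buffer, and a link that sustains about 1 MB/s, roughly parallel-port territory):

```c
#include <stdio.h>

/* How long until the capture buffer overflows at full sample rate?
   Illustrative numbers only. */
int main(void)
{
    const double sample_rate = 100e6;    /* samples per second */
    const double bytes_per_sample = 1;   /* 8-bit samples */
    const double buffer = 64 * 1024;     /* buffer size in bytes */
    const double link = 1e6;             /* bytes per second to the PC */

    double fill = sample_rate * bytes_per_sample;  /* bytes/s coming in */
    double t = buffer / (fill - link);             /* time until overflow */
    printf("buffer lasts %.2f ms at full rate\n", t * 1e3);
    return 0;
}
```

With these numbers the buffer lasts well under a millisecond, which is exactly why continuous full-rate capture over a slow port is impossible and triggering matters so much.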

A big advantage of PC-based oscilloscopes is the upgradeability of the software. You simply install a new version and you get bugs fixed or new functionality. Some PC oscilloscopes also allow you to write your own plugins for custom decoding. The average price of a PC oscilloscope is significantly lower than the price of a real DSO, though it is still higher than the price of the PC where it will run. Nevertheless, a PC DSO is a compact, cheap and universal solution for a hobby or professional electronics laboratory. It allows you to analyze arbitrary signals, decode popular serial protocols and store signals for later processing.


A special version of the PC-based oscilloscope is the I2C analyzer. This is a simplified device which samples a few channels and decodes I2C, SPI, UART and other popular protocols. An important function is the I2C host adapter mode, which allows you to send I2C or SPI packets and act as a master device.
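Decoding I2C from sampled data is a nice illustration of what such analyzers do internally: a START is SDA falling while SCL is high, a STOP is SDA rising while SCL is high, and data bits are sampled on SCL rising edges. A simplified C sketch over pre-captured SDA/SCL sample arrays (a real decoder also tracks ACK/NAK, repeated starts and timing; the tiny trace in main is hand-made for illustration):

```c
#include <stdio.h>

/* Simplified I2C decoder over pre-captured SDA/SCL samples. */
static void decode_i2c(const unsigned char *sda, const unsigned char *scl, int n)
{
    int bits = 0, byte = 0;
    for (int i = 1; i < n; i++) {
        if (scl[i] && sda[i - 1] && !sda[i]) {        /* START condition */
            printf("START\n");
            bits = byte = 0;
        } else if (scl[i] && !sda[i - 1] && sda[i]) { /* STOP condition */
            printf("STOP\n");
            bits = byte = 0;
        } else if (!scl[i - 1] && scl[i]) {           /* SCL rising edge */
            byte = (byte << 1) | sda[i];              /* sample one bit */
            if (++bits == 8) {
                printf("byte 0x%02X\n", byte);
                bits = byte = 0;   /* ninth (ACK) clock ignored here */
            }
        }
    }
}

int main(void)
{
    /* Hand-made trace: START, byte 0xA5, ACK clock, STOP. */
    unsigned char sda[] = {1,0,1,1,0,0,1,1,0,0,0,0,1,1,0,0,1,1,0,0,1};
    unsigned char scl[] = {1,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,1};
    decode_i2c(sda, scl, sizeof sda);
    return 0;
}
```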

I2C analyzers are much cheaper than PC-based DSOs, and many such devices can also be used as a logic analyzer with basic functionality.