Friday, April 25, 2008

A+ Certification

A+ Certification is gaining a reputation as a new tool for employers to narrow the search for qualified employees. IGT has acknowledged this certification not only for the IT department employees but also for the field service technicians.
This certification guarantees the basic knowledge that used to be included in entry level training. It certainly saves companies money not only in training but, depending on the position, some employees can begin being productive in a much shorter period of time. I wouldn't be surprised if it became a requirement for some PC related industries.
As technology moves forward faster each day, PC knowledge is quickly becoming a necessity in almost every job. Even if you don't work directly with the internal components of your PC, just knowing how it functions behind the cover is invaluable.
In addition to numerous PC industry related jobs, there are also endless numbers of people who own home PCs and are clueless about how they work. As I have found out, you could conceivably stay busy 24/7 just helping friends and family with their equipment.
My philosophy is "Keep it fun". If you work on or with PCs and it's just a "job", you might find that it gets old fast. If you find it a challenge and you enjoy a good challenge, then it's no longer just a "job". Doing something fun and challenging and getting paid for it is an awesome combination.

Friday, April 18, 2008

Multimeter

The multimeter is probably the bench technician's best friend. These instruments can be portable or bench-top and range in price from a disposable $10 to over $7,000. The older analog meters have all but been replaced by digital devices, and frankly I don't miss them. The analog meters were much more difficult to read, and you always had to be aware of the range setting and the lead polarity. The newer digital meters are auto-ranging, so you don't have the fuse issue you used to have. Not to mention bent needles and having to zero the meter before each use.
There is good news for analog fans, though: they are less affected by RFI (radio frequency interference). Fortunately, I've never had an issue with this, and I use an EMI receiver to test for radio frequency events rather than a hand-held meter.
The digital meters are also extremely accurate. I prefer the digital LCD over guessing whether the needle is between two points. I use an older Fluke 187. It's not the top of the line and I don't believe they even support that model any more, but I like the abuse it can take and it never complains. I also have the option of an IrDA connection to my PC which is invaluable for data logging on long tests. You can set several parameters and plot curves or graphs over any length of time. That saves me from hand logging test data all day.
The higher-end Keithley bench-top meters will run into the thousands of dollars, but I think they are the best choice for critical measurements. The accuracy is phenomenal and, assuming you can pry one from the engineers' hands, it makes circuit analysis much more enjoyable.
The leads used on your multimeter can also play a big part in your measurement success or failure. The spring loaded retractable clip leads are my favorite followed by the extreme point probes depending on the application. I dislike the alligator clip leads as they have a tendency to pop off at the wrong time and land in the wrong place. A good set of leads is a must since chasing an open circuit with a broken probe wire will send you over the edge. Making this small investment and checking them frequently is my best advice.

Docking Station

Docking stations are commonly used to connect laptops and other portable devices to peripherals such as printers, storage drives, and speakers. The new trend is away from the home and office environment and into your car, boat, motorcycle, and recreational vehicles. Some models allow hot swapping, while others require the power to be off or the system to be in standby to connect.
Another type of docking station is the port replicator. It allows the notebook user to connect to printers via a parallel port, USB devices, Ethernet, sound, and video (VGA and DVI) at a reasonable price. Port replicators can supply DC power, and some have converters that enable you to use European voltages if you travel.
There's quite a market for the mobile docking stations these days. Law enforcement has cut hours of endless paperwork by processing reports right at the scene instead of at a PC at the station. They can now run my expired tags and issue me a slip, sending me on my way in record time. That's great!
Of course, there are recommended MIL-STD specifications for the mobile docking station manufacturers. MIL-STD-810E and 810F were developed in conjunction with the Department of Defense and other government agencies and involve testing materials and components to ensure they function in various environmental conditions. This includes variations in temperature, vibration, humidity, and impact. There are more tests, but we get the idea. For a manufacturer to sell their docking stations and stands to the military and public safety departments, these devices must prove they can withstand difficult to dangerous operating conditions without failure.
Several companies offer the mobile docking stations with anti-theft devices but I'm still all for locking the equipment in the trunk.

PDA

The first PDA I remember using was the early Palm Pilot. It could store meeting information and some short notes, and it had a calculator, address book, calendar, and a clock. Pretty basic. They were just one notch above the tape recorder and, compared to the BlackBerry, fairly barbaric.
With Bluetooth, Wi-Fi, and IrDA capabilities, the Internet is accessible from virtually anywhere with the new portables. The stylus is mostly gone (though still available), replaced by touchscreens and very small keyboards, as is the need to carry a brick-sized cell phone.
Several options have been added to the PDA to adapt to car use as well. They have GPS (Global Positioning System) capabilities which are being installed in new cars and systems that can be added to the older cars. I personally enjoy having one when I travel out of town. I can't read a road map and drive unlike some of our out-of-town visitors.
The medical field is also benefiting from the new technology. Access to reference materials and patient chart updates has enabled some medical professionals to diagnose conditions and prescribe the most up-to-date drug treatments within minutes. Physicians can also dock their PDA and download an entire day's worth of notes on patient visits, communicate with other physicians, or update their clinical database.
Updates for your PDA are as simple as downloading them from the Internet just as you would for your desktop PC. Many options for memory stick upgrades can be purchased as well as small keyboards. Newer models even have USB ports to connect directly to your desktop without needing a docking station.
The new touchscreens are even starting to incorporate the keyboards into the displays, but they are still new and have some accuracy issues. There are also some touchscreens that use translators attempting word recognition for faster entry, but again they face accuracy issues. Given time, these will become standard features.

Thursday, April 3, 2008

Video Capture Cards

Video capture cards, or TV cards, are designed to be installed in PCI slots, PCI Express slots, or AGP slots, or to connect via USB. Streaming video over the Internet is the most popular use. Surveillance is another popular application. The analog signals can be converted to digital signals and stored on a hard disk drive, CD, DVD, or other storage device.
Performance of video capture cards can be adversely affected by the type of motherboard and CPU you're using. Care should be taken to select a card suited to your particular system.
Video editing can be accomplished with these cards and software designed to allow rendering of the video. This also includes audio dubbing, and some cards have more than one audio channel for more sound options.
For laptops, there are USB, FireWire, and PC Card interfaces that can be used. They have the same functions but are designed more for portable systems. Some camcorders can actually output video to the computer in digital format, with some editing functions built in.
I have dabbled in the video capture game only briefly but with good results. I have a Plextor USB digital video converter installed on my XP Home system. I've copied old home movies from VHS onto DVD and saved many hours of video that is quietly disintegrating in the box. There's limited sound on some of the tapes and the quality is awful. I've been able to clean up some of it as well as cut some useless video that might have been taken by the kids. Several minutes of feet and sky.
The Plextor unit does have some really bad limitations though. It apparently does not play well with Windows Vista. If you're going to invest in video capture and editing equipment, you should ask around to see what problems other people have had. You could spend a lot of money buying devices that say they work across all platforms but really don't.

Solid State Drives

Several companies, dating back to the late 1970s, designed various solid state drives. Storage Tek, Santa Clara Systems, Sharp, Amiga, and Apple were a few of the first. M-Systems designed the first flash-based device; that company is now part of SanDisk.
Solid state drives are seen by the system as just another drive. These drives are most useful in portable devices since there are no mechanical parts. They operate solely on semiconductor circuitry, so there's no real danger in bumping the device. Most are produced using nonvolatile flash memory, but some are produced with volatile DRAM.
As a cost reduction measure, flash memory devices have gone from NOR flash to single-level cell (SLC) NAND flash and multi-level cell (MLC) NAND flash. Each chip is capable of being manufactured with more storage and uses about the same size footprint as older chips.
Solid state drives are faster and have lower access times since they have no moving parts. They use less power, make no noise, are reliable for between 300,000 and 500,000 write operations, and take up considerably less room than a standard hard disk drive. There are downsides to solid state drives, though. The price is still high, the storage space is low (but climbing), power disruptions and ESD are more hazardous, they will not necessarily last longer than a hard disk drive, and they have slower write speeds.
That being said, the role of hard disk drives is slowly being reduced, except in large computer systems and servers. The development of longer-lasting and larger-capacity solid state devices is being pursued by many manufacturers, and the cost is dropping accordingly.

Saturday, March 15, 2008

LabVIEW

If you're going to find yourself working in a research or test lab any time in the future, spend a few days or weeks exploring the LabVIEW program from National Instruments. I have had the pleasure of working with it for over two years now, and I think it's a great, user-friendly programming environment. I have just upgraded to LabVIEW 8.5, and the new utilities are awesome.

Instead of line-by-line program entries, you use "G" programming, or graphical representations of program instructions. You can control instruments and analyze data from just about any piece of equipment that can be connected to a PC. It runs on a variety of platforms including Windows, PDAs, Mac OS, and Linux. You can custom design your reports and can export data and spreadsheet information to programs such as Excel and Microsoft Access.

Programming can also be done to create virtual instruments to simulate real instrument I/O. For R&D applications it's a must. You also have remote access capabilities so you can operate your test from home over the Internet. This may sound like a sales pitch, but this stuff really works!

Some of the functions of this program include measuring pressure, strain, temperature, displacement, pH, and more. You can design programs using logic levels for digital circuits or design control panels with displays for user inputs. In the R&D environment it's useful for conceptual designs, to prove whether they work. This can be a great cost-saving measure by reducing wasted prototype materials.

At this time, I am in the process of automating several repetitive tests we run on a variety of test measurement equipment. LabVIEW can control each test bed with a minimum of operator intervention and even send me an e-mail when it's complete. This keeps me available for other tasks instead of sitting and waiting for a lengthy test process to run.

Since we are a test lab for gaming jurisdictions in the U.S., Canada, Europe, and other countries, LabVIEW will give us the opportunity to automate tests and produce professional-looking reports for each machine. These can then be linked to our website and viewed by any department within IGT. This will all but eliminate the tons of paper reports we generate and distribute each year. When we receive certification from Underwriters Laboratories to test to their standards and self-certify our machines, LabVIEW will be an important part of the entire safety testing process.

If this program is something you might want to try, you can download an evaluation copy for a 30-day trial at the National Instruments website, ni.com.

Boot.ini

The boot.ini file tells the computer where the operating system resides on the drive. It will also indicate to the system any other operating systems installed. The Ntldr (NT loader) file checks boot.ini for the location of the operating system(s) and will either launch the operating system or present a menu for the user to select one. The boot.ini file is also given the system, hidden, and read-only attributes, so it's not visible until you indicate to Windows that you want to view hidden files.
If you have a computer with two operating systems installed, the boot.ini file will list each one, and a menu will be displayed at startup so you can select which one you want to run. I have used this option for several years when changes needed to be made in older programs that could only be done in Windows 3.1 or Windows 95. I would make the necessary corrections and re-start the system in Windows XP. Of course, these older programs have now been revised, but there was a time when dual operating systems were really handy.
One of the computers I have lists the boot.ini file as:
C:\boot.ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINNT
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="Microsoft Windows 2000 Professional" /fastdetect
The first part of the file is the boot loader information. The timeout=30 entry indicates the amount of time, in seconds, before the default operating system loads. The next line tells the system where the default operating system is, which in my example is Windows 2000 Professional. This is where Ntldr gets the location of the operating system. The location is written using the ARC (Advanced RISC Computing) naming convention.
The second part of the file is the operating systems section. It lists the operating system(s), any other boot programs, and their locations. My system uses a switch called /fastdetect, which skips certain peripheral (serial and parallel port) inspections at startup. If you have a second operating system, this is where it will show up. The names in quotes are the menu items that appear so you can select which system you'd like to run.
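Since boot.ini uses plain INI syntax, the two sections above can even be pulled apart programmatically. Here's a minimal sketch in Python using the standard configparser module; it parses a copy of the example file from this post rather than touching the real (hidden, read-only) file.

```python
# A sketch: parse a copy of the boot.ini shown above with configparser.
import configparser

BOOT_INI = """
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\\WINNT

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\\WINNT="Microsoft Windows 2000 Professional" /fastdetect
"""

parser = configparser.ConfigParser()
parser.optionxform = str                 # keep ARC paths case-sensitive
parser.read_string(BOOT_INI)

timeout = int(parser["boot loader"]["timeout"])   # seconds before default boots
default = parser["boot loader"]["default"]        # ARC path of the default OS
print(timeout, default)
```

Each key in the [operating systems] section is an ARC path, and its value holds the menu text and any switches, so a dual-boot file would simply show a second key here.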
If you need to make changes to your boot.ini file, you can use a text editor such as Notepad, or simply type MSCONFIG from your cmd.exe screen. Make sure you back up your original boot.ini file before you make any changes, in case you're prone to typos. If you need to make other system changes, the MSCONFIG option will save you some time by displaying tabs for other system files.
The boot.ini file as we know it has been altered in the Windows Vista operating system. This information is now stored in the Boot Configuration Data (BCD) store, where it can still be modified.

Boot Loader

The term boot loader can refer to several programs your computer may use to start. It can be one or a combination of programs such as Ntldr, BTX, MILO, LILO, BIOS, Coreboot, EMILE, RedBoot, Yaboot, PC Booter, Quik, Bootman, GNU GRUB, Elilo, Klibc, Loadlin, SILO, BootX, Gujin, Das U-Boot, SYSLINUX, PALO, and System Commander, to name a few. These boot loader programs can be executed automatically or configured to run by command or event.
In some systems the first set of instructions will load the Ntldr program from the root location on the specified boot drive. This prompts the StartUp module to set the CPU to protected mode, which enables 32-bit memory access. It then creates the initial Interrupt Descriptor Table, Global Descriptor Table, and page tables, and enables paging. This fundamental structure is required for the operating system to function. The StartUp module then begins loading the operating system by launching the operating system loader.
The next step in the process involves boot.ini. The contents of this file are read to get the information about the system volume, which indicates the location and name of the operating system directory. If you have two operating systems installed, a boot menu will be presented and the user can select which system to run. After the selection, the booting process continues, and on 64-bit systems the CPU is switched to long mode, which enables 64-bit addressing.
If you are running Windows, Ntoskrnl.exe and the Hardware Abstraction Layer (HAL) are placed into memory. From here the boot-time device drivers get loaded but are not initialized; they are listed in the HKLM\SYSTEM registry hive. After the control set is retrieved and the correct file system, such as FAT or NTFS, is loaded, the kernel takes over.
The kernel creates the structures for memory, and CPU interrupt handling is initialized, as is the memory manager. The kernel then looks for system drivers and initializes the devices. Once this step is complete, the Session Manager Subsystem is started. The smss.exe file starts the Autochk routine, which initializes and checks each drive, then creates the environment settings in the registry. Windows then starts from the winlogon.exe file and you're on your way.

Directory Service

The directory service is for use by network administrators. It allows them to set access controls on the domain for each user account. It is an abstraction layer that lets the administrator customize each user's requirements while keeping sensitive information out of reach. This is a service rather than a physical directory on the hard disk drive. The data pertaining to it is stored in files that hold information about preferences, subscribers, devices, content, the namespace, and more. Users are granted permission by the administrator for these types of information.
The directory service is a component of the Network Operating System (NOS) and is an information-sharing system used for finding, organizing, administering, and managing network information such as user groups, printers, folders, files, and other resources.
Over the years, standards have been developed by the International Telecommunication Union (ITU) and the International Organization for Standardization (ISO) to provide interoperability across several vendor platforms. These standard systems for organizing objects in logical order are called X.500 by the ITU and ISO/IEC 9594 by ISO. These standards apply to mail exchange and name lookups.
Many companies have adopted these protocols and have systems to handle directory services such as Apache which has a service called ApacheDS, and Novell which offers eDirectory. Windows has the Active Directory which is installed on the Windows 2000 and 2003 Server systems.
Identity management programs create another layer of security on objects that can be identified, such as devices, applications, countries, or organizations. Each object is identified by certificates encoded within it, and each certificate contains an issuer and a subject. Without the correct certificates, users within or outside the network can be denied access.

Friday, February 29, 2008

Fragmentation

File fragmentation occurs when files can no longer be placed next to each other, or contiguously, on a disk. When you purchase a PC, it usually contains an operating system and some additional files loaded at the factory. These files are mostly loaded one at a time and are fairly contiguous. Over time, as you install and uninstall programs and data files, blank spaces appear between files that can be used for storage operations or new programs you install. If these files are not exactly the same size as the free space, which most are not, the system places them in spaces anywhere on the disk where there is room. This means files can be spread all over the disk, slowing the system down because the seek time increases. This used to be called checkerboarding, but I haven't heard that term used for a hundred years.
Since our hard drives have increased in size every 6 hours, the amount of fragmentation has also increased. Data files and program files can be scattered literally from one end of the disk to the other. Defragmenting the drive is an absolute necessity especially when your storage capacity increases. Just because you have more room doesn't mean your system will continue to run quickly forever. It just means you have more room for fragmented files.
Check your drive(s) frequently to see how much space is being underutilized. This can be done from the System Tools or Administrative Tools menus. Select Disk Defragmenter and click the Analyze button. This will tell you if you need to defragment your drive or not. If you see mostly red lines, you have a badly fragmented drive.
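A toy model makes the mechanics easier to see. The Python sketch below (an illustration, not a real filesystem) treats the disk as a list of cluster slots: deleting a file leaves holes, and a new file that doesn't fit a single hole gets scattered by first-fit placement, which is exactly the checkerboarding described above.

```python
# Toy disk: twelve clusters holding three contiguous files A, B, and C.
disk = ["A"] * 4 + ["B"] * 3 + ["C"] * 5

def delete(disk, name):
    """Deleting a file just frees its clusters, leaving holes behind."""
    for i, owner in enumerate(disk):
        if owner == name:
            disk[i] = None

def allocate(disk, name, size):
    """First-fit: drop the file's clusters into free slots wherever they are."""
    placed = []
    for i, owner in enumerate(disk):
        if owner is None and len(placed) < size:
            disk[i] = name
            placed.append(i)
    return placed

delete(disk, "B")                  # opens a 3-cluster hole in the middle
d = allocate(disk, "D", 3)         # D fits the hole exactly: contiguous
delete(disk, "A")
delete(disk, "C")
e = allocate(disk, "E", 6)         # E is split around D: fragmented
print(d, e)                        # [4, 5, 6] and [0, 1, 2, 3, 7, 8]
```

File E ends up in two separate runs on either side of D, so reading it back requires extra seeks; a defragmenter's job is to shuffle clusters until each file occupies one run again.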

ATTRIB

The ATTRIB command can be used to mark files hidden, read-only, system, and archive. To change a visible file to a hidden file, you would enter ATTRIB +H and the filename with its extension at the C:\ prompt in the directory that contains the file, such as C:\ATTRIB +H visible.doc. Making a file read-only prevents any unwanted changes from occurring. At the C:\ prompt you would enter C:\ATTRIB +R readme.doc. This command will deny access to anyone trying to change or delete the file. To remove the read-only attribute from the file, enter C:\ATTRIB -R readme.doc.
Some of the switches associated with the ATTRIB command are:
+H to hide a file in a directory
-H to view a previously hidden file
+A to set a file flag so it can be archived during a backup or XCOPY
-A to remove the archive flag
+S to set a file as a system command
-S to remove the flag for a system command file
+R to set the file up for read-only
-R to remove the read-only attribute

You can also use multiple switches with the ATTRIB command. For instance, if you wanted to set a file up as a hidden system file, you would enter: C:\ATTRIB +S +H helloworld.exe. To reverse these settings you would substitute -S -H for the +S +H switches.
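Under the hood, Windows stores these attributes as bit flags on each file, and ATTRIB's + and - switches simply set or clear bits in that flag word. The short Python sketch below illustrates the idea; the constant values are the documented FILE_ATTRIBUTE_* flags from the Windows API.

```python
# Windows file attribute bits, mirrored by ATTRIB's switches.
READ_ONLY = 0x01   # +R / -R
HIDDEN    = 0x02   # +H / -H
SYSTEM    = 0x04   # +S / -S
ARCHIVE   = 0x20   # +A / -A

attrs = ARCHIVE                  # a freshly written file is flagged for backup

attrs |= SYSTEM | HIDDEN         # ATTRIB +S +H helloworld.exe
assert attrs & HIDDEN            # file no longer shows in a directory listing

attrs &= ~(SYSTEM | HIDDEN)      # ATTRIB -S -H helloworld.exe
assert not attrs & SYSTEM
print(hex(attrs))                # back to 0x20, the archive bit only
```

Because the attributes are independent bits, any combination of switches on one command line just ORs (for +) or masks off (for -) the corresponding flags.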

Batch Files

Batch files can be created, edited, and run from the command prompt window. You can also run a batch file from the RUN command by entering the location and filename. As the name implies, you can run several executable files from a filename you have created with the .bat extension. These files will be run in the order they appear, and more batch files can be nested within the original batch file. This format originated in DOS, and a batch file can hold several lines of text for a variety of tasks.
As previously mentioned, if you would like to run a batch file from within a batch file, you use the CALL command with the second batch filename directly after it. You can also enter some limited logic, such as the IF function, to check for a condition. These functions may direct the system to a different batch file or exit, depending on the result of the condition. After the secondary function is complete, except for an exit command, the system will return to the next line of the batch program until it's completed. This is a good tool if you have several files that need to run repeatedly.
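The CALL and IF mechanics above are easier to see in a short listing. This is only a sketch; the program and file names are made up for illustration.

```bat
@echo off
REM main.bat (hypothetical): run programs in order, call a nested
REM batch file, then use IF to branch on a condition.
backup.exe
CALL cleanup.bat
IF EXIST report.txt GOTO done
echo Report file is missing!
exit /b 1
:done
echo All tasks finished.
```

Because CALL is used, control returns to the line after CALL when cleanup.bat finishes; running it without CALL would hand off control for good.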
There is also an option for creating an executable file from your batch file. This requires you to purchase a separate compiler program.

Clusters (Lost and Cross-linked)

The easiest way to describe cross-linked clusters is: clusters that the File Allocation Table (FAT) or Master File Table (MFT) assigns to more than one file. The clusters, or allocation units, are all the same size across the disk. The File Allocation Table has the same number of entries as there are clusters on the disk. This is how the FAT keeps track of where the clusters are physically located. The FAT, together with the directory entry, also stores the filename, date, time, size, file attributes, where the beginning of the file is located, and whether a cluster is at the end of the file (EOF) or what the next cluster number is.
Lost clusters are usually caused by files that are not completely deleted, or by deleting a file that shares a cross-linked cluster, which takes the cross-linked cluster with it. This leaves the remaining file that the cluster was cross-linked with missing that data. Corrupted data in the File Allocation Table itself can also cause clusters to become lost. If this occurs you have a major problem; there are no provisions in the FAT system to help you recover from data errors.
The recommended cure for cross-linked clusters, however, is to delete both of the files that claim the same cluster, after salvaging what data you can. Removing only one file will not cure the problem, as mentioned above.
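A toy model shows how a disk checker spots both problems. The Python sketch below (an illustration, not real FAT code) records each file's cluster chain, flags any cluster claimed by two chains as cross-linked, and flags clusters the table marks in use but no chain reaches as lost.

```python
# Hypothetical files and cluster chains, chosen to show both error types.
files = {
    "letter.doc": [2, 3, 4],
    "photo.bmp":  [5, 6, 4],   # cluster 4 is cross-linked with letter.doc
}
in_use = {2, 3, 4, 5, 6, 7, 8}  # clusters the allocation table marks as used

owners = {}
cross_linked = set()
for name, chain in files.items():
    for cluster in chain:
        if cluster in owners:
            cross_linked.add(cluster)     # two chains claim this cluster
        owners.setdefault(cluster, name)

lost = in_use - set(owners)               # marked used, but reachable from no file
print(cross_linked, lost)
```

Here cluster 4 comes back as cross-linked and clusters 7 and 8 as lost, which is essentially what CHKDSK or SCANDISK reports after walking every chain.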

Task Manager

The easiest way to open Task Manager in Windows XP Home Edition is to press Ctrl + Alt + Del, which opens the Windows Security window. Select Task Manager and the Windows Task Manager window opens. Depending on the operating system you are using, you will see tabs for Applications, Processes, and Performance. On my system it also shows tabs for Networking and Users.
Starting with the first tab, Applications, you will see any programs you have launched from your Windows desktop. This screen is useful when you have launched a program and it seems to be stalled. You can open the Task Manager Applications tab and see if the program is actually running or shows "Not Responding". If the program is locked up, you can select the End Task option and Windows will close the program and return you to the desktop. You can also see some additional information at the bottom of the window showing the processes currently running, the percentage of CPU usage, and the Commit Charge, which shows the virtual memory in use and the peak amount of virtual memory you have used.
The Processes tab shows the complete list of processes running on your system. This is useful if you suspect a virus or other unwanted program may be running on your system.
The third tab is Performance. This is a technician's best friend as far as live system feedback goes. It displays a running graphical image of the system. From the View menu you can select Show Kernel Times, and this will be added in red to the graph of CPU usage. The lower graph shows the Page File Usage History, and at the bottom there are summaries of the handles, threads, and processes as they run. There are also statistics on the memory and cache totals and the availability of each type. This information can be particularly useful if you're running at maximum capacity in any of these areas. You may want to make changes or add memory based on how much you're using.
The next tab is Networking. This shows the network utilization in percentages on the graph. If Options is set with a check next to Always on Top, you can log on to the TMCC website and watch your network usage. This is not as useful a tool for a single user, but if you're a network administrator with several workstations accessing the Internet through a server, it can become a useful screen for checking access peaks. It can show whether your system is responding slowly or the Internet service provider is returning data slowly.
The last tab is Users. This screen shows who is currently logged onto the system. It is also useful if you are running a system backup and one of your employees has forgotten to log off before going home. You can disconnect the user and continue with system maintenance. This has happened several times to me, and I do enjoy disconnecting people.

The Registry

You can open the Windows XP and 2000 registry editor by typing "regedit" from the Run command line or by opening the Command Prompt window and typing "regedit.exe". What is displayed is all hardware, users, preferences, operating system programs, and PC settings in a tree format. Windows 2000 and XP Pro, for instance, display the following keys under My Computer:

+ HKEY_CLASSES_ROOT
+ HKEY_CURRENT_USER
+ HKEY_LOCAL_MACHINE
+ HKEY_USERS
+ HKEY_CURRENT_CONFIG

These keys contain subkeys, where there is a "+", and yet more subkeys can be nested within them. The data stored in each key are called values, which consist of names and the associated information. The registry is split into hives, which are labeled with "HKEY" (a handle to a registry key) plus a name indicating what information is stored there. For instance, HKEY_LOCAL_MACHINE has the settings for Hardware, SAM, Security, Software, and System. These contain information about the settings for Windows and hardware device drivers, to name a few. Some of this data is changed each time the system is booted and is not stored in the files permanently but rebuilt for each session.
The information stored in the registry files can be edited manually. Unless you are intimately familiar with these files and their contents, it would be advisable to back up each file before editing; these files can easily become corrupted and cause your system to crash. If, on the other hand, you venture into editing these files after backing them up, you can easily delete leftover portions of uninstalled programs. Those leftovers can slow your system down considerably over many months of adding and deleting programs. Some older uninstall programs will leave data in the registry without the user knowing it. Several companies now offer registry cleanup programs, and I would recommend visiting a trusted website like Consumer-Review.org to obtain a legitimate program to delete unwanted data.

Event Viewer

The Event Viewer window can be accessed through Start, Programs, Administrative Tools, Event Viewer. There are three categories of logs that you can view. The first is Application, which shows all the events for the Windows applications. These log entries will indicate errors if an application did not start correctly or if it ended abruptly.
The second log is the Security events. This log is useful for login errors if you're a network administrator. It can show failed attempts to log on and how many times they happened. This could indicate someone is attempting to discover passwords, or you may just need to re-train an employee in the use of login names and password security. The only program that has access to write to this file is lsass.exe, also known as the Local Security Authority Subsystem Service.
The last log is the System events. This log lists events from the operating system itself. Some of the information written to this file references operating system information and errors that may have occurred. It also displays general information on programs, such as Windows Firewall, as they enter running states.
Occasionally you will want to view these events to verify your system is running correctly without errors. If you do see an error, you can highlight it and right-click. When the menu appears, click Properties and it will give you a description of the error, showing information on the file or device that generated the event. You can also follow the "more information" web link to Microsoft for a more detailed description and possible actions to take to correct the error.

Friday, February 22, 2008

Multithreading

The thread part of multithreading goes back to the kernel, which is the heart of the operating system. The kernel controls the interaction between the hardware and the software, and it also manages processes. The operating system assigns resources to the processes, including device handles, memory, file handles, and windows.
Threads are contained in each process. There can be as few as one or as many as the system needs to complete a required task. These threads can run in parallel on some systems and as single operations on others; it's all considered multithreading. On single-processor systems, threads run one at a time but are switched so quickly the user is unaware of this fact. With the new dual-core and quad-core processors, threads can run simultaneously on each core.
Multithreaded programs can run faster on systems with these new CPUs since the programs can be divided into several tasks running at the same time. Unfortunately, programs written to take advantage of this capability have to be constructed so that the threads do not attempt to use the same resources at the same time, which can result in bus contention or deadlock.
Several types of multithreading are in use today. In block multithreading, a single thread runs until it blocks. If a call is made to a memory location that is not in the cache, it may take several CPU cycles to retrieve the data. In that case the thread is blocked, and the processor can allow another thread to run in its place until the memory fetch is complete. This causes the hardware to switch register sets and adds time to the execution of the thread.
Interleaved multithreading reduces this to one thread switch per CPU cycle. In this case the switching overhead is considerably less, and the threads execute independently of one another.
Simultaneous multithreading goes one step further: multiple threads issue instructions in the same CPU cycle. Each thread still has its own program counter, and the technique is used in superscalar processors.
Hardware and software designers are always attempting to take advantage of the thread scheduler capabilities. The most efficient design would achieve the most thread instructions issued in the least number of CPU cycles while avoiding thread blocking altogether.
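As a small illustration of dividing a program into threads that don't fight over a shared resource, here is a sketch using Python's standard threading module. The shared counter and the worker function are invented for the example; the lock is what prevents the contention problems described above:

```python
import threading

counter = 0
lock = threading.Lock()  # guards the shared resource

def worker(iterations):
    """One task carved out of the larger program."""
    global counter
    for _ in range(iterations):
        with lock:        # serialize access so updates are never lost
            counter += 1

# Four threads, standing in for tasks the scheduler can spread across cores.
threads = [threading.Thread(target=worker, args=(10000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()              # wait for every thread to finish

print(counter)  # 40000 with the lock; without it the total can come up short
```

Remove the `with lock:` line and the read-increment-write on `counter` can interleave between threads, which is the software-level cousin of the resource conflicts mentioned above.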

HAL Hardware Abstraction Layer

A hardware abstraction layer (HAL) is embedded in software between the computer hardware and the operating system. The HAL enables an operating system such as Windows NT to run on several different computer systems without a separate version having to be written for each brand.

The hardware abstraction layer communicates directly with hardware devices such as the motherboard. It serves as an interpreter, enabling high-level software to interact with the lower-level hardware components.

Abstraction layers are easier to visualize in graphics programs, which translate simple program commands into complex screen graphics. Another example of abstraction layers is the networking OSI (Open Systems Interconnection) model. Seven abstraction layers make up the OSI model: Application, Presentation, Session, Transport, Network, Data Link, and Physical.

The application layer consists of the application software, which is what we use to initiate the communications process. The presentation layer converts the data to code, compresses it, and encrypts it. The session layer handles the information exchanges. The transport layer controls the flow and any errors that may occur, and can retransmit segments that fail. The network layer receives the data from the transport layer along with the sequence numbers identifying each segment. It addresses the data with Internet Protocol (IP) addresses, encapsulates it, routes it, and decapsulates it on the return trip.

The data link layer allows data exchanges between devices by converting the data into frames. The final layer in the process is the physical layer, which encodes the data into binary signals that are then transmitted to the receiving device.
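The layering described above can be sketched as a payload being wrapped in one header per layer on the way down the stack, then unwrapped in reverse order at the receiver. The layer list is trimmed to three for brevity and the bracketed header strings are made up purely for illustration:

```python
# Each layer wraps the data handed down from the layer above
# with its own header; the receiver peels them off in reverse.
LAYERS = ["transport", "network", "data-link"]

def encapsulate(payload):
    for layer in LAYERS:                 # data moves down the stack
        payload = f"[{layer}]{payload}"
    return payload

def decapsulate(frame):
    for layer in reversed(LAYERS):       # outermost header comes off first
        prefix = f"[{layer}]"
        assert frame.startswith(prefix), "malformed frame"
        frame = frame[len(prefix):]
    return frame

frame = encapsulate("HELLO")
print(frame)               # [data-link][network][transport]HELLO
print(decapsulate(frame))  # HELLO
```

Notice that the data link header ends up outermost, which matches the real stack: it is added last on the way out and stripped first on the way in.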

We obviously do not see these steps as they occur, but the next time you use email, keep in mind that without abstraction layers to handle the tasks required, the data would be meaningless.

ACPI Advanced Configuration and Power Interface

ACPI consists of several components designed to manage power usage in your PC. For something that seems fairly minor in theory, it is very complicated in its application. It consists of several elements, such as the ACPI register set, the ACPI BIOS, the ACPI tables, and two operating system elements: the OSPM system code and the ACPI driver with the AML interpreter. The following chart indicates each of the elements of the ACPI, courtesy of the ACPI specification Revision 3.0a.



Before Windows 98, the power management system relied on the BIOS to control it. Since the release of Windows 98, more control has been granted to the operating system by way of AML, or ACPI Machine Language, embedded in the BIOS firmware. This gives the operating system access to the low-level information relating to the hardware, and thus more control over the various states devices can be placed in to manage power usage. Older systems had no way to interpret the AML and no clue how to handle the ACPI registers. The ACPI specifications outline standards to which manufacturers must conform to ensure that all devices, including motherboard chipsets, operating systems, and CPUs, can support current ACPI designs. The current ACPI specifications that I've based some of this outline on are from Revision 3.0a.

There are several categories of power management states, described as global states, processor states, device states, and performance states. There are also several levels within each category that the computer system can be placed in.

Global state G0 is the working state. The software is running, there is no latency (delay in response) and the power consumption is at its highest. The computer should not be disassembled in this state due to the risk of electrical shock and system damage.

The G1 state is considered sleeping, and it is further divided into modes S1 to S4. In S1 the CPU stops running and all unnecessary devices are shut down. S2 goes a little further than S1 in that the CPU is completely powered down. S3, also called Suspend to RAM (STR) or standby in Windows, powers only the RAM; the system still maintains all data and will resume without rebooting. In S4, called Hibernation by Windows, the data in main memory is saved to disk. This means that if the computer loses power, the only loss of data would be any unsaved documents.

G2, also known as S5 or Soft Off, is almost the same as the next state, G3, except the system is still using a minimum amount of power. The system must go through a restart and there is a long latency. The computer should not be disassembled in this state.

The G3 state is achieved by powering down the system; no power is being consumed except for the real-time clock. A hard boot is required to return the system to working mode. This is the only state in which the computer can safely be disassembled, after removing the power cord.

The next set of states is the device states. There are only four levels, D0 through D3. D0 is the state in which the device is operating and consuming power as it needs to. D1 and D2 are entered as required by each of the different devices. D3 requires no power, and the device will not communicate with the system; it must be reinitialized in order to be used.

Processor states have only four levels as well, ranging from C0 to C3. C0 is full operation. In C1, also called Halt, the CPU is not executing any instructions and can be returned to working mode in the shortest amount of time. C2, also known as Stop-Clock, takes longer to wake up than C1 and uses less power. C3, known as Sleep, takes the longest to return to working mode and uses the least amount of power.

Performance states are mostly defined by the manufacturers of processors. They range from P0 to at most P15, since the specification allows up to sixteen of them. In the P0 state, the processor or device is at its highest performance level and will use the most power. The remaining states are all lesser levels than P0 and will vary by processor.
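With four families of states to keep straight, a small lookup table can serve as a cheat sheet. The descriptions below are my own one-line summaries of the states covered above, not wording from the specification:

```python
# Cheat sheet for the ACPI states described above (summaries are mine).
ACPI_STATES = {
    "G0":    "working; full power, no latency",
    "G1/S1": "sleeping; CPU stopped, unnecessary devices off",
    "G1/S3": "suspend to RAM; only RAM powered, resumes without reboot",
    "G1/S4": "hibernation; main memory saved to disk",
    "G2/S5": "soft off; minimal power, full restart required",
    "G3":    "mechanical off; only the real-time clock draws power",
    "D0":    "device fully on",
    "D3":    "device unpowered; must be reinitialized",
    "C0":    "CPU executing instructions",
    "C3":    "CPU sleep; slowest to wake, least power",
    "P0":    "highest performance and power",
}

def describe(state):
    return ACPI_STATES.get(state, "unknown state")

print(describe("G3"))  # mechanical off; only the real-time clock draws power
```

A technician's rule of thumb falls out of the table: the only entry that is safe for opening the case is G3 with the power cord removed.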

The complete ACPI specification can be obtained online and contains over 600 pages of additional information including specific syntax requirements for device designers. If you have the time, it's well worth a look and a copy can be downloaded at www.acpi.info/.

Friday, February 15, 2008

Blog 2 Assignment ESD

ESD events occur naturally when two materials with different electrical potentials attempt to balance out. Human skin is one of the most easily charged materials, second only to air. The charge comes from an excess of stored electrons; when we come into contact, or proximity, with a device or substance holding fewer electrons, the balancing process can be seen as an arc. These static events can range from a few millivolts to several thousand volts. The voltage is not necessarily responsible for circuit damage; rather, the current, and how quickly it passes, defines the extent of the damage done.
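To put rough numbers on "how quickly the current passes," here is a back-of-the-envelope calculation using the standard human body model, which approximates a charged person as roughly 100 pF discharging through roughly 1.5 kΩ. The 8 kV figure is an assumed example, in the neighborhood of where a person starts to feel the spark:

```python
# Human body model (HBM): ~100 pF body capacitance, ~1.5 kohm series resistance.
C = 100e-12   # farads
R = 1.5e3     # ohms
V = 8000.0    # volts; assumed example discharge voltage

peak_current = V / R              # amps at the instant of discharge
time_constant = R * C             # seconds; the arc decays on this timescale
stored_energy = 0.5 * C * V ** 2  # joules dumped into whatever you touched

print(f"peak current  ~ {peak_current:.2f} A")        # ~5.33 A
print(f"time constant ~ {time_constant * 1e9:.0f} ns") # ~150 ns
print(f"energy        ~ {stored_energy * 1e6:.0f} uJ") # ~3200 uJ
```

Several amps for a hundred-odd nanoseconds is trivial to a person but enormous for a transistor gate a fraction of a micron wide, which is why damage can happen below the level you can even feel.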
Large amounts of current can cause complete component failure or a partial failure that gets worse over time. Complete failure is fairly easy to diagnose. Partial failure can cause odd component reactions intermittently which are never easy to diagnose. See photos below courtesy of SRI, Source Research Inc.

As indicated by the photos, an ESD event has damaged a board trace and a capacitor. The trace should fail more quickly than the pit showing on the cap. This could cause a slow failure over time.

Components can be damaged even before they are populated on the board. Proper ESD prevention can reduce the number and severity of these events. Never take a part from someone without touching them first to equalize the potential between you. Preferably, touch the hand that does not hold the part, or use an anti-static bag or foam to transport parts. Grounding bracelets, mats, and anti-static bags are the most popular preventative devices.

ESD can be destructive to high speed digital circuits even if you can't feel a shock. Keeping this in mind will keep you from adding more problems than you are trying to fix.

Blog 2 Assignment POST

I have selected the POST and ESD items from our list. Please drink lots of coffee before viewing.

Power On Self Test, better known as POST, is the first sequence in the pre-boot routine. This self-test program is run from the BIOS and verifies, through diagnostics, that all the systems in the computer are functioning properly. If a system passes POST, you will hear one or two beeps, depending on the type of system, as the boot program continues to run.
On a hard boot, the first system checked is the power supply. This check verifies the correct voltages are present by way of the power good signal. Some systems use a reset instead of a power good signal. Either way, the condition of this signal tells the system to continue or hold. The next check is the BIOS itself. It must report a valid checksum to indicate that it does not contain corrupt data. From here, different systems may alter the following sequence but the checks must all be done.
The CPU must indicate that it has completed its power-on reset and is capable of initializing communications with memory. Read and write tests are done to the first 64KB to verify bus, module, and controller function. The CMOS must also pass a checksum test to ensure it contains no corrupt data. The last series of tests checks that the I/O functions are working properly, including the video; again, a series of read and write operations is performed.
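The "valid checksum" idea can be sketched in a few lines. Classic BIOS and option ROMs are laid out so that all their bytes sum to zero modulo 256; POST adds up the bytes and rejects anything that doesn't come out to zero. The ROM contents below are invented for the example:

```python
def rom_checksum_ok(rom):
    """Classic 8-bit ROM check: all bytes must sum to 0 mod 256."""
    return sum(rom) % 256 == 0

# Invented ROM image: some data bytes plus one pad byte chosen
# at build time so the whole image sums to zero.
data = bytes([0x55, 0xAA, 0x10, 0x03])
pad = (-sum(data)) % 256
rom = data + bytes([pad])

print(rom_checksum_ok(rom))            # True: image is intact
print(rom_checksum_ok(rom + b"\x01"))  # False: one flipped/extra byte fails
```

A single corrupted byte is enough to throw the sum off, which is exactly why a failed BIOS or CMOS checksum halts the boot with a beep code instead of letting the system limp along on bad data.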
The soft boot routines are not quite so extensive. Much of the information from the hard boot is saved and does not need to be re-run. After POST has run, the operating system takes control and finishes initializing the hardware with the installed device drivers.
I won't bore you with each beep code and description, since they differ depending on the type of BIOS driving the system. They will show the obvious errors, such as BIOS and CMOS checksum errors, and some not-so-obvious errors, like system clock errors, PCI bus communication errors, coprocessor errors, and configuration errors. I have seen these errors personally and wouldn't wish them on any repair technician.
POST does not perform all the startup routines though. During startup, it may pass the initialization of certain devices off to other programs. In summary, the POST is not only a necessity for the computer to become operational, but is also the best tool you can use to troubleshoot a system.