Thursday, June 26, 2008

ADAPT: a knowledge-based decision support system for producing zoning schemes


Abstract. Few computer programs have been devised to assist local government planners draw up zoning schemes, despite the ubiquity of zoning schemes for expressing land-use plans. When one program, LUPLAN, representative of a broader class of plan-evaluation programs, was used to produce a zoning scheme, it was found to be fundamentally unsuited to what is essentially a political task. Perhaps the major difficulty lay in the reduction in LUPLAN of what is a highly complex decisionmaking task to a series of numerical manipulations.
The ADAPT program, which has been specifically designed to overcome this and other problems, leaves most of the decisionmaking to the planner, but assists by providing relevant knowledge and data about each conflict-choice situation as it arises, and by keeping a record of decisions made and their underlying reasons. The reasons for the development of ADAPT are reminiscent of those that have led to the recent development of decision support systems (DSSs) for organizational management purposes, and many of the techniques used by ADAPT are similar to those used in knowledge-based systems. DSSs and knowledge-based systems are both described in some detail since they form a basis on which ADAPT, and similar decision-aiding programs, can be further developed.

Cite as:
Davis J R, Grant I W, 1987, "ADAPT: a knowledge-based decision support system for producing zoning schemes" Environment and Planning B: Planning and Design 14(1) 53 – 66
Read More...

ABSTRACT - Web-Based Production Information System on PT BINA ILMU Surabaya

The aim of this final project (Tugas Akhir) is to build a web-based production information system for PT BINA ILMU Surabaya covering print-order receipt, the order printing process, material stock analysis, and material ordering and receiving.

The first step in building the system was to analyze the existing printing-production process; the system and database were then designed. The system design used a Data Flow Diagram (DFD) and a hierarchy chart. The database design used a Conceptual Data Model (CDM), consisting of 16 master tables, 4 transaction tables, and 10 supporting tables, which was generated into a Physical Data Model (PDM) and then implemented in a MySQL database. Next, the input and output forms and the program algorithms were designed, and finally the algorithms were implemented as a program in PHP 5. The resulting web-based production information system includes features for maintaining master data: adding, editing, and deleting records such as customer records, employee records, and supplementary-material records.

By building this system, a number of fundamental problems in PT BINA ILMU's information handling have been solved. Material supply analysis is supported by graphing and reporting facilities. Because all required data are integrated in a single database, the system can quickly and accurately produce reports such as production work orders and material-ordering reports.
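As an illustration of the master-table/transaction-table split described in the abstract (this is not the thesis's actual schema), the following sketch uses SQLite in place of MySQL so that it is self-contained; every table and column name here is hypothetical.

```python
# Illustrative only -- not the thesis's schema. A master table holds
# reference data (e.g., customers), while a transaction table records
# events such as incoming print orders. SQLite stands in for MySQL.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Master table: reference data that changes rarely.
cur.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )
""")

# Transaction table: one row per received print order.
cur.execute("""
    CREATE TABLE print_order (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        received_on TEXT NOT NULL
    )
""")

cur.execute("INSERT INTO customer VALUES (1, 'PT Example')")
cur.execute("INSERT INTO print_order VALUES (10, 1, '2008-06-26')")

# A report (e.g., a production work order list) is then a simple join.
cur.execute("""
    SELECT c.name, o.received_on
    FROM print_order o JOIN customer c USING (customer_id)
""")
print(cur.fetchall())  # -> [('PT Example', '2008-06-26')]
```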

Read More...

Tuesday, June 24, 2008

Self service software

Self service software is a subset of the Knowledge Management software category, comprising a range of software that specializes in the way information, process rules, and logic are collected, framed within an organized taxonomy, and accessed through decision-support interviews. Self-service software allows people to get answers to their inquiries and needs through an automated interview, instead of traditional search approaches.

Self Service Software Functionality
Self service software allows authors (typically subject matter experts) to automate the deployment, timeliness, and compliance of the processes they are responsible for communicating, without having to personally address the questions, needs, and solicitations of end users inquiring about the process being automated.
Self service software primarily addresses closed-loop inquiries, in which the author enumerates a finite set of known questions and the related known responses or required steps that must be worked through to derive and deliver a final answer or directive. The author codifies these known processes and steps, then generates (publishes) end-user-facing applications, which can target a variety of code bases and platforms.
Self service software is sometimes referred to as decision support software, or even as expert systems, and is typically categorized as a sub-topic within the knowledge management software category. It allows individuals and companies alike to address customer support, technical support, and employee support inquiries on demand: the person with a question interacts with the author's generated application via a computer, a handheld device, a kiosk, a register, or another machine to get answers as if directly talking to the author.
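As a concrete illustration of the closed-loop interview model described above, the sketch below encodes a toy author-codified question tree and walks a scripted user through it to a final directive; the domain, questions, and answers are entirely hypothetical.

```python
# Minimal sketch of a closed-loop decision-support interview: the author
# codifies a finite set of known questions and responses as a tree, and
# the end user is walked from question to question until a leaf answer
# is reached. All node content here is hypothetical.

class Node:
    def __init__(self, question=None, answer=None, branches=None):
        self.question = question        # prompt shown to the user
        self.answer = answer            # final directive, if this is a leaf
        self.branches = branches or {}  # maps a user reply -> next Node

def interview(node, replies):
    """Walk the tree using a scripted sequence of user replies."""
    for reply in replies:
        if node.answer is not None:
            break                       # already at a final directive
        node = node.branches[reply]
    return node.answer

# Author-codified knowledge: a toy printer-support flow.
tree = Node(
    question="Is the printer powered on?",
    branches={
        "no": Node(answer="Plug in and power on the printer."),
        "yes": Node(
            question="Is the status light blinking?",
            branches={
                "yes": Node(answer="Clear the paper jam."),
                "no": Node(answer="Reinstall the printer driver."),
            },
        ),
    },
)

print(interview(tree, ["yes", "no"]))  # -> Reinstall the printer driver.
```

The author maintains only the tree; publishing it behind a web form, kiosk, or handheld front end is what turns the codified knowledge into a self-service application.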
Read More...

Sunday, June 22, 2008

Create a risk management plan

Select appropriate controls or countermeasures to mitigate each risk. Risk mitigation needs to be approved by the appropriate level of management. For example, a risk concerning the organization's image should have a top-management decision behind it, whereas IT management would have the authority to decide on computer-virus risks.

The risk management plan should propose applicable and effective security controls for managing the risks. For example, an observed high risk of computer viruses could be mitigated by acquiring and implementing antivirus software. A good risk management plan should contain a schedule for control implementation and responsible persons for those actions.

According to ISO/IEC 27001, the stage immediately after completion of the Risk Assessment phase consists of preparing a Risk Treatment Plan, which should document the decisions about how each of the identified risks should be handled. Mitigation of risks often means selection of Security Controls, which should be documented in a Statement of Applicability, which identifies which particular control objectives and controls from the standard have been selected, and why.
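The treatment decisions above can be sketched as a simple structured record, with a Statement of Applicability derived from the selected controls; this is an illustrative data layout, not a format prescribed by ISO/IEC 27001, and every risk, control, owner, and date shown is hypothetical.

```python
# Hypothetical sketch of a risk treatment plan: each identified risk
# maps to a decision, a selected control (if any), a rationale, a
# responsible person, and a target date, matching the elements a plan
# should document. Values are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RiskTreatment:
    risk: str
    decision: str            # "mitigate", "transfer", "avoid", or "retain"
    control: Optional[str]   # selected security control, if any
    rationale: str
    owner: str
    target_date: str

plan = [
    RiskTreatment("Computer virus infection", "mitigate",
                  "Antivirus software on all workstations",
                  "High observed likelihood", "IT manager", "2008-09-01"),
    RiskTreatment("Damage to organizational image", "transfer",
                  None, "Covered by liability insurance",
                  "Top management", "2008-07-15"),
]

def statement_of_applicability(plan):
    """List each selected control together with why it was chosen."""
    return [(t.control, t.rationale) for t in plan if t.control]

for control, why in statement_of_applicability(plan):
    print(f"{control}: {why}")
```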

Implementation

Follow all of the planned methods for mitigating the effect of the risks. Purchase insurance policies for the risks it has been decided to transfer to an insurer, avoid all risks that can be avoided without sacrificing the entity's goals, reduce others, and retain the rest.

Review and evaluation of the plan

Initial risk management plans will never be perfect. Practice, experience, and actual loss results will necessitate changes in the plan and contribute information to allow possible different decisions to be made in dealing with the risks being faced.

Risk analysis results and management plans should be updated periodically. There are two primary reasons for this:

  1. to evaluate whether the previously selected security controls are still applicable and effective, and
  2. to evaluate possible changes in risk levels in the business environment; information risks, for example, change especially rapidly.
Read More...

Removable Hard Disk Drives

Although any hard drive can obviously be removed, the term removable hard disk drive refers to hard drives designed to be removed and reinstalled easily, without opening the case or disconnecting and reconnecting cables. There are two distinct types of removable hard disk drives:

Cartridge-based drives

Cartridge-based drives such as the Iomega Jaz and Castlewood ORB use a self-contained, sealed cartridge about the size of a thick 3.5-inch floppy disk. The cartridge contains only the disk itself. The head mechanism resides in the drive. You insert the disk into the drive much as you would a floppy disk. Inserting the disk causes a shutter on the disk to open, allowing the drive's head mechanism to read and write the disk. The Iomega Peerless system instead uses a cartridge that is essentially the HDA (head-disk assembly) of a standard hard drive. Cartridge-based units are available in internal and external versions, using IDE, parallel port, SCSI, USB, PC Card, or FireWire interfaces.

Cartridge-based drives have always been niche products, but are now obsolete in practical terms. Their raison d'être, transferring moderately large data sets between systems, is now better served by a DVD writer or similar industry-standard writable optical drives. For most purposes, cartridge-based drives are now too small, slow, proprietary, and expensive. The Castlewood ORB is the only cartridge-based drive that remains in production.

Frame/carrier-based drives

These drives are actually just modified drive bays that allow a standard hard drive mounted in a carrier assembly to be inserted and removed easily. The frame resides permanently in an external drive bay, and is connected permanently to power and to the IDE interface or SCSI host adapter. The carrier assembly contains power and data cables, which remain permanently attached to the hard drive. The rear of the carrier assembly contains a custom connector that routes power and data signals from the frame. The connector that mates the carrier to the frame is designed for durability, and is typically rated for 2,000 to 50,000 insertions and removals.

These devices are simply physical modifications that allow easy removal and insertion, so the system sees the drive as just another hard disk drive because it is just another hard disk drive. Frame/carrier assemblies are available for any hard disk interface, from IDE to Ultra320 SCSI. More sophisticated units support such functions as hot-swapping, sparing, and RAID, if your host adapter, drivers, and operating system also support those functions.

External Hard Disk Drives

External hard drives are a related class of storage device but do not qualify as true removable hard disk drives. They are similar to removable hard disk drives in that they allow large amounts of data to be moved between systems. They are dissimilar in that they do not use removable media.

External SCSI drives have been around for years, of course, but they have always been a niche product. External Plug and Play drives with USB or FireWire interfaces (or both) are becoming increasingly popular, particularly with notebooks. In effect, these devices are simply standard IDE hard drives in an external enclosure with a USB or FireWire interface.

The drives perform as you would expect a modern IDE hard drive to perform. In the past, the problem was the interface. FireWire was fast enough to use as a hard disk interface, but few computers had FireWire ports and the cost of adding FireWire to both a PC and notebook made this solution quite expensive. USB 1.1 was ubiquitous but too slow for reasonable hard drive performance. In 2002 systems began shipping with USB 2.0 interfaces, which are more than fast enough to support any current hard drive.

Pioneered by Maxtor with its Personal Storage 3000-series drives, external USB 2.0 hard drives proliferated as USB 2.0 became common. Competing models are available from Maxtor, Western Digital, Iomega, CMS, QPS, and others with capacities as high as 250 GB or more. Although these drives can be used just like any other hard drive, they are marketed as backup/archive solutions. Makers generally bundle software such as Dantz Retrospect that allows backing up your internal hard drive to the external drive with just the push of a button. As the drive fills up, it's an easy matter to delete old backup data to make room for new. We have serious concerns about using an external hard drive as your only backup solution, but it's undeniable that these drives make backing up fast and easy.

Read More...

USB Communications

The first PCs shipped in 1981 used serial ports and parallel ports to connect external peripherals. Although the RS-232 serial and Centronics parallel technologies had improved gradually over the years, by the mid-'90s those technologies had reached their limits. In terms of connectivity to external devices, the PC of 1995 differed very little from the PC of 1981; the ports were a bit faster, perhaps, but they were fundamentally similar.

In the interim, the bandwidth needs of external peripherals had increased greatly. Character-mode dot-matrix and daisy-wheel printers had given way to graphic-mode page printers. Modems were pushing the throughput limitations of RS-232. Also, it was obvious that emerging categories of external peripherals—such as digital cameras, CD writers, tape drives, and other external storage devices—would require much more bandwidth than standard serial or parallel connections could provide. Neither was bandwidth the only limitation. Serial and parallel ports have the following drawbacks for connecting external peripherals:

Low bandwidth

Standard serial ports top out at 115 Kb/s, and parallel ports at 500 Kb/s to 2 Mb/s. Although these speeds are adequate for low-speed peripherals, they are unacceptably slow for high-speed peripherals.

Point-to-point connections

Standard serial and parallel ports dedicate a port to each device. Because there is a practical limit to the number of serial ports and parallel ports that can be installed in a PC, the number and type of external devices that can be connected are limited.

Resource demands

Each serial or parallel port occupies scarce system resources, in particular an IRQ. A PC has only 16 IRQ lines, most of which are already occupied. It is often impossible to install the required number of serial or parallel ports because insufficient interrupts are available.

Ease-of-use issues

Connecting devices to serial or parallel ports may be complex and trouble-prone because cable pinouts and port configurations are not well-standardized. Serial ports in particular accept a wide variety of different cables, none of which is likely to be interchangeable with any other. Parallel ports use more standardized cable pinouts, but various parallel devices may require different port configurations. In particular, attempting to daisy-chain parallel devices via pass-through ports often introduces incompatibilities. Also, serial and parallel ports are always located on the rear of the computer, which makes connecting and disconnecting them inconvenient.
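To put the bandwidth drawback in perspective, a quick back-of-the-envelope calculation shows how long a 100 MB transfer takes at the nominal signaling rates quoted above (and at the USB rates for comparison); real-world throughput is lower because of protocol overhead.

```python
# Rough transfer-time comparison at nominal signaling rates.
# The serial and parallel figures come from the text above; the USB
# figures are the interfaces' nominal bus rates, for comparison.
RATES_BITS_PER_SEC = {
    "RS-232 serial (115 Kb/s)": 115_200,
    "Parallel ECP (2 Mb/s)":    2_000_000,
    "USB 1.1 (12 Mb/s)":        12_000_000,
    "USB 2.0 (480 Mb/s)":       480_000_000,
}

file_bits = 100 * 1024 * 1024 * 8  # 100 MB expressed in bits

for name, rate in RATES_BITS_PER_SEC.items():
    seconds = file_bits / rate
    print(f"{name:28s} {seconds:10.1f} s")
```

At 115 Kb/s the transfer takes roughly two hours; at USB 2.0's nominal rate it takes on the order of seconds, which is why page printers, scanners, and external storage outgrew the legacy ports.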

What PCs really needed was a fast bus-based scheme that allowed multiple devices to be daisy-chained together from a single port on the PC. SCSI had the potential to fulfill this need, but its high cost and complexity made it a nonstarter for that purpose. IEEE-1394, also called FireWire, might have been suitable, but FireWire is a proprietary Apple technology with, at the time, high licensing costs that motherboard and peripheral makers refused to pay. The PC industry had long been aware of the need for better external peripheral connectivity, but it was not until 1996 that vendors finally began to address it. Their solution is called Universal Serial Bus (USB).

USB is aptly named. It is universal because every modern PC or motherboard includes USB and because USB allows you to connect almost any type of peripheral, including modems, printers, speakers, keyboards, scanners, mice, joysticks, external drives, and digital cameras. It is serial in that it uses serial communication protocols on a single data pair. It is a logical bus (although the physical topology is a tiered star) that allows up to 127 devices to be daisy-chained on a single pair of conductors.

One convenient way to think about USB is as an outside-the-box Plug-and-Play bus. All connected USB devices are managed by the USB Host Controller Interface (HCI) in the PC, and all devices share the IRQ assigned to that HCI. Devices can (in theory, at least) be plugged or unplugged without rebooting the computer.

Although nearly all PCs and motherboards made since 1997 have USB ports, for a long time those ports were nearly useless, for three reasons:

· USB requires native operating system support to provide full functionality. Until Windows 98 and Windows 2000 began to proliferate, that support was lacking. Windows NT 4 and early Windows 95 releases have no USB support, although a few peripheral makers provided custom drivers to allow their devices to work under these operating systems. Windows 95 OSR 2.1 introduced limited support for a few USB devices, but using USB under Windows 95 is an exercise in frustration. Windows 98/98SE/Me/2000/XP support USB 1.1. Windows XP supports USB 2.0 natively if SP1 or later is applied, although you may need to download the latest release of the USB 2.0 driver from the Windows Update site. Even with the latest service pack installed, Windows 2000 does not support USB 2.0 directly, although you can download native Windows 2000 USB 2.0 drivers from the Windows Update site. For more information about USB 2.0 support under Windows 2000 and Windows XP, see Knowledge Base articles 319973 and 312370, respectively. The Linux kernel has included USB support since 2.2.18. The Linux 2.4.20 or later kernel supports USB 2.0 directly.

Only Windows 2000 and XP officially support USB 2.0, but many PCI USB 2.0 interface cards are available that include Windows 9X USB 2.0 drivers supplied by the hardware vendor. The only PCI USB 2.0 interface card we have used is the Adaptec USB2connect, which operates properly with the supplied drivers. But our readers report that many other brands of PCI USB 2.0 adapter cards provide fast, reliable USB 2.0 support under Windows 9X.

· USB peripherals were hard to find prior to 1999, and were often more expensive than versions that used legacy interfaces. By 2000, that situation had reversed itself, with USB peripherals readily available and often cheaper than peripherals with legacy interfaces. As of July 2003, nearly all mainstream external peripherals use the USB interface, and old-style serial and parallel peripherals are becoming hard to find.

· Early USB ports and peripherals often exhibited incompatibilities and other strange behavior. Removing a connected peripheral might crash your system, or a newly connected device might require a reboot to be recognized. Some peripherals demanded that their drivers be reinstalled every time they were disconnected and then reconnected. Some peripherals drew so much power that other devices on that USB port would cease operating or the system would refuse to boot until the offending device was disconnected. And so on. In fact, these conflicts and incompatibilities remain a problem with more recent USB interfaces and devices, although the problems are less severe. As of July 2003, it appears that the teething pains USB experienced during its early days have largely been overcome, although even some very recent motherboards and chipsets continue to cause problems.

Despite these problems, by mid-2000 USB had achieved critical mass. With Windows 98/SE/Me and Windows 2000 available and USB peripherals shipping in volume, USB transitioned from a developing standard with great potential into a real-world solution, albeit a flawed one. USB has now largely replaced the legacy connectors that clutter the back of recent PCs.

Legacy-reduced motherboards that began shipping in 2000 replaced or supplemented serial and parallel ports with additional USB ports—usually four rather than the previously standard two. Legacy-free motherboards provide nothing but USB ports for connecting external peripherals (other than perhaps video), and are usually equipped with six USB ports—four at the rear and two on the front panel. A few legacy-free motherboards also include IEEE-1394 (FireWire) ports. Most external peripherals now have only a USB interface, as serial and parallel peripherals now teeter on the edge between obsolescent and obsolete.

Despite its slow start and the nagging problems that still sometimes plague it, USB has moved from being the wave of the future to being the current standard. This chapter tells you what you need to know about USB.

Read More...

Saturday, June 21, 2008

To See A Knowledge Based Society in Indonesia

Onno W. Purbo
(onno@indo.net.id)

A simple vision, “to see a knowledge based society in Indonesia”, has been the basis of most of my activities over the last ten (10) years. It is a challenging endeavor to pursue.

Back in the early 1990s, while I was completing my Ph.D. in EECS at the University of Waterloo, Canada, the Internet was in its infancy. Computer-based discussion groups and resources shared over the network were a common mode of interaction. Studying abroad with heavy network exposure broadened my view of technology, as well as of society, culture, and life.

Being a licensed ham radio operator with callsign YC1DAV/VE3 opened an opportunity to explore various ways of implementing a TCP/IP network over radio, based at VE3UOW, the club station at the University of Waterloo. Interaction with Indonesian YB colleagues took place via amateur store-and-forward messages on a 300 bps network, and sometimes via amateur low-earth-orbit satellites. Knowledge gained in VE-land was distributed back to Indonesia over the slow radio network, but it provided sufficient ingredients for my fellow YB-land ham radio operators to start digging into the technology and building the network, led by Robby YB1BG, Ben YB0EBS, YB1HR, YB2SV, and others.

Merging these two (2) experiences clearly showed me the impact that knowledge transparency, running on an information/digital platform, has on societal empowerment. Education and open systems have demonstrated their power to empower society, and in many cases community-based technologies pose a challenge to inefficient incumbent operators. Transparency in technology and in IT knowledge is the major key to building community-based technology and networking, creating the information-access infrastructure the Indonesian people need to gain the knowledge and information required to move toward a knowledge-based society. These ideas gradually accumulated into a clear vision and a strong motivation and direction for my future life.

In 1993, after completing my Ph.D., I served as a young lecturer at the Institute of Technology in Bandung (ITB). I worked with students to build a 1200 bps TCP/IP radio network among schools and various institutions in Indonesia as part of the early Indonesian Internet. No funding was received from ITB, the World Bank, or the Indonesian government; everything was self-financed. Consequently, used 286 machines and homebrew radio modems connected to handheld 144 MHz transceivers were the common configuration among us, with the amateur radio KA9Q NOS software serving as the mail server on a 286 machine. Free access to the Internet was provided by the University of Indonesia (http://www.ui.ac.id) and the Ministry of Science & Technology (http://www.iptek.net.id). Knowledge distribution was the key to expanding the network; giving away all documents and software (including source code) for free was common among us. This provided significant motivation for others (close to a hundred institutions and personal nodes) to join the bandwagon of the early Indonesian Internet. It was free, anyway.

In 1995, FreeBSD and Linux were deployed as the major network operating systems on our servers and routers. A campus MAN was built at ITB; as expected, no government or institutional money was used, and most of the funding came from the community itself as demand grew. Thus a demand-creates-supply strategy was used rather than supply-creates-demand, since it costs much less and suits a community-based development approach. As expected, in some cases we had to face authoritarianism from the campus authorities, since we bypassed their bureaucratic network.

Knowledge sharing and interaction among us took place mostly through Internet mailing lists. The first major mailing-list server in Indonesia was hosted at ITB, running majordomo@itb.ac.id. Currently, most knowledge sharing and cyber-community building in Indonesia is done through yahoogroups.com mailing lists.

The experience gained in implementing community-based networks was published in many articles and papers in various Indonesian media, and knowledge was disseminated generously in seminars and workshops. Currently, an average of 3-5 seminars on the Internet are organized each week, a hectic schedule for those who give the talks. Most (if not all) of our papers, articles, and slides since the early ’90s can be downloaded for free from sites such as http://www.bogor.net/idkf/ and http://louis.idaman.com/idkf/. A copyleft notice is attached so that those who cannot attend the physical meetings can still gain the knowledge.

In September 1996, our group led ITB's connection to the Asia Internet Interconnection Initiatives (AI3), led by the WIDE Project in Japan. A Ku-band ground station was installed for the first time in Indonesia to obtain 2 Mbps Internet access to the WIDE network, part of the Asia Pacific Advanced Network (APAN) within the Asia Pacific Information Infrastructure (APII), connected directly to StarTAP and Internet2 in the US. Heavy Internet research activities were performed, and the broadband access in our research network was used to integrate various universities and schools in Indonesia. More than 20 institutions joined at their own expense, with no significant funding from the government. In addition to international conference papers, most of the research results were published in popular Internet books written in Indonesian. Close to 20 books on the Internet have now been published by our group (the Computer Network Research Group at ITB), helping Indonesians understand Internet technology.

In 1999, ITB asked me to lead its main library. It was a challenging task, as the average salary of its ~90 staff was only US$20-30 per month. While working hard to increase staff income through self-financed seminar activities, injecting a network culture into the staff, and building the infrastructure, a big picture of “knowledge management” emerged. With free access to the computer facilities at the center, the students started experimenting and gradually built the ITB Digital Library at no expense to the institute. Led by Ismail Fahmi (a former student of mine) of the Knowledge Management Research Group at ITB, the Ganesha Digital Library now connects more than 20 other digital libraries in Indonesia and has received funding of approximately CAN$60,000 from IDRC Canada. Their aim is to accumulate the knowledge generated in Indonesia and make the knowledge cycle more efficient, one of the main ingredients in building a knowledge-based society.

In February 2000, I resigned as a lecturer at ITB and as an Indonesian civil servant. I wanted to dedicate my life to educating the Indonesian people, not just a few students at ITB. Since then, I have worked for no one, spending my time at home writing articles, staying active on the mailing lists, and giving talks. My focus is on Internet access infrastructure for the Indonesian people, since it is the most strategic entry point for enabling society to gain access to information and knowledge. High telecommunication tariffs, expensive computing facilities, and a lack of IT knowledge are the major barriers keeping ordinary people from accessing the Internet.

Internet-sharing technology, such as that used in Internet cafés, is the most practical solution for most parts of Indonesia. Several books on Internet café technology have been published, and more detailed information, such as a simple business plan (in Excel), has been distributed for free. Communities have been built around Internet mailing lists such as asosiasi-warnet@yahoogroups.com, asosiasi-warnet-broadband@yahoogroups.com, and indowli@yahoogroups.com. These interactions helped raise the number of Internet cafés in Indonesia from fewer than 100 in 1999 to ~2500 in early 2001. An investment of less than US$10,000 usually pays for itself in under one (1) year. Investors, online media, and dot-commers now look at Internet cafés as a main alternative, knowing that 60-70% of Internet users outside Jakarta access the Internet through these facilities. Myohdotcom, for example, is planning to invest in ~6000 Internet cafés in Indonesia.

Because the telco's infrastructure is expensive and of low quality, some of these Internet cafés now use 2-11 Mbps wireless broadband in the 2.4 GHz band; currently, ~200-300 Indonesian Internet cafés run on it. Some even cooperate to buy direct international Internet bandwidth via satellite, through providers such as Interpacket and GlobalOne, so that no slow, expensive service from the incumbent operator is needed. The 2.4 GHz technology was developed in early 1998 by my students at CNRG ITB and has been distributed through various media.

I understand the effort is still in its infancy. With only ~2 million Internet users out of Indonesia's population of 200 million, the task ahead of me in realizing the vision is not easy. Educating people is the primary strategy, and a self-funding scheme assures the sustainability of the movement. At least five (5) awards have been received thus far. I believe God will always help those who do good deeds.

I am fortunate to have been able to live this far and share my knowledge with others while holding no permanent job (my choice of life), living happily with my wife Lina and our five (5) children, Ito, Reza, Atik, Darry, and Dsaq, in our tiny, simple house.

Read More...