
Monday, October 5, 2009

Factors for Publishing Research in Top-tier Journals

To publish is to make content publicly known. However, for an article or paper to be published, it must meet certain criteria. Reviewers and critics normally weigh a set of factors when choosing which papers to publish. With regard to research work, top-tier journals such as those of the ACM set high standards for selecting papers. ACM sets high standards of quality for its publications, and it is paramount to uphold these standards in the development and selection of content.

According to Kate Devine's (2001) article, “Writing a paper that will get published”, Jeff Skousen, professor of soil science at West Virginia University, points out that there are varying levels of journal prestige and not every paper qualifies for the most well-known. Top-tier journals usually reject more than 50 percent of the papers submitted to them, and some have rejection rates as high as 70 percent, Skousen says. These journals are rigorously edited and require very sound science and results that have meaning and application in the field. Other journals have a much lower rejection rate and are not as tightly edited, but they generally contain good research.

While originality can be a persuasive factor, "People are still able to get their work published even if it seems similar to previously published work," observes Skousen. Although one may think that most of the pertinent questions in a subject area would be answered after long periods of testing and experimentation, that does not seem to be the case, he continues. "I'm surprised that there are not that many new ideas in our journals today compared to past decades," he remarks. "Sure, we get new instruments and tools that allow greater precision or accuracy of measurement, but the ideas are not that dissimilar, nor are the results that dissimilar after data collection and interpretation." Flower-Ellis predicts that more and more papers will be assessed as "valuable confirmations" rather than as "original contributions to knowledge." Another consideration is the manuscript topic. A hot topic "is more likely to be published than is an equally sound paper dealing with a currently unfashionable subject," says Flower-Ellis. "The scientific community does display some of the proverbial characteristics of lemmings, in publication no less than in choice of research area."
The criteria publishers use as measures for accepting a paper vary a lot more than is sometimes realized, notes Lewenstein. Publication is not a cut-and-dried process; it is infinitely variable and flexible. In particular, "peer review" is not a simple criterion, he continues. Some journals may send an article to three to five reviewers, and the editors make an informed judgment by weighing all the reviews. Other journals may send an article to a single reviewer and make a simple yes/no decision based on that one review. Some journals may do a great deal of editorial work with an author, while others take manuscripts more or less as submitted. Although the review process can be flexible, acceptance criteria are relatively standard. The experts consulted offer simple advice for optimizing publishing success. Many say influential factors include clarity, originality of thought, novelty of finding, organization, completeness, and good writing. The experts' advice may seem evident. Skousen, however, states that the most elegant research is usually simple and direct. According to Byrne, who published an article last year on common reasons for manuscript rejection, flawed or poorly planned study design and lack of detail in methods were the two elements that most often led to rejection.
In addition, a blog article stated that the factors most often considered by top-tier journals in recognizing a research paper as qualified for publication are uniqueness, relevance, presentation, and credibility.

Uniqueness or originality means that a research topic is uncommon: something yet to be widely known and different from the average topic studied. The relevance of the paper is also a major consideration for publication; it concerns the significance the work delivers to society. How a paper is presented is also an issue, and the paper should take the form of proper documentation. Another factor is the researcher's credibility to conduct the study. To some reviewers, it is important to know that the researcher is reliable, because people normally give credit or recognition based on the researcher's previous successful studies.

Evaluation of Research Work - Good or Bad?

Various criteria govern how a research study should be properly assessed. There are different standards for defining and identifying good research work, and they vary according to each reviewer's method of analysis and evaluation.

There are different kinds of research work; however, the decisive factors for evaluation are comparable in many ways. According to answers.com, good research is well-informed, thorough, intelligent, and systematic; it allows for the possibility that one may be mistaken, and it allows for verification. Good research should be well grounded in its subject and detailed in the information regarding the focus of the study. The research must follow specific methods; it must be organized and ordered. Moreover, it should be falsifiable and should be capable of being confirmed and verified.

The study should be able to answer and support all questions related to the topic. It should contain sufficient information and resources to back up its claims. If it deals with statistical matters, it must be precise and accurate. Figures and diagrams have to be explained and elaborated in a manner that can be understood. The terminology used should be simple and clear.

Taken from reportbd.com (2009), research should satisfy certain criteria. The purpose of the research should be clearly defined and common concepts should be used. The research procedure should be described in sufficient detail to permit another researcher to repeat it for further advancement, preserving the continuity of what has already been attained. The procedural design should be carefully planned to yield results that are as objective as possible. The researcher should report, with complete frankness, any flaws in the procedural design and estimate their effects upon the findings. The analysis of data should be sufficiently adequate to reveal its significance, and the methods of analysis used should be appropriate. The validity and reliability of the data should be checked carefully. Conclusions should be confined to those justified by the data of the research and limited to those for which the data provide an adequate basis. Greater confidence in the research is warranted if the researcher is experienced, has a good reputation in research, and is a person of integrity.

According to faculty.goucher.edu (2009), a research paper is evaluated on four (4) criteria: sources, thesis, audience, and mechanics and documentation. The article is mainly intended for evaluating students' research papers.

I. Sources: Does the paper use the right kinds of scholarly or popular-scholarly sources to support its claims? This asks whether the sources used to support the paper are intellectually sound and logical. Is the paper based on at least some recent article-length sources? Articles are the sources of the most recent and most tightly focused analysis on a topic. Having up-to-date sources of information is also significant, so gathering the latest information and updates on the topic should be observed. Are the sources recent enough to be persuasive? Scholarship in the social and natural sciences becomes outdated quickly, and conclusions based on out-of-date evidence fail to persuade. Students who want to succeed in these majors must become persistent enough researchers to seek out the most recent and authoritative sources on their topics. Humanities sources underwent immense theoretical upheavals in the last decades of the twentieth century, and in many fields, secondary scholarship written much before 1980 can be suspect or unacceptable because its analytical methods are controlled by theoretical assumptions that are no longer accepted. The fields cannot engage in wholesale book-burning and web-site erasure to eliminate these problematic sources, but an early part of humanities majors' upper-division work involves becoming familiar with the currently acceptable theories and analytical methods, and with the sources from earlier scholars' work that are still acceptable.

II. Thesis: Is the paper organized by an independent thesis which at least uses reasoning and/or evidence from one article to contribute substantively to the reasoning and/or evidence in another, thus avoiding mere summary of the research? Is the thesis carefully composed to avoid claiming absolute knowledge when its evidence supports only possible or probable conclusions? Is the thesis supported by logically sound reasoning?
These questions ask whether the author has moved beyond the stage of merely reporting what others say, and into the stage of being able to think creatively about the topic. Early attempts to do this may be tentative and uncertain. To protect your reputation for careful thinking, make sure you distinguish clearly among certain, probable, and possible conclusions. Be content to claim your conclusions are "possibly" correct unless you can eliminate many of the contending conclusions to claim they are "probably" correct. Do not claim your conclusions "certainly" explain the evidence unless you have eliminated all alternative explanations. Logical fallacies often arise because writers unconsciously struggle to force their research to support their earliest intuitions, guesses, hunches, or hypotheses about what is true. (Think of how often you heard high-school writers say "I'm going to do some research to get sources that support my thesis.") Beware your own prejudices about what you think the evidence will reveal before you've impartially examined it. Let the evidence speak and you can hardly go far wrong.

III. Audience: Does the paper address a scholarly audience and correctly estimate the level of knowledge that audience can be expected to possess? Does it avoid telling experts obvious things, like defining terms of art or basic concepts, providing needless "background," and identifying experts to each other with unnecessary specificity (e.g., "the biologist Lewis Thomas" in a paper addressed to biologists)? Does it always specify the source of generalizations about evidence by correct citations of scholarship?

IV. Mechanics and Documentation: Does the paper use standard academic English usage and sentence construction, coherent and well-ordered paragraphs, logical paragraph transition, and a fully functional title, introduction, and conclusion? Does the paper accurately and consistently use a documentation style appropriate to the discipline (MLA, APA, CBE, or U. Chicago), or does it at least use MLA style accurately and consistently?

Be especially careful when using terms of art and jargon from the discipline you're just entering. As an "apprentice," you may make mistakes that a more experienced scholar would not make, and they're the kind of mistakes that damage your authority, so you should pay special attention to those peculiar kinds of words and phrases.
Double your efforts to proofread your final draft in order to catch those old errors that come back when you least want them to appear. You can prevent one typical source of dangerous errors if you start your paper's first draft with a list of sources, accumulated as you research and properly formatted in the documentation style appropriate for your topic's discipline. This is far too important to leave for the last five minutes of the writing process, and if you develop the habit of doing it early you will save yourself countless disappointments in later papers. Just build the paper on top of that source list, add to it every time you develop a new source, and you can spend your last hours polishing your prose rather than worrying about documentation format.

The citations I have itemized are some references on how research work is evaluated. As I have mentioned, there are various ways of evaluating research work, and they usually vary with the critic or reviewer. To further analyze and weigh the appropriate evaluation procedures, a deeper study of the range of criteria should be undertaken.

http://wiki.answers.com/Q/What_are_the_qualities_of_good_research
http://www.reportbd.com/articles/57/1/Criteria-Qualities-of-Good-Scientific-Research/Page1.html
http://faculty.goucher.edu/eng105sanders/research_paper_evaluation_criter.htm

Research ?= Future Career

What do you think is the role of research topic in deciding your future career?

This has been, by far, one of the toughest questions I have encountered. It has actually made me think profoundly about many matters. It has made me reflect on the significance of a research study to our future profession, and it has even made me thoroughly analyze whether there is a relationship between the two ideas.
I am no genius; only an average, a mediocre type, I suppose. What I have attained at the moment is the product of my determination and hard work. I am not degrading or looking down on myself and my capabilities. It is just that, after reading the question, I started thinking about how my future career would turn out, and I began evaluating my competence and potential, particularly in the field of IT.

I have thought of numerous research topics, but I suppose none was suitable enough to be considered. It was a frustration. I was not able to come up with a good research topic. That was when thoughts hit my mind: Am I not capable enough? Am I suited to this field? Do I have any interest at all in IT? Well, it was really bothersome. I guess this is where the connection lies between a research topic and settling on my future career in IT.

Research is basically developing and improving an existing initiative. It is conducting further search and investigation into a particular development or discovery, and it is through research that critical thinking is practiced. In my perspective, an individual possessing adequate knowledge of a certain field of expertise should be able to come up with research ideas relating to it. Familiarity, awareness, and intellectual capacity in a specific field heighten the probability of originating or putting together a topic for exploration and analysis.

To be able to think up or formulate an exceptional subject matter for research is to shape the opportunities that will be laid before someone. With a good research study, possibilities, chances, or even breaks may open up to the researchers. A breakthrough in a study may well lead to a successful career path. I believe this would define and assess your capacity and range of knowledge. Your research topic may be the determinant of the area you are most engrossed in, and through this, validate where your field of interest lies. This may determine and establish a definite forte or strong point of an individual, particularly in the field of IT.

The establishment of a research study, its conduct, and principally its completion would be a great contribution to the information technology community. It would also represent an individual's involvement in the progress of IT. Through this, new information and further learning would be made available to the community, particularly the IT society.

Not only may an individual contribute to society by conducting a research study, but the study also provides a great chance of being compensated for the development of an initiative. Everything today is of value. A topic itself possesses its own worth, even more so if the study is productively undertaken. There is always a valid equivalent value for everything shaped out of a person's potent mind. These ideas are owned by and of value to an individual, and we are all aware that such possessions give their owners certain legitimate rights over the property. We normally refer to this right as intellectual property rights. It is crucial for developers and researchers to be aware that formulating a topic carries significant value in itself.

We should be mindful of the significance of a research topic on its own. It may seem of little value, but it is the stepping stone to proceeding with and conducting a research study. It is where the undertaking of a study commences, the element of a research study that is the root and origin of the whole process. This part of research notably shapes and influences an individual's upcoming career, and thus we should give it value.

Tuesday, August 4, 2009

Computer Science Research Current State

Computer science is an extensive field of information technology. In these fast-paced times, progressive research in computer science is widespread. Accordingly, there is a variety of articles and information regarding computer science research, its state, and its technical issues and concerns, about which various companies and organizations hold notions and opinions grounded in their individual activities and undertakings. One of them is IBM's Almaden Research Center.

The Computer Science group at IBM's Almaden Research Center leads the next generation of research in information-based computing, user productivity, healthcare IT, and the theoretical foundations of computer science. They invent new algorithms and architectures for finding, integrating, managing, analyzing, and protecting information, and they explore and prototype new modes of user interaction. They are also creating the technical underpinnings for a national healthcare network and doing fundamental research into game theory and mechanism design, lattices, and the theory of semantic mappings. Their innovations have played a role in creating entirely new disciplines such as relational database management and have also led to novel research areas such as information mining, schema mapping, data disclosure management, and activity management.

The said group explores these areas from theory to systems, and from the laboratory to practice. They apply their technology to problems faced by companies in a broad range of industries including finance, healthcare, telecommunications and retail, directly interacting with clients and leveraging IBM's unparalleled technical sales and services teams. They pursue a mix of medium- and long-term research, including high-risk/high-reward projects with the potential for disruptive and revolutionary business impact. They measure their success by their strong publication record and their impact on IBM’s business and the world. The IBM’s Almaden researchers are active and respected members of their scientific communities and collaborate with universities.

Another article is the summary report by Professor Krithi Ramamritham, presented by the Office of Naval Research, Asian Office, about computer science research in India. It takes up the state of India's research in computer science: has India realized its potential for high-caliber computer science research? The summary paper addressed this question in the context of computer science. It stated that India has shown the potential for high-caliber computer science research on the basis that India's pool of technical manpower is one of the largest in the world, the growth rate of India's software industry has been tremendous in the recent past, and the demand in the West for students from India's top science and technology educational institutions has been very strong.

Many computer science researchers in India have endeavored to carry out high caliber research in spite of limited infrastructure and resources to conduct and communicate their research. Many of the researchers are involved in collaborative activities with institutions abroad, primarily in the US. Given the untapped potential that exists, researchers abroad may find it profitable to seek collaborators within India to further their research goals.

The Indian computing industry must play its part, providing challenging opportunities that will bring out the potential in its employees. It cannot afford to rest on its current laurels for too long. The country needs technically oriented leadership in the government to make this possible. Areas requiring attention include better incentives and appreciation for good research productivity, prudent management and use of research funding, enhanced salary levels for researchers at all ranks, appropriate means to set and demand accountability, and improved opportunities for peer interactions within India as well as for interactions between industry and academia.

In reference to the College of Engineering of NC State University, their key research areas are in Theory (Algorithms, Theory of Computation), Systems (Computer Architectures and Operating Systems, Embedded and Real-Time Systems, Parallel and Distributed Systems, Scientific and High Performance Computing), Artificial Intelligence (Intelligent Agents; Data-Mining, Information and Knowledge Discovery, Engineering and Management; eCommerce Technologies; Information Visualization, Graphics and Human-Computer Interaction), Networks (Networking and Performance Evaluation), Security (Software and Network Systems Security, Information Assurance, Privacy), Software Engineering (Requirements, Formal Methods, Reliability Engineering, Process and Methods, Programming Languages), and Computer-Based Education. The department has a number of teaching and research laboratories, centers and other facilities that support its educational and teaching mission.
Based on Wikipedia’s definition, computer science, or computing science, is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. Its branches include the theory of computation; algorithms and data structures; programming languages and compilers; concurrent, parallel, and distributed systems; software engineering; computer architecture; communications and security; databases; artificial intelligence; computer graphics; and scientific computing. From these, we can relate the activities of the organizations above, and their individual future undertakings, to the field of computer science.

Based on the article “The Future of Computer Science Research in the US”, on May 12, 2005, in Washington, DC, computing researchers informed a receptive congressional panel that the nation's dominant leadership position in information technology is at risk from cuts in research funding and changes in focus at federal mission agencies. The Computing Research Association, in written testimony endorsed by five other computing-related organizations, told the committee that the changing landscape for federal support of computing research threatens to derail the “extraordinarily productive” research enterprise that has enabled the innovation driving the new economy.

The joint testimony notes a number of factors that imperil U.S. long-term leadership in IT, including DARPA's withdrawal from its historical support of university-based computer science research and cuts to the proposed IT research budgets at NIST, NASA, the Department of Energy and the National Institutes of Health.
"The impact of IT research on enabling innovation resonates far beyond just the IT sector," said James D. Foley, Chair of CRA and professor of computer science at the Georgia Institute of Technology. The computing research community's testimony concluded with a call for the U.S. to maintain leadership in IT. "The U.S. still has the world's strongest capability in fundamental research in IT, and the most experience in how to leverage that capability towards economic growth," Foley said. But there are risks in letting uncertainty about funding that research linger.

The conditions and situations of computer science research in diverse organizations vary in their extent of productivity and significance. Looking at their different states, we can see that some computer science research programs are ramping up investment and development, while others are deteriorating, and some give little consideration to research's importance and advantages to society.

http://www.almaden.ibm.com/cs/
http://www.fas.org/nuke/guide/india/agency/krithi2.html
http://www.csc.ncsu.edu/research/
http://www.cra.org/govaffairs/blog/archives/000334.html

Saturday, July 25, 2009

Sample Scientific Researches

Which database is more secure?
Oracle vs. Microsoft

David Litchfield [davidl@ngssoftware.com]

Summary:

The paper examined the differences between the security postures of Microsoft’s SQL Server and Oracle’s RDBMS based upon flaws reported by external security researchers. Only flaws affecting the database server software itself have been considered in compiling this data. A general comparison is made covering Oracle 8, 9 and 10 against SQL Server 7, 2000 and 2005.

The paper counts the security flaws in the Oracle and Microsoft database servers that were discovered and fixed from December 2000 to November 2006. Graphs indicate flaws discovered by external security researchers in both vendors’ flagship database products, namely Oracle 10g Release 2 and SQL Server 2005. No security flaws had been announced for SQL Server 2005. It is immediately apparent from the result graphs that Microsoft SQL Server has a stronger security posture than the Oracle RDBMS. The conclusion is that if security robustness and a high degree of assurance are concerns when looking to purchase database server software, then given the results, one should not be looking at Oracle as a serious contender.

Evaluation:


From my standpoint, I believe that conducting such research assists database users in choosing which database performs better in terms of security. Comparing the security postures of two particular databases provides insight into how these databases handle security. Assessing the paper's format and flow, I would say it is more of a statistical study. I am not certain how the results and data were acquired; in fact, the paper did not properly define its methodology, nor did it include an abstract. Regardless of that issue, however, the study is competent research. It is definitely of great significance and contribution to the concerned database users, but I would suggest that the construction of the paper be further improved.

Analysis of an Electronic Voting System
Tadayoshi Kohno, Adam Stubblefield, Aviel D. Rubin, Dan S. Wallach
February 2004

Summary:


The study is concerned with the U.S. federal government's adoption of paperless electronic voting systems. Analysis showed that the examined voting system falls far below even the most minimal security standards applicable in other contexts. The researchers identified several problems, including unauthorized privilege escalation, incorrect use of cryptography, vulnerabilities to network threats, and poor software development processes. The most fundamental problem with such a voting system is that the entire election hinges on the correctness, robustness, and security of the software within the voting terminal. They concluded that the voting system is unsuitable for use in a general election, and that any paperless electronic voting system might suffer similar flaws, despite any “certification” it may have received.

Using publicly available source code, an analysis was performed of the April 2002 snapshot of Diebold’s AccuVote-TS 4.3.1 electronic voting system, and significant security flaws were found. Based on analysis of the development environment, including change logs and comments, an appropriate level of programming discipline was not maintained, and there appears to have been little quality control in the process. The model where individual vendors write proprietary code to run elections appears to be unreliable, and if the process of designing voting systems is not changed, there will be no confidence that election results reflect the will of the electorate.

On the other hand, an open process would result in more careful development, as more scientists, software engineers, political activists, and others who value their democracy would be paying attention to the quality of the software that is used for their elections. Alternatively, security models such as the voter-verified audit trail allow for electronic voting systems that produce a paper trail that can be seen and verified by a voter. In such a system, the correctness burden on the voting terminal’s code is significantly less as voters can see and verify a physical object that describes their vote. They suggested that the best solutions are voting systems having a “voter-verifiable audit trail,” where a computerized voting system might print a paper ballot that can be read and verified by the voter.

Evaluation:


To carry out a steadfast election, concerns about what voting system is implemented, and how, must always be considered. With regard to the study, I deem that conducting this kind of research is significant to the public and to assuring the trustworthiness of an election. In line with evolving technology, an electronic voting system was examined to test the reliability and security of adopting paperless electronic voting. Testing and analysis of the said system were done to examine its security assurance. I actually find this research complicated to perform. The findings showed that such a system may be unreliable, and the authors recommend using an open process and other particular systems. The study is commendable, and to further elaborate its function, I think a number of systems should be taken into consideration as subjects of the study.

Open Standards, Open Formats, and Open Source
Davide Cerri and Alfonso Fuggetta
CEFRIEL - Politecnico di Milano
January 2007

Summary:


The paper proposed some comments and reflections on the notion of “openness” and on how it relates to three important topics: open standards, open formats, and open source. Often, these terms are considered equivalent and/or mutually implied: “open source is the only way to enforce and exploit open standards”. This position is misleading, as it increases the confusion about this complex and extremely critical topic. The paper clarified the basic terms and concepts, which is instrumental in suggesting a number of actions and practices aimed at promoting and defending openness in modern ICT products and services.

This paper concentrated on some of the issues and claims associated with open source. In particular, it discusses the relationship among open source, open standards, open formats, and, in general, the protection of customers’ rights. Indeed, many consider open source the most appropriate way to define and enforce open standards and open formats; in particular, the promotion of open standards and open formats is often conflated with the open source movement. Certainly, these issues are interrelated, but it is wrong to conflate them. For these reasons, the ultimate goal of the paper is to provide a coherent, even if preliminary, framework of concepts and proposals to promote the development of the market and to address customers’ needs and requests.

Evaluation:

Openness, open source, and the concerns around them have always been arguable issues; we too have discussed and tackled them. I believe the impact of this kind of study is favorable. It identified a number of definitions for the term “open standard,” based on the different practices in the market. Moreover, the paper contains some proposals for dealing with the different issues and challenges related to the notions of openness, customers’ rights, and market development. The study used some historical data in compiling various definitions of open standard; it is an evaluation or overview of subjects related to open standards. In that sense, this study is descriptive research.

COMDDAP Annual IT Event 09

COMDDAP (Computer Manufacturer Distributors and Dealers Association of the Philippines) has just presented an IT event, the COMDDAP Davao Expo 2009, held from the 2nd to the 4th of July this year at the Grand Ballroom Hall of the Apo View Hotel. The affair exhibited various newfangled innovations and technology. They also held seminars, workshops, and presentations sampling some software and systems. The event was filled with different participants and spectators: students, teachers, critics, businessmen, media and journalists, people whose fields are IT-related, and even non-IT people who simply took an interest in the event.

On July 2, the first day of the event, we went to attend the seminars we had signed up for online. Since the available slots were limited, online registration was on a first-come, first-served basis. Luckily, a lot of IC (Institute of Computing) students from USEP (University of Southeastern Philippines) made reservations for the seminars held on the first day; in fact, we dominated in the number of participants who took part. I was really amazed because many up-to-date, latest versions of well-liked and trendy software and hardware were showcased at the affair. There was also open source software on display. And the part I was most fond of: there were even some models and endorsers present, and ooh, believe me, they were very eye-catching.
The main event, and the purpose of our participation and involvement, was our attendance at the seminars. The seminars were handled and sponsored by The Nexus Group (TNG) and comprised presentations on different topics.

During the session, the first presentation, delivered by Mr. Celmer L. Santos, was about ERIC DMS (Dealer Management System). JSI (Jupiter System, Inc.) is an expert in software development, business consulting, and e-business enablement, specializing in ERP for manufacturing, automotive dealership, and distribution companies. JSI’s world-class creation is the Enterprise Resource Information and Control System (ERIC), an integrated financial, distribution, manufacturing, and personnel software application. ERIC DMS is an end-to-end software solution for automotive sales, parts, and service businesses. It covers the full range of dealer activities, from customer prospecting to vehicle servicing, and supports full functionality for the vehicle dealership business: pre-sales, vehicle sales and administration, service management, and post-sales.

Well, basically, it is most beneficial for car businesses. At first, I was misled about what the presentation was all about. I know there is a variety of systems specialized for this kind of IT field, but I did not think it would be one of the topics presented. Anyway, I stopped really paying attention the moment I could no longer catch up with the car stuff he was discussing (rude, I know). From the general idea I perceived, the system covers the overall package of dealership operations.

The second presentation, delivered by Mr. Leonardo Zapa, was about HP’s new product: the Thin Client. It is a petite, thin-sized system unit that is quite portable. Unlike a typical system unit, it has no hard drive; it uses flash storage, IDE flash specifically. This innovation supports data security, reliability, easier management, and virtualization. Throughout the discussion, we came away with the idea that thin clients are best utilized in a network because of their client virtualization and remote client solutions, specializing in connections over networks. Aside from that, there are numerous advantages: a hardened embedded operating system and free management tools; a quarter of the failure rate and 10x higher MTBF, being solid-state, which provides a longer life span; up to 80% lower power draw, at 11 to 20 watts; space savings, taking less than 1/15 of the volume; packaging reduced by up to 75% for foam and up to 40% for the board; and minimal heat generation. What I have mentioned are just a few characteristics of a thin client. With the idea of promoting green computing, I guess this product would be helpful in environmental conservation.
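To get a feel for what that power figure means, here is a rough back-of-the-envelope sketch in Python. The 16 W thin-client draw falls within the 11 to 20 W range quoted in the talk, but the 80 W desktop figure, the 8-hour day, and the 250 working days are my own illustrative assumptions, not numbers from the presentation.

```python
# Illustrative estimate of annual energy savings from replacing a desktop
# with a thin client. The desktop wattage, hours, and days are assumptions;
# the thin-client draw is within the 11-20 W range quoted in the talk.
DESKTOP_WATTS = 80
THIN_CLIENT_WATTS = 16
HOURS_PER_DAY = 8
DAYS_PER_YEAR = 250  # working days

def annual_kwh(watts):
    """Energy used per year, in kilowatt-hours."""
    return watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000

saved = annual_kwh(DESKTOP_WATTS) - annual_kwh(THIN_CLIENT_WATTS)
reduction = 1 - THIN_CLIENT_WATTS / DESKTOP_WATTS

print(f"Saved per unit per year: {saved:.0f} kWh ({reduction:.0%} less power)")
```

Even with these rough assumptions, the 80% figure from the presentation falls straight out of the wattage ratio, which is why the green-computing angle is plausible.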

To end this, I think credit should be given to the individuals who made the event possible. It has been an informative affair, and I look forward to other IT-related events that would feed us students with supplementary knowledge about the hippest and newest technology.

University New Enrollment System Assessment

Almost every semester, as we have observed, the University’s enrollment system changes. Yeah, we could say that every revision made is for the better. Of course it is; it SHOULD be. And we always come to notice it (how can we not? hehe), thanks to our instructor (Sir Gamboa), who persistently pushes us to give attention to every detail of the ever-changing (it’s true, right? hehe) USEP (University of Southeastern Philippines) enrollment system, shaped by the involvement and contributions of some of our very own faculty.

To help students ease up and keep on the right track during enrollment, the university posted tarpaulins in various locations detailing the step-by-step enrollment process for each type of student (new, old, transferees, and shifters). Frankly, I was a bit amazed by the university’s effort to assist students and enrollees; it really put work into that operation. Given the size of the tarps (they were huge, hehe) and their number, situated at almost every corner of the school, the university actually set aside a budget for the undertaking. Though we (higher-year/old students) may have overlooked their use during enrollment, being already familiar with the process, I think they were very handy for freshmen and transferees. But if you give attention to every detail of the diagrams presented, you can see some inaccuracies and errors. Yes, surely a difference was made by having those tarpaulins as a guide. Even so, having examined them, we all agreed that the diagrams could create misconceptions and confusion.

After all these years in IC (Institute of Computing) as an IT student, constructing diagrams and charts is no longer a newfangled thing in my field. With regard to evaluating the university’s enrollment system, any individual, whether knowledgeable in this area or not, could perceive erroneous or flawed illustrations and figures in the diagrams. There were arrows inconsistent in size, where consistency and uniformity should be observed. Some point in directions that make no sense; some point in no direction at all. There are also numbers with inexplicable meanings (2/2, 1/1, etc.). If the diagrams were thoroughly evaluated, I assume still more flaws would be spotted. They should be methodically constructed to avoid misinterpretation. I know this could still be improved.

From what I experienced in this semester’s enrollment, it was much the same as the previous system. The first step is the payment of miscellaneous and local fees. Scholar students still need to validate their scholarship cards at the OSS (Office of Student Services). Then comes the submission of requirements for evaluation by the adviser, after which a PRF (Pre-Registration Form) is given so you can proceed to the encoding of subjects. Next, you get your COR (Certificate of Registration) and validate your account at the cashier by paying the tuition fee. That part is despair: the cashier line-up is lengthy and unbelievable. Whew! There were also some irritating and disturbing changes this time. Did you know that you actually have to go upstairs to the 2nd floor of the CAS building just to line up at either the registrar or the cashier? Honestly, it was so totally troublesome (geez, that ruined my day). Couldn’t they think of another option? There was actually a spare alley; why didn’t they utilize it? Enough about that. The other thing was that students from all the different colleges were merged, producing a hopeless, almost never-ending (an exaggeration, I know) line-up at the registrar, where all the enrollment requirements are submitted (lots of papers and receipts, tsk). If transactions at the registrar were the basis, the previous system was way better than the new one. Once you get through the registrar, you are officially enrolled, with one added step: validation of your library card (for old students) or applying for one (freshmen and shifters). The processes new students go through are more complex and take longer than for old students, and freshmen and transferees deal with loads more requirements. Considering they are new to the university, a well-defined guide would be very useful.
That is why it is important to pay close attention to the diagrams.

As for the enrollment system, the software itself, we are all aware that it has some errors and still needs testing and debugging. I am not questioning the expertise and proficiency of the developers, who are also IC faculty. The fact that the time frame allotted for the development of the system was insufficient is valid enough grounds for why the system has errors and requires adjustments and improvements. But I must admit that these flaws caused us many problems: some information on our CORs was wrong, which meant hassle and lost time while it was redone. Still, I agree with the university’s choice to use its very own in-house resources to manage the enrollment system rather than outsourcing. Though there may be negative views of the move, like IC students losing effective instructors, I guess it is still beneficial for the university to have diminished the cost by not handing the enrollment system project to an outsourcer.

To top it all, the new enrollment system brought betterment and enhanced the operation. It aimed to maximize operational efficiency, and at the very least, it did. It is simply inevitable that problems and errors arise, but with improvements and modifications, the system will surely progress and develop into an even better one.

Wednesday, July 1, 2009

USEP's Decision about Utilizing In-house Resources - Was it a good decision?

Many aspects are considered in deciding whether to outsource or to develop in-house. These considerations mainly involve the expertise and proficiency of the workforce, monetary funding, use of time, and protection of the organization’s confidential information. Certainly any organization desires a sensible and money-wise decision: effective and efficient output at the right, comparable price. Many are anxious about keeping their organization’s data private; of course, a company’s information is one of its most valuable possessions.

Well, I think the university decided to go in-house mainly to diminish the expenses and costs of the project. Since the university’s own personnel are employed, I guess their rate of pay, or simply their compensation, would be less costly than hiring an outsourced workforce. Next would be concealing the classified records and information of the university: by opting for in-house development, the university’s private data is more secure and protected, since the university’s own employees take over the making of the project.

Was it a good decision? I have heard many opinions regarding the issue. Some say it was not a very good decision, for several reasons: various practiced and proficient IC (Institute of Computing) faculty were put up to the task, leaving a number of major subjects with no instructors to take over; the time frame given to the in-house team was not sufficient, so the output of the system project is still unstable and has some flaws; and because of this, the IC faculty members will be occupied with the system project again and unable to attend to several subjects. Since maintenance is part of developing a system, we should expect that the faculty members assigned to the task will be more or less wholly unavailable to teach again and to carry out their responsibilities as instructors.

Some would say the university made the right choice, since developing the project in-house is less pricey than outsourcing. I guess that is true, somehow. I suppose the university would not take such a crucial step if it would not do the university any good. But we have to keep in mind as well that missing some important points in making a decision on this issue is always possible. I, for myself, think that from certain angles of view the university did make a commendable decision in utilizing in-house resources, and that at some point it one way or another missed some significant details and considerations in making it.

I greatly believe that the developers of the system project, the IC faculty members tasked to develop it, are the ones who benefit most from the university’s move. They still get to earn from the loads of subjects they handle, and they are also being paid by the university for the development of the project.

So I guess, if I have to assess and take these benefits into consideration, I would say it was in some way a good decision. But I also believe it could be improved with an appropriate and accurate balancing and deliberation of the issue.

The possible reasons I have stated are simply assumptions of mine; there is neither compelling nor verified support for the reasons and bases enumerated. I suppose the way to validate or clarify the real reasons behind the in-housing decision made by the university’s decision-makers is to address these questions directly to the university’s board.

Monday, April 20, 2009

OJT?

On the Job Training

Just a thought...

...what do you expect to gain from this so-called real work experience?

In my perspective, this is the phase wherein we SHOULD be able to apply and practice the theories and lectures we’ve acquired from school. The stage wherein we get real work understanding and involvement in our chosen field. The episode of our schooling wherein we really learn what we need to learn TO PREPARE us for what lies ahead.

But is this really what we normally get from having OJT?

Do you think my expectations are just way too high for this particular aspect of learning?


Well, I believe CERTAIN ACTIONS must be done by CERTAIN PEOPLE in order to get things where they should be.

I hope this will be a key opening to solve this issue.

Friday, March 27, 2009

Latest Updates

Yey! We did it.
We’re on the Design Phase of our paper in SAD (Systems Analysis and Design 1).
Wiw. That was really something.
We were really tense. Glad we got through.
But there will be more to come.
Hope to get through it all.

Good luck to the other groups.
We can do this!
Go, go, go!

We’re finally going to have our OJT this summer! Yey!

Wednesday, March 25, 2009

SAD Updates

Our group is currently working on the analysis phase of our major paper in SAD1. We were held up in the process of selecting which information-gathering technique to apply. At first, we chose questionnaires; I believe almost all the groups enrolled in the subject picked the same technique. We thought it would be simple and easy to implement, but we got tangled up and delayed because the verification and validation of questionnaires takes a lot of tough assessment. To ease the delay on our paper, we recently decided to use interviews as our information-gathering technique as well. At the moment, we are waiting for our analysis-phase paper to be checked. Yey! Good luck to us.

Saturday, March 14, 2009

USEP Enrollment System - Activity Diagram and Use Case Description

ACTIVITY DIAGRAM


(Use Case Description)
Sequence of Events in Enrollment:

1. The student pays the local fees (which consist of the collegiate headlight fee, OCSC fee, and local council fee) to their respective collector/treasurer.
2. Once payment has been made, the student goes to their particular college adviser for the evaluation of grades and subjects.
3. After the evaluation, the student is given a PRF by the adviser.
4. With the PRF, the student goes to the encoder for the encoding of subjects.
5. A non-scholar student proceeds to the cashier to pay the tuition fee; a scholar student proceeds to the bookkeeper for verification.
6. Following verification of scholarship or payment of the tuition fee, the student proceeds to the registrar's office.
7. Later, the student goes to the library for validation of their library ID card.
8. Lastly, the student is officially enrolled.
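The sequence above can be sketched as a small ordered workflow. The step names follow the use case description, including the scholar/non-scholar branch at step 5; the function and variable names are my own, and this is only an illustrative sketch, not the university's actual system.

```python
# A minimal sketch of the enrollment sequence as an ordered workflow.
# Step names follow the use case description; the data structures are
# illustrative, not the university's actual system.
COMMON_STEPS = [
    "pay local fees",
    "adviser evaluation",
    "receive PRF",
    "encode subjects",
]

def enrollment_steps(is_scholar):
    """Return the full ordered list of steps for one student."""
    # Step 5 branches: scholars see the bookkeeper, non-scholars the cashier.
    branch = "bookkeeper verification" if is_scholar else "pay tuition at cashier"
    return COMMON_STEPS + [
        branch,
        "registrar's office",
        "library card validation",
        "officially enrolled",
    ]

for step in enrollment_steps(is_scholar=True):
    print("-", step)
```

Writing the flow down this way makes the single branch point explicit, which is exactly the kind of detail the tarpaulin diagrams got wrong with their stray arrows.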

USEP Enrollment System - Data Flow Diagram (DFD)

SAD proposal submission and consultations

Well, for starters, I think each group really did work hard on coming up with project proposals. With all the rejections and revisions being requested, it was indeed a tough one. In the system-proposal phase, we were tasked to look for a certain company or organization to be the subject of our study. Unable to find a suitable company, we were given the option of studying the existing in-house system. With this, we are able to learn something of the university’s system status, and through this, we hope to contribute to the welfare of the university.
Our group in particular chose to study the university collegiate headlight, and we came up with the idea of an online college publication with a twist on the usual flow of the system.
On the consultation issue, I find it hard and challenging how meticulously Sir Gamboa checks and reviews our paper. Through this, I hope to build perseverance and cope with the challenges. And I hope we will all finish and pass this subject with flying colors so we can proceed to our goal.

Steps to Expedite the Implementation of the IS Plan

To expedite is to accelerate or speed up something. A well-constructed IS plan should include the details of how it is to be implemented effectively and efficiently. The steps involved include:

• Coordination with the personnel involved
- The workforce, the peopleware itself, is the main element in carrying out the IS plan. They are the brain and core of the system; without them, the plan would not be put into action. This entails people management, with each person's particular function and responsibility.

• Assigning responsibilities and setting timelines
- The distribution and allocation of tasks should be thoroughly thought out. Constructing a timetable or outline of what must be done is crucial, for it will be the basis for monitoring how the plan is going.

• Utilization of resources
- It is practical to make proper use of all the resources available for the system. An analysis of the resources should be made, and the university should look out for any accessible source; an overlooked resource could be a big loss for the organization.

• Monitoring the growth of development
- The project schedule must be kept track of and observed. Any progress should be recognized, and updates and improvements must be identified and evaluated against what has been planned for the system.

• Assessment and revision of plans
- This is an evaluation and comparison of the actual system implementation against the original ISP. Changes are inevitable, and so are deviations from what was intended or planned. In this phase, for the sake of betterment, the original plan is altered or modified.
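The assigning-responsibilities and monitoring steps above can be made concrete with a tiny task tracker. This is only an illustrative sketch: the task names, owners, and dates are made up for the example, and the `overdue` helper simply flags anything past its deadline that is not yet done.

```python
# An illustrative way to track IS-plan tasks: each task carries an owner
# and a deadline, and a simple check flags anything overdue. Names and
# dates are invented for the example.
from datetime import date

tasks = [
    {"task": "coordinate with personnel", "owner": "PM",      "due": date(2009, 7, 15), "done": True},
    {"task": "assign responsibilities",   "owner": "PM",      "due": date(2009, 7, 31), "done": True},
    {"task": "allocate resources",        "owner": "Finance", "due": date(2009, 8, 15), "done": False},
    {"task": "monitor development",       "owner": "QA",      "due": date(2009, 9, 30), "done": False},
]

def overdue(tasks, today):
    """Tasks past their deadline and still not done."""
    return [t["task"] for t in tasks if not t["done"] and t["due"] < today]

print(overdue(tasks, date(2009, 9, 1)))
```

Even a checklist this simple gives the monitoring step something objective to compare against, which is the whole point of setting timelines up front.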

These are some guidelines based on an article (a PDF from Sir RSG). They are subject to change depending on one’s point of view of what is better. Supplementary information with regard to the matter is open and welcome.

When Is Outsourcing Advantageous to our University ?

Outsourcing is a modern-day boon. It grants businesses the freedom to hand off non-core yet important sectors of their administration to companies specializing in those very aspects. The most enticing advantage of outsourcing is cost effectiveness.

With regard to the current issue of in-house employees developing the university’s enrollment system, employing some IC faculty in the said project greatly affects the students, the faculty and the department as well.

With the in-house personnel hired, the long search for substitute teachers has caused interruptions and delays in classes for some major subjects. Finishing the development and implementation of a system does not assure the end of the developers’ job: the system has to be maintained and improved every now and then. Given this, there is no definite assurance that the selected faculty will teach again. As Sir Gamboa said, there is actually no end, no finished system; for as long as the developers still exist and the system is still being utilized, it will not cease functioning.

It is true that the university would spend a large amount on an outsourced enrollment system, but if I had to say something about it, I think I would propose that the university utilize outsourcing. In the present state, the department and students are suffering and adjusting after losing some of the faculty, so I guess it would be better to hand the job off to others. Then again, if it were thoroughly thought out, maybe the university would end up spending more by keeping the system in-house.

To end with, I think it is better to leave the decision-making to those who are good at it, analysts of this kind. The advantages and disadvantages it could bring should first be comprehensively thought out. Outsourcing’s main edge is cost effectiveness, so until the day the cons of outsourcing outweigh the monetary factors, outsourcing, as a reasonable and lucrative way to do business, is here to stay.

Most Radical Type of Organizational Change?

Revolutionary measures are a constant in organizations: in implementing developments and improvements, changes must be pushed through. Perhaps the most asked but least answered question in business today is, “What can we do to make our business survive and grow?” The world is rapidly changing into something too hard to easily predict, with a hundred opportunities and pitfalls passing by every moment.

We are moving at an accelerated pace; our knowledge and awareness expand, and our thinking and ideas transform and transcend, as do the changes to be made for the expansion and progression of the organization’s state.

A paradigm shift is when a significant change happens, usually from one fundamental view to a different one. In most cases, some type of major discontinuity occurs as well. It can have dramatic effects, whether positive or negative. Therefore, I think that among the types of organizational change, the paradigm shift has the greatest impact and is the most vital. According to Kuhn, a paradigm shift is revolutionary science: “Revolutionary science is usually unsuccessful, and only rarely leads to new paradigms. When they are successful they lead to large scale changes in the scientific worldview.” If it falls short, it could cause the organization to fail entirely. It can greatly affect an organization, since wide-ranging changes are employed. Implementing it requires thorough consideration and thinking-over.

http://www.taketheleap.com/define.html
http://www.organizedchange.com/decide.htm
http://ezinearticles.com/?What-is-a-Paradigm-Shift?&id=68933
http://en.wikipedia.org/wiki/Paradigm

Tuesday, January 20, 2009

How to get there?

In creating a system, we usually follow a model, an SDLC model to be exact. Both predictive and adaptive approaches make use of such a model. If I were to construct my own SDLC model, it would go like this:

PLANNING & ANALYSIS

It would all start with plans: what I want, how I would achieve it, and what approaches I would use to get to the top. These plans need careful and thorough analysis.

TESTING & IMPLEMENTATION

After the creation of plans and their analysis come testing and implementation. In this phase, everything that has been planned is implemented and tested, to enact the things I’ve decided to carry out.

EVALUATION

After the plan has been tested and implemented, one should analyze the effectiveness of the plans and the courses of action that have been taken. In this phase, the results should be examined.

MAINTAIN & IMPROVE

This would be the last phase of my model. It is where I would maintain the things and plans that are effective and efficient in achieving my objectives in life, and improve those that fail.
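The four phases above form a cycle rather than a straight line, since maintain-and-improve feeds back into planning. A minimal sketch of that loop, with phase names taken from the headings above and everything else my own illustration:

```python
# A minimal sketch of the four-phase model above, run as a cycle:
# maintain & improve feeds back into planning, matching the adaptive idea.
PHASES = [
    "planning & analysis",
    "testing & implementation",
    "evaluation",
    "maintain & improve",
]

def next_phase(current):
    """Return the phase that follows the current one, wrapping around."""
    i = PHASES.index(current)
    return PHASES[(i + 1) % len(PHASES)]

print(next_phase("maintain & improve"))
```

The wrap-around at the end is what distinguishes this model from a strictly predictive, one-pass waterfall.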

If I were to choose what approach I'd use.
Predictive or Adaptive?
I think I'd prefer the adaptive approach.
Yes, planning is a great factor in achieving one's objectives in life but
as I've matured, I've realized that everything doesn't always go according to one's plan.
There will always be unexpected circumstances, so one needs to adapt to whatever life brings.
Changes, that's what we call them. The only constant thing, I guess.
It isn't being pessimistic; it's being practical.
"Go with the flow." Align your next move with the current situation, but do not lose sight of what you aim to achieve. Adapt. Still, it is wiser to both predict and adapt, so I guess an integration of the predictive and adaptive approaches would be best.


USE CASE Diagram (USEP Enrollment System)



Herewith is a use case diagram of USEP's enrollment system.
It shows the entities and processes involved in the system.

Monday, January 12, 2009

Data Gathering Techniques - which suit you best?

In gathering data, it is best if the techniques used suit the person collecting the information. If it were based on my personality, then among the data-gathering techniques I think I'd be better off handling document review. I am not very comfortable interacting with people on important matters; I get freaked out, especially when I have to converse with an honorable, high-achieving person. With document review there is less pressure: all I have to do is comprehend what I've read and then analyze it. But from what I have observed, the usual documents are prehistoric, if not imprecise. So I have thought that conducting interviews combined with reviewing documents would be the finest combination of data-gathering techniques. I can do interviewing; I've done it many times. I'm not that lousy at interrelating with people through verbal communication. I can speak English fluently, though I have to be put under much pressure to actually get through it. It's a good thing after all: it can enhance one's communication skills and dealings with people in the real world.