July 5, 2011

SPARQL: a flexible query language.

Posted in Databases, software testing at 5:19 pm by mknight1130

As I have been using SQL to query databases, I have wondered about alternatives for querying across multiple data sets. Is there a more intuitive way to retrieve information, one not so tied to a particular meaning of a field? Say I was looking for a new mobile phone and wished to query different vendors for the best price. Do I search for “cellphone”, “cell phone”, “mobile phone”, “mobile device”, or “phone”? All of these terms can refer to a cell phone. It would be much easier if I could query one term and also get results close to it in meaning; I would then get what I meant by the search. SPARQL is a query language with such power. SPARQL queries RDF, the Resource Description Framework, by matching patterns against a graph of data, so information described with shared vocabularies can be retrieved regardless of the exact label used. As I investigate this language, I have found several resources that I would like to share.
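Before turning to those resources, here is a minimal sketch of the idea, assuming Python with the rdflib library and a made-up ex: vocabulary (my own illustration, not drawn from the resources below). Three vendors label the same kind of product differently, but because each item is typed with a shared RDF class, one query finds them all regardless of wording:

```python
# A minimal sketch, assuming Python's rdflib library and a made-up
# ex: vocabulary. The items carry different labels, but two of them
# share the RDF class ex:MobilePhone.
from rdflib import Graph

data = """
@prefix ex: <http://example.org/> .

ex:item1 a ex:MobilePhone ; ex:label "cell phone" ;    ex:price 199 .
ex:item2 a ex:MobilePhone ; ex:label "mobile device" ; ex:price 149 .
ex:item3 a ex:Laptop ;      ex:label "notebook" ;      ex:price 899 .
"""

g = Graph()
g.parse(data=data, format="turtle")

# Match on meaning (the shared class), not on the label string,
# so "cell phone" and "mobile device" both come back.
query = """
PREFIX ex: <http://example.org/>
SELECT ?item ?label ?price
WHERE {
  ?item a ex:MobilePhone ;
        ex:label ?label ;
        ex:price ?price .
}
ORDER BY ?price
"""

for row in g.query(query):
    print(row.item, row.label, row.price)
```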

The W3C, or World Wide Web Consortium, provides an excellent technical recommendation, SPARQL Query Language for RDF (http://www.w3.org/TR/rdf-sparql-query/). The W3C is a group of web experts, pioneers, and interested contributors who develop web standards. The group is well regarded, and its specifications are used as references when developing web pages. This technical document introduces SPARQL, gives sample queries, details the syntax and constraints, and defines the testing framework.

For the newbie, XML.com provides a down-to-earth introduction: Introducing SPARQL: Querying the Semantic Web (http://www.xml.com/lpt/a/2005/11/16/introducing-sparql-querying-semantic-web-tutorial.html). XML.com is a resource published by O’Reilly, a publisher of technical training materials. The article covers the main points of SPARQL: simple queries and the other query forms, CONSTRUCT, DESCRIBE, and ASK. It also describes the background needed to understand SPARQL, the context, the tools, and how to use graph patterns.
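As a quick taste of those other query forms, here is my own follow-on sketch, reusing the hypothetical graph g built in the example above: an ASK query returns a simple true or false instead of a table of results.

```python
# A follow-on sketch reusing the hypothetical graph g from the sketch
# above: ASK returns a boolean rather than rows of bindings.
ask_query = """
PREFIX ex: <http://example.org/>
ASK {
  ?item a ex:MobilePhone ;
        ex:price ?price .
  FILTER (?price < 150)
}
"""

# rdflib exposes the boolean answer on the result's askAnswer attribute.
result = g.query(ask_query)
print(result.askAnswer)  # True: one phone in the sample data costs 149
```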

SPARQL is receiving attention from developers. Microsoft Academic Search, for example, indexes implementation experience with it; see SPARQL-DL (http://academic.research.microsoft.com/Publication/5476728/sparql-dl-implementation-experience). The language is also making its way into the clinical space; see Zynx Health Incorporated (http://www.zynxhealth.com and https://trak.baiworks.com/application/jobdescription.aspx?q=leSEDqZdwZ4gKo4Ayjbxnfq6W3IaFJTL4ysCRnjIn8wgDur%2fwJfrM72BIrQ5%2b2NeybHA4dEkh0U%3d). SPARQL shows promise for querying a wide range of information and bringing back better matches in search results. The next time I look for a mobile phone, or run any complex query, I may just use SPARQL.


June 17, 2010

Where to get a list of great websites for free

Posted in Directories at 11:24 pm by mknight1130

Many tools on the web do not provide a list of great websites for free. Google, for example, charges companies for paid placement in search results. LexisNexis charges a hefty subscription fee for access to all sorts of articles and websites. Where are the best places to go for an unbiased view of great websites?

Aboutus.org (http://www.aboutus.org/) provides short lists of great websites, created by community members who are passionate about a topic. The recommendations are contributed for free and are free to searchers, who need not scroll through many pages to find the sites they need. I have contributed a list of resources to Aboutus.org: 5 great sites you won’t find in Google.

The ODP, or Open Directory Project (http://www.dmoz.org/), is a directory built by volunteer editors who list and describe websites that fit specific categories. No one pays to search for websites on a topic, nor are websites promoted or recommended based on payment from their creators.

I recommend Aboutus.org and the ODP for locating websites: both are free, and both are provided by the community for the common good.

May 14, 2010

Musing About the Mobile Marketplace

Posted in Mobile Devices at 3:21 am by mknight1130

Last Thursday, at InnoTech Oregon, attendees were abuzz about mobile applications and websites, just a click away on the cell phone. The buzz turned to a roar during the panel The Mobile Application Marketplace. Four panelists with hands-on experience in the mobile application business spoke about opportunities and challenges in the industry. According to these experts, mobile applications are exploding, especially social applications like Facebook. Free applications, supported by ads or bundled with a cell phone provider, seem to be the new business model. But finding a standardized language for developing mobile applications has been a challenge, especially with such a wide variety of phones (e.g., iPhone, Android, Nokia). While it was fascinating listening to the panel, I thought I would search the web to learn more. The librarian in me likes to check facts against more than one source.

The 2010 Statistical Abstract (http://www.census.gov/compendia/statab/cats/information_communications.html) is an excellent place to start getting a handle on the mobile marketplace. Granted, statistics specific to mobile applications and websites still need to be collected, but the site does contain documents of interest related to the mobile marketplace (e.g., Cellular Telecommunications Industry; Household Internet Usage by Type of Internet Connection, and State; Wired and Wireless Telecommunications Carriers–Estimated Revenue). Each document opens in Excel or PDF format, displaying data covering the United States. The charts paint a picture of potential revenue-expanding opportunities in the mobile marketplace.

Another resource, a must-read, puts a legal perspective on the opportunities and challenges in the mobile marketplace. The U.S. Federal Trade Commission published a report in April 2009 called Beyond Voice: Mapping the Mobile Marketplace, found at http://www.ftc.gov/reports/mobilemarketplace/mobilemktgfinal.pdf . The report grew out of a town hall meeting held in May 2008 to discuss consumer protections in the mobile marketplace. It first covers an overview of the mobile marketplace (Session 1), then unfolds into six meaty sections: Mobile Messaging, Mobile Applications, Location-Based Services, Mobile Advertising and Marketing, Managing Your Mobile Device, and Children and Teens. The report closes with action-oriented sections on Best Practices and Mobile Security. It will leave the reader with a sense of how revenue from the mobile marketplace may be affected by privacy and legal controls.

After exploring the 2010 Statistical Abstract and Beyond Voice: Mapping the Mobile Marketplace, I have become more excited about the potential of the mobile marketplace. As one of the panelists said at InnoTech Oregon, this is a new frontier, another gold rush. How the mobile marketplace landscape will emerge is debatable. In the meantime, I am grateful for my Android: I find places with the GPS application, keep my schedule on it, and search the Internet by voice. I keep finding a myriad of uses I just did not see before, along with new considerations for testing software. It will be interesting to see where the mobile marketplace goes next.

April 13, 2010

Worthwhile Resources for Wicked Problems

Posted in development at 9:22 pm by mknight1130

On Wednesday, March 31st, I went to a talk given by Bill Gilmore called “Applying the Concept of Wicked Problems and 7 Principles for Dealing with Them”. Bill defined a wicked problem as a problem that has no definitive solution and can be hammered at indefinitely. Getting stuck in a wicked problem is analogous to Sisyphus rolling his stone up the hill, only to watch it roll back down. Wicked problems are made even more complicated by their ambiguity and by the social interactions among the various stakeholders. These types of problems are persistent in software development and in business. Fortunately, there are strategies for addressing wicked problems, such as documenting possible solutions and assessing their value. To wrap my arms around wicked problems, I found two worthwhile resources.

NASA gives a down-to-earth explanation of wicked problems in its article on the subject at http://www.nasa.gov/offices/oce/appel/ask-academy/issues/ask-oce/AO_1-4_F_wicked.html . The article describes the older, traditional linear approach to solving problems, the waterfall method. The page-long article goes on to explain how people solve problems in a more agile way, by defining and solving the problem at the same time, and links to more on the theory of wicked problems. It is a good overview.

Another worthwhile resource, from Unidata (http://www.unidata.ucar.edu/staff/caron/collab/wicked.html), gives alternative perspectives on wicked problems. Unidata is a program of the University Corporation for Atmospheric Research (UCAR), an organization comprising some 160 universities. UCAR supports education and research in the atmospheric and earth sciences, fields known for their complex problems. The Unidata page offers both supporting evidence and criticism concerning the wicked-problem approach; it is a balanced viewpoint.

I learned from my reading and from Bill Gilmore’s talk that defining the problem and solving it at the same time is one way to address wicked problems. It is not a perfect approach, but it is less likely to lead to a Sisyphean result. I intend to use this knowledge while testing software and in some of the messiness of everyday life.

March 22, 2010

Exploratory Testing

Posted in software testing at 6:32 pm by mknight1130

Last Wednesday, I went to a talk on exploratory testing sponsored by the Agile group in Portland, Oregon. With the need for quick turnaround on testing results, exploratory testing offers some promise. Exploratory testing, according to Jon Bach’s talk, is a process of thinking, learning, and adapting while testing. The trick, however, is to obtain useful feedback in a timely way. Jon advocates session-based time-boxes (e.g., 90 minutes), each organized around a mission and ending in a debriefing. This session-based testing facilitates accountable testing, flexible scheduling, and course correction. Jon Bach has additional information on his blog: http://jonbox.wordpress.com/ . The talk prompted me to research different exploratory software testing perspectives and papers on the Internet.

An overview of exploratory testing was written by Andy Tinkham and Cem Kaner (blackbox.cs.fit.edu/a/explore.pdf). Both authors teach and write at the Florida Institute of Technology (FIT). Their article, written in 2003, describes different perspectives on exploratory testing among experts and the knowledge and skills exploratory testers apply. They describe, very succinctly, the similarities and differences among exploratory testers and offer some thoughts on how to train for exploratory testing. The reader comes away with perspectives on how exploratory testing may be conducted in real projects and how to develop it as a testing strategy.

Sam Guckenheimer, writing for IBM developerWorks, brings an automated/programmer-testing angle to exploratory testing in Test-First Design, Exploratory Testing, and Agile Process: An Interview with Brian Marick (http://www.ibm.com/developerworks/rational/library/2833.html). The interview advocates that software developers test first and that software testers initiate exploratory testing, to provide quicker and more concrete feedback on potential usability issues; the exploratory testers find the bugs the automated tests do not. The article concludes that many more testers will need to be programmers of some sort, and programmers will need to be testers. The main point is that programming and exploratory testing intermingle very closely.

The two resources mentioned above and the talk by Jon Bach clearly explain how exploratory testing works best and what its role is. Especially in an Agile environment, these tests are handy in a crunch and allow for flexibility. As mentioned in the two articles above, exploratory testing does have its limits, which is why other, scripted test types are also used. (To understand the limits of exploratory testing, see “How to Choose between Exploratory and Scripted Testing” by Andrew Thompson at http://www.stickyminds.com/sitewide.asp?ObjectId=6271&Function=edetail&ObjectType=ART .) In any case, my better understanding of exploratory testing will serve me and others well as I continue to test software.

March 19, 2010

Fishing for Information in the Pacific Northwest

Posted in fish, information resources tagged at 1:00 am by mknight1130

When I walked into the StreamNet Regional Library for a librarians’ lunch, I was awed at the amount of information available on fisheries and the Pacific Northwest watersheds. Waves of journals, books, and maps stretched out to greet me as I meandered down each aisle. This collection, enough to fill the first floor of our floating home, is only the tip of the iceberg. The librarian, Lenora, talked about the collection’s online resources as well as its future digitization. Closely connected with the Inter-Tribal Fish Commission, the library provides rare materials and perspectives on fishing conservation and watershed recovery. It is a unique gem in the heart of Portland. Reflecting on my visit to StreamNet, I wondered about other types of fishing information and perspectives.

I perused the Internet Public Library (http://www.ipl.org/) and found two interesting websites on Pacific fisheries and related watersheds.

The first is the Pacific Fishery Management Council (http://www.pcouncil.org/). The council, formed by the Magnuson Fishery Conservation and Management Act of 1976, comprises state, tribal, and federal advisers who create and disseminate fishing policies, for example through the website. The website is well organized, with options to search by type of fish (e.g., salmon) or by fishery-management concept. Fishing laws are spelled out and kept current, making it easy for a fisherman to wrap his or her head around the legal fishing limits.

The second is the Center for Columbia River History (http://www.ccrh.org/comm/river/index.php). The center is a collaboration among the Washington State Historical Society, Portland State University, and Washington State University Vancouver. The site provides information on treaties and legal decisions made in the 19th and 20th centuries. Links to histories of dams, traditional equipment, Canadian documents, and other materials open pages with clear descriptions and relevant information. A photo archive and oral histories are available too. For anyone researching a historical perspective on fishing management, this website is extremely helpful; for example, it documents the salmon fisheries and the decline in the salmon population.

From what I am reading and seeing along the Columbia River, the salmon stock is pretty low and the fishing is very competitive; I see fishermen in boats an arm’s length apart. Hopefully more rain will come to fill the Columbia River and strengthen the current, and the fish will move faster downstream.

January 24, 2010

Fact Checking

Posted in facts, information resources at 7:06 pm by mknight1130

If I could go back to one of my favorite jobs, I would be a fact checker. A fact checker takes a product (e.g., a report, a publication, a website) and verifies that the information presented is correct. While deadlines and budgets can be tight, efficient fact checking means a better-quality product and enhances users’ trust in it. As an information analyst for a firm providing pharmaceutical information, I developed a system for flagging drug information that was correct and information that needed correction. In some cases, questions arise from the product’s context and the information being presented; I like working with project managers, editors, and product creators to better define what the information means.

Here are two sites on fact checking.

Fact Checkers (http://parklibrary.jomc.unc.edu/factcheckers2004.html). Although this page dates from 2004, Barbara P. Semonche, the Park Library Director, provides a good introduction to the fact-checking process and some helpful links to reliable resources. For example, the SEC provides a free database of corporate annual reports, a bird’s-eye view of a company; this government site is among those the Fact Checkers page lists.

Katrina O’Brien, owner of MyResearchNeeds, has put together a nice site on the benefits and processes of fact checking: http://www.together.net/~ktob/pages/fact_checking.htm . She is a freelance fact-checker and researcher with experience working with all sorts of organizations, from universities to Southwest Airlines. Her web page is well written and simply put: she explains who could benefit from fact checking and what the return on the investment is.

Fact-checking is a great job because the information verified benefits both businesses and their customers. It ensures that the information provided is valid and trustworthy for making decisions.

September 20, 2009

Floating Homes

Posted in Floating Homes tagged at 7:09 pm by mknight1130

Finally there is some time to blog. It has been a busy summer: I moved from a townhouse to a floating home. A floating home is a house on the water; ours is supported by cedar logs and has a beautiful view of the river. There is a resident duck that waddles on our dock, and a heron.

There are a couple of great resources about floating homes. One is http://www.floatinghomes.org/info.htm . A glimpse at the website reveals the following: “The Floating Homes Association is a volunteer civic group which represents the interests of the residents of the 400+ homes in five floating home marinas on Richardson Bay.” The site describes floating homes in plain English for the curious and goes into detail about rules and regulations. Although the legal information is most pertinent to California, it provides insights for current and potential floating home owners.

The Floating Home Association of the Pacific (http://www.floathomepacific.com/) is another nice resource about floating homes. Although the link to articles is broken, the site provides information on floating home standards, related services (e.g., insurance), and connections to some Pacific floating home communities.

Of course, for those who want to dive into the floating home community, there is http://www.houseboatmagazine.com/ . The site sponsors a rich set of forums for houseboat owners, renters, and admirers. There is information about maintenance and resources on buying a floating home. In addition, there are comprehensive directories of manufacturers, brokers, insurers, financiers, renters, and service people who deal with houseboats or floating homes.

Scott and I have decided to rent our floating home, to try it out before buying. We have loved our summer on the houseboat and look forward to the changing river patterns through the autumn, winter, and beyond.

Estimation and Software Projects

Posted in software testing tagged at 7:05 pm by mknight1130

In early September, I attended a Rose City SPIN seminar on estimation. Todd Williams spoke about the assumptions workers and companies make when estimating development and QA hours for a software project. Mainly, workers who make excellent estimates draw on past history with similar types of tasks; although each project is different, they can transfer a ballpark timeframe from experience. In addition, according to Todd Williams, companies that use estimates well do not treat them as quotes. These companies expect a timeframe to be about 70% accurate, with a project finishing sooner at times and later at others. Williams says that if a project’s estimates consistently swing too early or too late (e.g., an accuracy of 35%), then the company, and perhaps the workers, would benefit from some estimation coaching.
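As a back-of-the-envelope illustration (my own sketch, not from the talk), here is one way that accuracy figure might be tracked, assuming a simple tolerance band counts an estimate as a hit:

```python
# A rough sketch (mine, not Todd Williams'): the share of past tasks
# whose actual hours fell within a tolerance band around the estimate.
# The 25% band is an arbitrary choice for illustration.
def estimate_accuracy(history, tolerance=0.25):
    """history: list of (estimated_hours, actual_hours) pairs."""
    hits = sum(
        1 for estimated, actual in history
        if abs(actual - estimated) <= tolerance * estimated
    )
    return hits / len(history)

# Example: of 10 past tasks, 7 landed within 25% of the estimate,
# so accuracy is 70%, the kind of figure Williams calls healthy.
past_tasks = [(10, 11), (8, 7), (20, 24), (5, 9), (16, 15),
              (12, 12), (6, 13), (40, 44), (9, 10), (30, 22)]
print(f"accuracy: {estimate_accuracy(past_tasks):.0%}")
```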

A developer’s perspective on estimation is provided by David T at http://softwaresurvival.blogspot.com/2006/11/dynamics-of-effort-estimation-in-most.html

David is a software developer with over 10 years of experience. In his post, he discusses factors that influence estimating and offers some general tips. He cautions that the amount of padding added to an estimate needs to be proportional to the risks: “…as aggressive as possible while remaining reasonable.” David makes some excellent points about why overestimation is not so good; among other things, it may lead to unproductive use of the time. This is in line with Todd Williams’ view that overpadding an estimate does not benefit a project.

Like David, Todd Williams writes a blog about estimation: http://ecaminc.com/index.php/blog . Todd’s blog, called Back from the Red, emphasizes giving a potentially doomed project a makeover. It is a good site for learning how to problem-solve potential roadblocks in software projects.

David’s and Todd’s viewpoints have helped expand my thinking about estimating and turning out successful software projects. I am still using the information in my current contract as a software tester to stay on track and influence success.

June 1, 2009

Metrics and Software Development

Posted in development, software testing tagged at 2:22 am by mknight1130

About a month ago, I went to a talk sponsored by Rose City SPIN and the Pacific Northwest Software Quality Conference (PNSQC) about metrics and software development. Steven Borg gave the presentation on how to use numbers and how to think about software development in terms of problem solving. He talked about velocity (the output software developers produce in a given timeframe), throughput (the time from when an issue is found in the software to when it is closed), and customer satisfaction. As it was a very rich talk, I have researched some sites that provide an overview of metrics and software development.
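As a small illustration (my own sketch, using the definitions reported above and made-up issue records, not anything from the talk), the two metrics could be computed like this:

```python
# A minimal sketch (mine, not Steven Borg's) computing velocity and
# throughput as defined above, from hypothetical issue-tracker records.
from datetime import date

# (points_completed, date_found, date_closed) for a handful of issues.
issues = [
    (3, date(2009, 4, 6), date(2009, 4, 10)),
    (5, date(2009, 4, 7), date(2009, 4, 20)),
    (2, date(2009, 4, 13), date(2009, 4, 15)),
]

# Velocity: output delivered per timeframe (points per week over a
# two-week window; the window length is an assumption).
weeks = 2
velocity = sum(points for points, _, _ in issues) / weeks

# Throughput, as defined in the talk: time from an issue being found
# to the issue being closed, averaged over the issues.
throughput_days = sum((closed - found).days
                      for _, found, closed in issues) / len(issues)

print(f"velocity: {velocity:.1f} points/week")
print(f"average found-to-closed time: {throughput_days:.1f} days")
```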

One great overview document is called Introduction to Conventional Software Metrics, at http://curriculum.sv.cmu.edu/ms/metrics_se/downloads/IntroductiontoConventionalSoftwareMetricsRev2.0.doc . The chapter covers types of metrics, different contexts for looking at them, how metrics unfold over the course of developing a software product, and how to apply them.

Steven Borg provides an overview of using process-improvement metrics on his blog, Best Practice: Where Technology Meets Teamwork, at http://blog.nwcadence.com/category/best-practice/ . In his Practical Process Improvement posts, Borg defines some of the basic concepts of software metrics and explains how to make the best use of them.

The two sites mentioned above provide good overviews for understanding metrics and software development.
