IEEE Conference on Technologies for Homeland Security (HST ’12)


Recently I attended the twelfth annual IEEE Conference on Technologies for Homeland Security (HST ’12), held right here in our neck of the woods, Waltham, Massachusetts. The conference aims to bring together innovators from leading universities, research laboratories, Homeland Security Centers of Excellence, small businesses, system integrators and the end user community to provide a forum to discuss ideas, concepts and experimental results. I gave a poster presentation on our Semantic Technologies for Civil Information Management in Complex Emergencies within the Attack and Disaster Preparation, Recovery, and Response area, as well as a paper presentation on our development of A Social Agent Dynamic Honeynet for Attack Modeling within the Cyber Security track. Both presentations generated lively debates and discussions on the challenges of applying technology solutions to these problem spaces.

With regard to our social agent honeynet research, we presented initial findings from an effort to develop an agent-based dynamic honeynet that simulates user interactions with social networks for the purpose of developing attack models. You can check out our demo here. Our solution allows security professionals to create networks that simulate user activity for companies and government entities by supplying a set of parameters. Our research pointed to the importance of instantiating a social dimension in our virtual agents, giving each agent the ability to interact with a variety of social networks. For this purpose, we developed influence models that learn patterns from actual users' activity on social networks to improve the effectiveness of the social agents.
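To give a flavor of what such an influence model might look like (our paper does not prescribe a specific form, so the hourly-profile approach and every name below are illustrative assumptions), here is a minimal sketch:

```python
import random
from collections import Counter
from datetime import datetime

def learn_hourly_profile(timestamps):
    """Estimate an hourly posting-rate profile from observed user activity.

    `timestamps` is a list of datetime objects for real users' posts.
    Returns a 24-element list of probabilities (one per hour of day).
    """
    counts = Counter(ts.hour for ts in timestamps)
    total = sum(counts.values()) or 1
    return [counts.get(h, 0) / total for h in range(24)]

def sample_post_hours(profile, n_posts, seed=0):
    """Draw posting hours for a synthetic agent so its daily rhythm mimics
    the learned profile rather than a uniform, bot-like schedule."""
    rng = random.Random(seed)
    return rng.choices(list(range(24)), weights=profile, k=n_posts)

# Hypothetical usage: learn from observed timestamps, then schedule agent posts.
observed = [datetime(2012, 1, 1, h) for h in (8, 9, 9, 12, 18, 19, 19, 23)]
profile = learn_hourly_profile(observed)
print(sample_post_hours(profile, 5))
```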

One of the questions from the audience was why use agents to collect attack data when regular users, in the course of interacting with social networks, get attacked enough as it is? Our response was that a deception network enables us to feed false information to the adversary as needed, track adversarial movements to learn attack patterns and attributes, and use the information collected during the attempted infiltration to build more robust defenses and develop more targeted offensive operations. Additionally, deception networks force our adversaries to expend resources attacking our fake network. Another line of questioning asked whether we were wasting the time of people who decided to follow our fake agents, since about 50% of our agents' followers were real and 50% were found to be malicious. This generated a lively debate, and someone else in the audience responded that identifying these people might be useful for preventative defense. Maybe these are people who are more vulnerable and more likely to click on spam, and perhaps Twitter or others might want to know this. A further question had to do with how we know that the users following our agents are malicious. This is fairly straightforward, because those users attempted to pass us links associated with known bad actors. As a future effort, we plan to automatically parse the tweets and check whether the embedded links appear on a blacklist, which would trigger alerts. We also maintain what we believe to be the world's largest intelligence database on botnets to cross-reference our malicious entities. You can check out that project here.
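As a rough sketch of that planned blacklist check (the regular expression, the blacklist contents, and the alerting behavior below are illustrative assumptions, not our production pipeline):

```python
import re

# Hypothetical blacklist of known-bad domains; in practice this would be
# cross-referenced against a botnet/malicious-domain intelligence database.
BLACKLIST = {"evil-example.com", "malware-drop.net"}

URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def extract_domains(tweet_text):
    """Pull the host portion out of every embedded link in a tweet."""
    return {m.group(1).lower().split(":")[0] for m in URL_RE.finditer(tweet_text)}

def check_tweet(tweet_text):
    """Return the set of blacklisted domains found; any hit would trigger an alert."""
    hits = extract_domains(tweet_text) & BLACKLIST
    if hits:
        print(f"ALERT: blacklisted link(s) {sorted(hits)} in tweet: {tweet_text!r}")
    return hits

check_tweet("check this out http://evil-example.com/free-stuff")
```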

There were several ideas that came out of the collaboration at this conference related to our agents. One idea was to use our agents to collect and harvest social media artifacts for the purpose of understanding Arab Spring-like events. Additionally, our agents could potentially interact with users to explore the shaping of opinion, collaborating with users beyond just posting information to Twitter and following other users. We will definitely be exploring these avenues in the near future, so keep your eyes peeled for developments in this space.

One of the most interesting presentations I attended was from Laurin Buchanan of Secure Decisions, who was involved in the CAMUS project (Mapping Cyber Assets to Missions and Users). This project was very relevant to our Commander's Learning Agent (CLEARN) and Cyber Incident Mission Impact Assessment (CIMIA) work, an existing capability developed as part of an AFRL SBIR Phase II Enhancement that automatically learns the commander's mission, brings in contextual knowledge, and assigns priorities to the resources supporting that mission in Air Operations planning and execution support. CLEARN/CIMIA monitors the workflow of operations personnel using the Joint Operation Planning and Execution System (JOPES), the Air Mobility Command (AMC) Global Decision Support System (GDSS), the Consolidated Air Mobility Planning System (CAMPS), and the Global Air Transportation Execution System (GATES) to learn the resources necessary for each mission and to recommend workarounds when one or more of those resources become unavailable.

Our semantic wiki work also generated interest during the poster session. One presentation that was interesting and tangentially related was SPAN (Smart Phone Ad Hoc Networks) by MITRE, which utilizes mobile ad hoc network technology to provide a resilient backup framework for communication when all other infrastructure is unavailable. I thought it was pretty neat that this was also an open source project. This research was interesting given our work in using mobile devices for data collection in austere environments during operations and exercises in the PACOM AOR in our MARCIMS (Marine Corps Civil Information Management System) project. Pretty cool to see all of the developments in this area.

SemTechBiz 2012


I attended SemTechBiz 2012 in San Francisco last week. This annual conference on semantic technology, now in its eighth year, does a nice job of balancing the interests of the research and commercial communities. This year the conference tilted towards commercial vendor interests (after all, the vendors do sponsor the event), although the product pitches were confined to a clearly identified solutions track. Here are my semantic annotations about this semantic technology conference.

Given our focus on open source platforms, I enjoyed the session on wikis and semantics. In this session, Joel Natividad of Ontodia gave an overview of NYFacets - a crowd-knowing solution built with Semantic MediaWiki. Ontodia's site won NYC BigApps - a contest started by Bloomberg as part of his grand plan to make NYC the capital of the digital world. NYFacets has a semantic data dictionary with 3.5M facts. Ontodia's vision is to socialize conversations about data, and eventually build NYCpedia. I wondered why public libraries don't take this idea and run with it: Bostonpedia by the Boston Public Library, Concordpedia by the Concord Public Library, and so on.

Stephen Larson gave an overview of NeuroLex - a periodic table of elements for neuroscience built with SMW under the NIF program. They built a master table of neurons, exposed as a SPARQL endpoint, with rows consisting of 270 neuron classes and columns consisting of 30 properties. NeuroLex demonstrates the value of a shared ontology for neuroscience by representing knowledge in a machine-understandable form.
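As a toy example of what querying such an endpoint could look like (the endpoint URL and the property URI below are placeholders, not the actual NeuroLex deployment), using the SPARQLWrapper library:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint and vocabulary; substitute the real NeuroLex URIs.
endpoint = SPARQLWrapper("http://example.org/neurolex/sparql")
endpoint.setQuery("""
    PREFIX prop: <http://example.org/property/>
    SELECT ?neuron ?transmitter
    WHERE {
        ?neuron prop:Neurotransmitter ?transmitter .
    }
    LIMIT 10
""")
endpoint.setReturnFormat(JSON)
results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["neuron"]["value"], row["transmitter"]["value"])
```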

In the session Wikipedia's Next Big Thing, Denny Vrandecic of Wikimedia Deutschland gave an overview of the Wikidata project, which addresses the manual maintenance deficiencies of Wikipedia by bringing a number of Semantic MediaWiki features into its fold. For instance, all infoboxes in Wikipedia will become semantic forms stored in a central repository, eliminating the need to maintain the same content duplicated on many pages of Wikipedia. Semantic search will also come to Wikipedia, to the applause of the folks who maintain Wikipedia's list of lists and list of lists of lists, by replacing these manually maintained lists with a single semantic query. One of the novelties of Wikidata is that it will be a secondary database of referenced sources for every fact. For instance, if one source says the population is 4.5M while another says 4,449,000, each source will be listed in the database, thus enabling belief-based inference.
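A toy sketch of that idea, storing every sourced value for a fact instead of a single number (the entity name, structure, and naive weighting rule below are my simplifications, not Wikidata's actual data model):

```python
# Each claim keeps all sourced values rather than one "true" number.
claims = {
    ("SomeCity", "population"): [
        {"value": 4_500_000, "source": "Source A", "weight": 0.6},
        {"value": 4_449_000, "source": "Source B", "weight": 0.4},
    ]
}

def belief_weighted_estimate(statements):
    """Combine conflicting sourced values into one belief-weighted estimate."""
    total = sum(s["weight"] for s in statements)
    return sum(s["value"] * s["weight"] for s in statements) / total

print(belief_weighted_estimate(claims[("SomeCity", "population")]))
```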

It was nice to see several evangelists of linked data from the government sector at the conference. Dennis Wisnosky and Jonathan Underly of the U.S. Department of Defense gave a nice overview of the EIW (Enterprise Information Web). It was refreshing to hear that DoD is looking at linked data as a cost reduction driver. Given the Cloud First mandate of the Defense Authorization Act of 2012, the importance of semantic technology in the government will accelerate. In another session, Steve Harris of Garlik, now part of Experian, gave an overview of Garlik DataPatrol - a semantic store of fraudulent activities for finance. I could not help wondering whether someone from the Department of Homeland Security was in attendance to hear the details of this application. Steve found no need for complex ontologies, reasoning, or NLP in this large-scale application, which records about 400M instances of personal information (e.g., a Social Security Number mentioned in an IRC channel) every day.

Matthew Perry and Xavier Lopez of Oracle gave an overview of the OGC GeoSPARQL standard, which aims to support representing and querying geospatial data on the Semantic Web. GeoSPARQL defines a vocabulary (union, intersection, buffer, polygon, line, point) for representing geospatial data in RDF, and it defines an extension to the SPARQL query language for processing geospatial data using distance, buffer, convex hull, intersection, union, envelope, and boundary functions.
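A hedged sketch of what a GeoSPARQL query might look like using the standard geo:/geof: vocabularies; the dataset, point coordinates, and distance threshold are invented for illustration:

```python
# GeoSPARQL example query (as a Python string); it would be sent to a
# GeoSPARQL-enabled triple store, e.g. via SPARQLWrapper as above.
geosparql_query = """
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX uom:  <http://www.opengis.net/def/uom/OGC/1.0/>

SELECT ?place ?wkt
WHERE {
    ?place geo:hasGeometry ?g .
    ?g geo:asWKT ?wkt .
    # keep features within 10 km of a hypothetical point of interest
    FILTER (geof:distance(?wkt,
                          "POINT(-122.4 37.8)"^^geo:wktLiteral,
                          uom:metre) < 10000)
}
"""
print(geosparql_query)
```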

Linked data is essentially about the plumbing of semantic infrastructure, so it is hard to give engaging presentations on the topic. Two presentations bucked this trend. The presentation by Mateja Verlic from the Slovenian startup Zemanta rocked. Zemanta developed LODGrefine, a DBpedia extension for Google Refine, under the LOD2 program. Google Refine supports large transformations of open data sources, and LODGrefine exposes Refine results as a SPARQL endpoint. Mateja managed to give two impressive live demos in ten minutes. The other rock-star presentation was by Bart van Leeuwen - a professional firefighter - on Real-time Emergency Response Using Semantic Web Technology. Everyone in attendance got the gist of how FIREbrary - a linked data library for fire response - can help firefighters in the real world, thanks to a presentation sprinkled with live videos of fire emergency responses. It was instructive to see how semantic technology can make a difference in managing extreme events such as a chemical fire, since by definition there are no plans for these types of events.

Bringing good user interface design practices to linked-data-enabled applications was another theme of the conference. Christian Doegl of Uma gave a demo of Semantic Skin, a whole-wall interactive visualization driven by semantics. Siemens used it to build an identity map of their company. It uses the Intel Audience Impression Metrics Suite to detect the age, gender, etc. of the person walking in front of the wall, so content can be personalized and driven by semantics. Pretty cool stuff.

GFIRST 2010: social malware, insider threat, fast flux botnets ....

I will not be able to cover some of the really interesting presentations in this public forum due to the sensitivity of the topics, but here are a couple of tidbits for general consumption. "Emerging Threats in 2010" by Dave Marcus, Director of Security Research and Communications at McAfee Labs, was one of my favorite presentations of the conference.


Multi-Criteria Decision Modeling for Complex Operations

Next week we will be presenting a paper at the International Conference on Cross-Cultural Decision Making in Miami, Florida. I am looking forward to participating in a highly informative and interesting session bridging modeling and simulation disciplines with socio-cultural data for military operations. In our paper, entitled “Geospatial Campaign Management for Complex Operations”, we report initial findings from a research effort to understand the complexity of modern-day insurgencies and the effects of counterinsurgency measures, integrating data-driven models, such as Bayesian belief networks, and goal-driven models, including multi-criteria decision analysis (MCDA), into a geospatial modeling environment in support of decision making for campaign management. Our Decision Modeler tool instantiates MCDA, a discipline for solving complex problems that involve a set of alternatives evaluated on the basis of various metrics. MCDA breaks a problem down into a goal or set of goals, the objectives that need to be met to achieve that goal, the factors that affect those objectives, and the metrics used to evaluate those factors. Since the selection of metrics for specified objectives and of data for computing those metrics are the biggest hurdles in using MCDA in practice, both the metrics and the associated data are part of our tool's library for user reuse. Below is an image of the MCDA structure. Click on any of the images in the post to see more detail.

Our decision modeling tool also incorporates a weighting system that enables analysts to apply their preferences to the metrics that are most critical for the mission. Linking these decision models in a shared space within the tool creates a repository of knowledge about progress along lines of effort in an operation, providing a source of knowledge transfer for units rotating into and out of theater. The alternatives considered in the decision model are different courses of action that can be evaluated against the metrics to determine the optimal action for accomplishing the commander's goals. Of course, when working in a complex human system such as the one found in counterinsurgency and stability operations, our tool is not meant to be a 'black box' model that simply tells the user what to do; rather, the decision analysis provides insight, through both qualitative and data-driven models, into which courses of action will set the conditions for a more successful outcome based on the commander's intent.
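To make the structure concrete, here is a minimal weighted-sum sketch of the kind of MCDA evaluation described above; the metric names, weights, and scores are illustrative placeholders rather than outputs of our Decision Modeler:

```python
# Goal -> objectives -> factors -> metrics, scored per alternative (course of action).
# Analyst preference weights over metrics (these would come from the slider UI).
weights = {"cost": 0.4, "effectiveness": 0.6}

# Normalized metric scores (0..1, higher is better) for each course of action.
alternatives = {
    "COA 1: build clinic in District A": {"cost": 0.55, "effectiveness": 0.80},
    "COA 2: build clinic in District B": {"cost": 0.70, "effectiveness": 0.60},
}

def score(metrics, prefs):
    """Weighted-sum utility of one course of action."""
    return sum(prefs[m] * value for m, value in metrics.items())

# Rank the courses of action under the analyst's current preferences.
ranked = sorted(alternatives.items(), key=lambda kv: score(kv[1], weights), reverse=True)
for name, metrics in ranked:
    print(f"{name}: {score(metrics, weights):.2f}")
```

Moving a slider simply changes the weights, so the ranking can be recomputed interactively as the analyst explores preferences.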

In evaluating our tool with users, we determined that one of the most important features involves the visualization of the tradeoffs for various courses of action in the decision model. To address this, we compute the uncertainty of data based on its distribution and propagate its effect analytically into the decision space, presenting it visually to the commander. A greater dispersion represents more uncertainty, while a clustered set of data points indicates more certainty regarding the cost and effectiveness metrics for a particular course of action. In this way, we are able to represent the high levels of uncertainty inherent in socio-cultural information without negatively impacting the ability of our tool to calculate a decision model. By incorporating a visual representation of uncertainty in the model, scenarios can then be played out to determine optimization for various courses of action based on data inputs and user preferences, translating model outputs into a form that can more readily be used by military users.

To demonstrate how the visualization of uncertainty would work in the tool, in the image below we have analyzed two potential courses of action relating to the essential services line of effort, with the objective of supporting healthcare initiatives in an area of operations. In this case, we are deciding where to focus our efforts, comparing two districts, Arghandab and Anar Dara, in Southern Afghanistan. Here we are only examining a few potential metrics: the cost of building healthcare centers proposed by local development councils; the number of basic healthcare centers already in the district; and the number of people who identified a lack of healthcare as the major problem facing their village, a question collected through Tactical Conflict Assessment and Planning Framework (TCAPF) data. Our MCDA tool computes and displays the effectiveness-versus-cost data points derived from the metrics corresponding to the two proposed courses of action. We want to determine which district would best serve our goal of restoring essential services, with the objective of supporting healthcare initiatives, by leveraging the data inputs. To convey the uncertainty, we represent the distribution as an ellipsoid around each data point. This allows a military planner to visually analyze and evaluate the potential courses of action based on cost-versus-effectiveness metrics while accounting for the uncertainty of the data. In addition, the weighting system, shown as sliders on the right-hand side of the image, allows a military planner to experiment and determine how a change in metric weights will affect the proposed courses of action.
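As an illustration of how such an uncertainty ellipse can be derived from sampled metric estimates (the numbers below are invented, and the tool's actual propagation method may differ), here is a short sketch:

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Ellipse

rng = np.random.default_rng(0)
# Hypothetical sampled (cost in $M, effectiveness score) estimates for one COA.
samples = rng.normal(loc=[1.2, 0.7], scale=[0.2, 0.08], size=(200, 2))

mean = samples.mean(axis=0)
cov = np.cov(samples, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)                  # principal axes of the spread
angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))
width, height = 2 * 2.0 * np.sqrt(eigvals[::-1])        # ~2-sigma full axes

fig, ax = plt.subplots()
ax.scatter(samples[:, 0], samples[:, 1], s=8, alpha=0.3, label="sampled estimates")
ax.add_patch(Ellipse(mean, width, height, angle=angle, fill=False, color="red"))
ax.scatter(*mean, color="red", zorder=3, label="COA mean (cost vs. effectiveness)")
ax.set_xlabel("Cost ($M)")
ax.set_ylabel("Effectiveness")
ax.legend()
plt.show()
```

A wide ellipse signals high uncertainty in the underlying socio-cultural data; a tight one signals confidence in the cost and effectiveness estimates for that course of action.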

One of the key benefits of our approach is that it allows for real-time knowledge generation. When the model is updated with new data, the Decision Modeler re-evaluates the outlined courses of action against the new information, allowing the user to view trends over time in the effectiveness and cost metrics for particular courses of action. In the example below, perhaps the cost estimates went up for the proposed course of action in Anar Dara because a deterioration in the security situation affected the ability to hire contractors to execute the project. In Arghandab, the metric could have changed according to our collection of TCAPF data, with more people responding that healthcare is the major problem facing their village, thereby increasing the effectiveness against our objective if we built a healthcare center there. Given the increased need, the villagers have offered to provide labor at decreased cost and to contribute a certain percentage of funds to the project, which is reflected in the decreased costs associated with the Arghandab data points. In this way the tool provides course-of-action forecasting based on an analysis of data, for the purpose of proactively planning operations that optimize the commander's objectives.

We will be presenting a more detailed analysis of our research results at the conference, so keep an eye out for links to our papers and presentation.

ECPR 5th General Conference

Last week we attended and presented a paper at the European Consortium for Political Research (ECPR) 5th General Conference in Potsdam, Germany. ECPR is a scholarly association focused on the training, research and cross-national co-operation of political scientists. From our viewpoint, the percentage of papers dealing with fragile states was significantly smaller than that of papers dealing with inward-looking issues (i.e., the EU), in contrast to what we would normally see on our side of the Atlantic. In terms of exhibitors, the Bertelsmann Transformation Index (BTI) was of particular interest to our research on complex operations. The BTI, which is published every two years, promotes democracy under the rule of law and a market economy with social safeguards. For instance, Uruguay joined the top 10 performers while Poland fell out of this group in the most recent edition. Another exhibitor, GIGA, which has a Focus Afrika publication, indicated that they will soon start publishing their data, which is great news for the research community.

One of the interesting sessions addressed the question: is a workable peace-building concept possible? Gilles Carbonnier presented a paper on the role of non-state actors in resource-rich fragile states in the context of the Extractive Industries Transparency Initiative. The paper defined a set of criteria, such as proportionality, non-discrimination, neutrality and independence, for differentiating humanitarian assistance from development assistance. Although indicators for these criteria are sparse, the provincial distribution of economic aid can be effectively used as a proxy for measuring them. Thomas Biersteker's paper on peacekeeping in theory and practice gave a nice overview of the process of building the UN Peacebuilding Commission (UNPBC), which was created to address gaps in the global response to armed conflict and conflict recurrence. The commission's charter is to support fragile societies recovering from the devastation of war within two years after the cessation of hostilities. Since its inception in 2005, the UNPBC has disbursed about $250M in funds, mostly in African countries.

Our paper on rumors, presented by Dr. Karen Guttieri, was well received and generated several questions. Rumor - information that is unsubstantiated yet widely shared - is rife during social conflict. In this paper, we analyzed rumors reported in The Baghdad Mosquito after the US-led coalition invasion of Iraq in March 2003, and mapped rumor types against public opinion polling and a timeline of events that includes both the insurgency and inter-sectarian conflict. Our paper shows that rumors have the potential to yield actionable cultural intelligence. The analysis of rumors can identify specific concerns and fears of a population that explain behavior and affect local cooperation with US counterinsurgency efforts. Furthermore, rumors can be used to assess foreign public opinion and measure the effectiveness of a hearts-and-minds campaign. While we focused on Iraq, the concept of incorporating rumors as an intelligence source is applicable to virtually any country, as long as the content analysis and rumor remedies are tailored to the culture in which they occur.

Peter Kotzian's paper on social norms analyzed the importance of macro- and micro-level variables that allow an individual to change their beliefs about whether a particular norm is still valid. The empirical findings, based on survey data from 24 countries, show that social trust has no effect on norm compliance. What makes people comply with norms is not blind trust but the belief, based on information, that the norm is still effective; hence, it is rational to comply. David Westlund's paper on rational belief changes for collective agents presented an interesting formal model for studying how collective beliefs emerge from the belief systems of individual agents. The model shows that the collective must believe exactly the same as at least one of its members. Dörte Dinger's paper analyzed partner perceptions in German-Italian bilateral relations by studying the press coverage of the incident created by Berlusconi's remarks.

Military Logistics Summit

We attended IDGA's Military Logistics Summit, held June 8-10, 2009 in Vienna, VA. The focus of this year's summit was supporting major deployment, re-deployment, and distribution operations. Milcord's presentation for the Military Logistics University, entitled Risk-Based Route Planning for Sense and Respond Logistics, covered the technology behind our Adaptive Risk-Based Convoy Route Planning solution. Our presentation had a diverse audience, ranging from logistics contractors in Pakistan to logisticians at large system integrators, and from high-level US Army officers to academic researchers. A logistics contractor posed the question: "I love your risk-based route planning system. I wish we had a system like this. Most logistics materiel is carried by private subcontractors like us (under contract to a prime like Maersk) in Pakistan and Afghanistan. Even if the Army has this system, it won't do us any good." It was an interesting question that shone a light on the lack of information sharing between DoD and second- and third-tier military contractors in the supply chain, and it generated a nice discussion among attendees.

Another interesting question about our presentation concerned the predictability of a route. Minimal-distance routes are deterministic and pose a security risk because they can easily be determined by the adversary. In contrast, a minimal-risk route is not deterministic (it changes with events in the field), which provides better protection against prediction by the adversary. The risk surface (computed per road segment) changes with every incident, intel report, weather update, traffic report, and so on, which in turn changes the minimal-risk route.
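A toy sketch of this idea: run a shortest-path search over segment risk rather than distance, and re-plan when an incident raises a segment's risk. The graph and risk values below are invented for illustration; the fielded risk model is far richer.

```python
import heapq

def min_risk_route(graph, start, goal):
    """Dijkstra over segment risk instead of distance."""
    pq, seen = [(0.0, start, [start])], set()
    while pq:
        risk, node, path = heapq.heappop(pq)
        if node == goal:
            return risk, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, seg_risk in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(pq, (risk + seg_risk, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical road network: edge weights are current risk scores per segment.
roads = {"A": {"B": 1.0, "C": 3.0}, "B": {"D": 4.0}, "C": {"D": 1.0}, "D": {}}
print(min_risk_route(roads, "A", "D"))   # route before any incident
roads["C"]["D"] = 9.0                    # a new incident report raises one segment's risk
print(min_risk_route(roads, "A", "D"))   # the minimal-risk route adapts accordingly
```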

Another question: "If a bridge is blown down the road, how long does it take the Urban Resolve data set to update itself?" This is an issue that even commercial COTS GPS tools struggle with: random events like road closings due to construction. Our current solution provides a manual workaround for such conditions by letting the user define an intermediate waypoint and drag the route away from the bridge. Crowd-sourcing could also help address this issue by giving users the power to dynamically update road availability by adding road blocks on their GPS units. Crowd-sourcing also raises data integrity issues, however, in that user-specified changes could not simply be written to the database, since every soldier would have a different viewpoint.

There were several other interesting presentations and exhibitions. Dr. Irene Petrick's talk on Digital Natives and 4th Generation Warfare generated an active interaction with the audience. She presented survey results comparing the value systems of Traditionals, Baby Boomers, Gen X and Gen Y, articulated where Digital Natives can add value to warfighting, and highlighted the challenges they pose to organizational management. On the gadget front, Safe Ports demoed an infrared-based eye scanner that recognizes you even through your sunglasses.

Notes from CSIIRW-09

We attended and presented at the Cyber Security and Information Intelligence Research Workshop, April 13-15, 2009 at Oak Ridge National Laboratory (ORNL) (www.ioc.ornl.gov/csiirw/). The audience numbered about 150 attendees, with academia and government representing the biggest segments, along with a few representatives from systems integrators and technology providers. In his keynote, Dr. Doug Maughan from DHS reviewed and assessed federal cyber initiatives from 2003 to the present. While noting that the amount of activity around cyber security is encouraging, Doug challenged the cyber security research community to be “bolder and riskier in their thinking”, to do a better job of capitalizing on the increased interest, and to come together on an agreement for a “National Cyber Security R&D Agenda”.

In other featured presentations, Dr. Nabil Adam from DHS and Rutgers University introduced issues and programs at the intersection of cyber and physical systems security, highlighting SCADA and Smart Grid systems. In his “Are We on the Right Road” presentation, George Hull of Northrop Grumman confronted basic challenges: with 5.4 million unique malware samples discovered in 2007, and companies like Symantec now pushing up to 300 updates per day, signature-based systems don’t and can’t work. And as systems become ever more complex, that complexity works against security and reliability. Hull suggested that cyber security is not about the endpoints or the network; rather, the real focus needs to be defending the information. Dr. Robert Stratton from Symantec presented findings from Symantec’s Internet Security Threat Report (April 2009). Of particular interest to Milcord was the finding that “Symantec observed an average of 75,158 active bot-infected computers per day in 2008, an increase of 31 percent from the previous period.”

The panel discussions surfaced some points for pondering, including the observation that, as venture capitalists seem to be moving away from cyber security as an investment area, the government needs to fill the void in R&D funding. Some questioned the effectiveness of certain government cyber R&D programs such as NSF's, going so far as to refer to them as 'welfare for scientists', disconnected from real-world needs, and unlikely to produce innovation that results in deployable systems.

Milcord presented findings from its DHS-sponsored botnet research on the Behavioral Analysis of Fast Flux Service Networks. Specifically, we discussed the behavioral patterns of domains, name servers, and bots that we discovered with our Fast Flux Monitor, covering the short-term, long-term, organizational, and operational behavior of botnets that use fast flux service networks. www.csiir.ornl.gov/csiirw/09/CSIIRW09-Proceedings/Slides/Caglayan-Slides.pdf

Reflections on CATCH

I attended the Cybersecurity Applications and Technology Conference for Homeland Security (CATCH) on March 3-4, 2009 in Washington, DC. I had to leave on Sunday to escape the snowstorm, but it was well worth the effort. The keynote speech, American Crisis in Innovation by Pascal Levensohn, was the most thought-provoking presentation. (See the related BusinessWeek blog.) Pascal articulated the broken ecosystem of innovation in the USA, and argued forcefully for promoting effective innovation partnerships between government and university research organizations, corporations, and entrepreneurs. Pascal quoted several statistics from Judy Estrin's book Closing the Innovation Gap. Estrin makes an empirical case that America has relied too much on incremental innovation in recent years at the expense of the open-ended scientific research that eventually leads to truly breakthrough innovation. How true! NRL funded the development of GPS in the 1970s, when no one could foresee the applications it has spawned today. How many American organizations are investing today in the GPSs of the future? More importantly, how many decision makers are heeding Levensohn's alarm?

Another interesting session was the panel discussion on the second day. I was particularly impressed with the comments of DHS Cybersecurity Chief Rod Beckstrom, who called for the adoption of Web 2.0 platforms within the government and the development of a generalized model for sensorizing the Internet. I was sad to read that Rod Beckstrom resigned today. It's a great loss for DHS.

Our presentation on Real-time Detection of Fast Flux Service Networks was well received. It generated lots of questions, and considerable interest in our Fast Flux Monitor demo at the expo. Tina Williams of Unisys asked one of the more interesting questions: of the tens of thousands of IPs in your database, which user segments (ISP, edu, enterprise...) have this problem, and is the solution policy or technology? There is no question that ISPs and universities in the USA are most seriously afflicted by the fast flux problem. The enterprise has a botnet problem with its mobile workforce. The government has started doing a better job of protecting its machines from being recruited as zombies. The solution is both technology and policy: you can't be aware of the problem without the technology, but you still need to train your personnel for effective remedies.

One final note: congratulations to Dr. Doug Maughan, who runs cybersecurity R&D at DHS using a collaborative model. Milcord has participated in this program for the last three years, and open collaboration did improve our botnet defense solution through the suggestions of our colleagues in the program. Collaborative research programs in information technology are rare within the government. I wish more Program Managers adopted such a philosophy.

DHS Conference on Cyber Security (CATCH)

How can an organization defend against cybercrime enabled by botnets operating as fast flux service networks? Milcord will present its solution for "Real-time Detection of Fast Flux Service Networks" and botnets at the Cybersecurity Applications and Technology Conference for Homeland Security (CATCH), scheduled for March 3-4, 2009 in Washington, DC. Very soon afterwards we'll be announcing the beta release of our new product, Fast Flux Monitor, which was the foundation for our research investigation. To find out more about our research, visit the [[Botnet Defense]] project page.

Here's the abstract:

Here we present the first empirical study of detecting and classifying fast flux service networks (FFSNs) in real time. FFSNs exploit a network of compromised machines (zombies) for illegal activities such as spam, phishing and malware delivery using DNS record manipulation techniques. Previous studies have focused on actively monitoring these activities over a large window (days, months) to detect such FFSNs and measure their footprint. In this paper, we present a Fast Flux Monitor (FFM) that can detect and classify a FFSN in the order of minutes using both active and passive DNS monitoring, which complements long term surveillance of FFSNs.
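The kind of signal such a monitor keys on, many distinct A records with very low TTLs over a short observation window, can be sketched as follows. This is only an illustrative heuristic with made-up thresholds, not the FFM's actual classifier.

```python
import time
import dns.exception
import dns.resolver  # pip install dnspython

def looks_fast_flux(domain, rounds=5, pause=60, ip_threshold=10, ttl_threshold=300):
    """Resolve a domain several times over a short window and flag it if the
    answers show many unique IPs together with very low TTLs."""
    ips, ttls = set(), []
    for _ in range(rounds):
        try:
            answer = dns.resolver.resolve(domain, "A")
        except dns.exception.DNSException:
            return False
        ttls.append(answer.rrset.ttl)
        ips.update(record.address for record in answer)
        time.sleep(pause)
    return len(ips) >= ip_threshold and min(ttls) <= ttl_threshold

# Hypothetical usage:
# print(looks_fast_flux("suspicious-domain.example"))
```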

Milcord at 2007 Monterey Homeland Security Conference

Milcord exhibited its [[Botnet Defense|botnet defense]] technology in a poster session and presented a paper in the Infrastructure Protection session at the Naval Postgraduate School. The conference is a showcase for innovative research performed at U.S. academic and other research institutions, including National Laboratories and Federally Funded Research and Development Centers. [[Botnet Defense|more...]]

Milcord at MobiSensors'07

Milcord presented a position paper titled "A Commercial Perspective: Collaborating on Application Prototypes as an Infrastructure Provider" at the NSF Workshop on Data Management for Mobile Sensor Networks (MobiSensors).

Sensor data management and fusion is a technical component in a number of our projects across a range of applications and technologies, including:
· Monitoring [[SPE|Earth Science]] Data – NASA
· [[GEMI|Intelligent Video Surveillance]] – Army
· Enemy [[Course of Action Forecasting|Course of Action]] Analysis – Army
· Quality of Service in Tactical Networks – Air Force
· [[Botnet Defense|Botnet Detection]] and Mitigation – DHS