Susan Smith has worked as an editor and writer in the technology industry for over 16 years. As an editor she has been responsible for the launch of a number of technology trade publications, both in print and online. Currently, Susan is the Editor of GISCafe and AECCafe.
January 9th, 2011 by Susan Smith
On January 6 and 7, Esri hosted a meeting of the minds at its GeoDesign Summit, held at Esri headquarters in Redlands, Calif. The event brought together GIS professionals with architecture and engineering professionals in a think-tank setting to discuss how the two technology sectors and cultures might converge to make the best of both in shared settings.
Some definitions of the term “GeoDesign,” which was coined by Esri to describe the convergence of geography and design:
From Wikipedia:
Geodesign is a set of techniques and enabling technologies for planning built and natural environments in an integrated process, including project conceptualization, analysis, design specification, stakeholder participation and collaboration, design creation, simulation, and evaluation (among other stages).
From other notable professionals:
“Geodesign is a design and planning method which tightly couples the creation of design proposals with impact simulations informed by geographic contexts.” – Mike Flaxman
“Geodesign is changing geography by design.” – Carl Steinitz
“GIS is about what is; geodesign is about what could be.” – Tom Fisher
Geodesign Accomplishments through 2010
The concept of “GeoDesign” was one year old last week when Esri CEO and president Jack Dangermond kicked off the GeoDesign Summit held in Redlands, Calif. His question to the audience: How do you want to interact in the future to make things better?
He spoke about new modalities and how we used to use CAD to generate maps, but now with GIS we can all look at and interact with the map simultaneously.
He said that GIS is going through “another massive shift with real time information, with distributed services and bringing things together dynamically, the whole lifecycle of design and processes is birthing here.” The new paradigm is about creating alternative futures, evaluating them quickly, and seeing their consequences.
Dangermond sees that as the world is becoming digital, GIS is becoming pervasive and in the future we will be able to measure “nearly everything that moves or changes.” On top of those measurements we will be able to sketch design alternatives.
Half of a designer’s or engineer’s time is spent collecting data.
Michael Goodchild of the University of California spoke on GeoDesign accomplishments through 2010:
a. A research agenda for the area and its development.
b. A personal perspective.
c. The field needed a definition, and it now has a Wikipedia page.
New networks have been created such as the Geodesign Consortium spearheaded by Karen Hanna and the SDS Consortium by Naicong Li.
Online resources –
Participatory geodesign network – defining geodesign as it relates to public participation.
GIS and Science bibliography on Esri GIS & Science website
Selected readings –
Jack’s talk at TED 2010
GeoDesignWorld.org – Jason Lally and Drew Dara-Abrams
Literature – Regional and Urban GIS: A decision support approach by Esri Press
Goodchild’s nearly published paper – “Towards GeoDesign: repurposing cartography and GIS?”
Goodchild said we need to close what many have perceived as a growing gap between GIS and design.
“Now more than ever we need a technology to distinguish between small-d and Big-D design,” said Goodchild. “Design consists of the formulation of an optimization problem with objectives and constraints, the collection of data, the execution of a search for the optimum solution, and its implementation.”
His definitions of the two “d”s were as follows: small-d design takes the simplistic view that implementation is inevitable; Big-D design sees the process complicated by disagreements among stakeholders.
The Lightning Talks presented at this event were 10 minutes long. A couple of the more enlightening ones are outlined below:
Chris Pyke of the U.S. Green Building Council said that “Green building is not about buildings. It is about this curve – a systematic movement devoted to changing the prevalence of practice – by creating best practices. The curve is not spatial, temporal or data driven. The USGBC put in place a collection of people and practices to move the curve.”
One manifestation of green building is buildings, said Pyke. At least 30,000 buildings are in the pipeline, which represent decisions made about water, stormwater, lighting, air space, space, etc.
Over the last decade, people have come to understand that we have this curve, and we try to move it by adopting best practices, even though a building might last 50–200 years. The curve is made up of these decisions over time.
The next 15 years of green building practice is going to be
USGBC has created a portal to understand spatial and temporal dimensions. The portal can expose “augmented reality” information about actual projects on the ground. It can capture real information on a real building, so that other projects can be measured against it and brought up to its standards. This technology can also be accessed through the mobile GBIG Analyst.
Nicholas de Monchaux, assistant professor of Architecture and Urban Design at UC Berkeley, talked about “creating a robust nervous system for the cities of today.” The digital tools of today allow us to contemplate this new paradigm.
Constance Bodurow of Lawrence Technological University’s Studio [Ci], a design lab in the College of Architecture, presented the topic “Convergence of Intensity: How to Use Geodesign Tools to Shape a City.” She said they are urbanists interested in the future of urban form, and they believe cities should be the most desirable places for human habitation.
A new urban geography and ecosystem are required which leverage the assets and complex combinations of social, economic, and environmental factors.
Studio [Ci] integrates Esri tools with Google SketchUp to generate unique outcomes. Convergence of Intensity (CI) is a value-based approach which builds on value densification and recommends a new geography of the city. It proposes specific criteria for revitalizing the post-industrial city. “We create 3D extrusions, the city can see it better and have thousands of datasets,” said Bodurow.
The afternoon was devoted to Idea Labs on special topics. The one I attended was entitled BIM/GIS Integration led by Stu Rich of PenBay Solutions, Ihab Hijazi, Danny Kahler and Fred Abler.
The discussion addressed an ongoing debate about Industry Foundation Classes (IFC), an object-oriented file format for interoperability among CAD and, now, building information modeling (BIM) tools. They are currently working on an interoperability platform between BIM and BIM, and want to apply the same approach to the BIM/GIS conversation.
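To make the “pull things out of BIM to put in GIS” question concrete: IFC files are plain-text STEP (ISO 10303-21) files whose data section lists numbered entity instances. The sketch below is a minimal, hypothetical illustration (the sample fragment, entity GUIDs, and names are invented) of how one might inventory entity types in such a file as a first step toward mapping BIM content into GIS features:

```python
import re
from collections import Counter

# A tiny, invented fragment of an IFC (STEP / ISO 10303-21) data section.
# Real IFC files carry full geometry and property sets; this sketch only
# demonstrates pulling out entity types for a BIM-to-GIS inventory.
IFC_SAMPLE = """\
#1=IFCWALL('2O2Fr$t4X7Zf8NOew3FLOH',$,$,'Wall-001',$,$,$,$,$);
#2=IFCDOOR('1hqIFTRjfV6wtfHrVnmx82',$,$,'Door-001',$,$,$,$,$,$,$);
#3=IFCWALL('0jf0XDEPz2GRRfNx9bQz5u',$,$,'Wall-002',$,$,$,$,$);
#4=IFCSPACE('3rNg_N55v4CRBpQVbZJoHB',$,$,'Room-101',$,$,$,$,$,$,$);
"""

# Each instance line looks like "#<id>=<ENTITY>(...);".
ENTITY_RE = re.compile(r"#\d+\s*=\s*(IFC[A-Z0-9]+)\s*\(")

def count_entities(step_text: str) -> Counter:
    """Count IFC entity types appearing in a STEP data section."""
    return Counter(ENTITY_RE.findall(step_text))

counts = count_entities(IFC_SAMPLE)
print(counts["IFCWALL"])  # 2
```

An inventory like this is one plausible answer to the use-case question raised in the lab: decide which entity classes (walls, spaces, doors) are worth carrying over into a GIS layer before attempting a full geometric translation.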
Participants asked the questions: What are use cases, what are problems we are going to solve, and what are we going to pull out of BIM to put in GIS and vice versa?
The day wrapped up with a talk by Kimon Onuma – architect, evangelist for the integration of BIM and GIS, and president of Onuma, Inc. – who has been using BIM since 1993. His clients include the GSA, U.S. Coast Guard and U.S. Army Corps of Engineers, to name a few.
Onuma remarked that the economic slump is the best thing that has happened to the industry – the people who didn’t have time to look at BIM are now looking at it. On the downside, BIM models have become very heavy, and users cannot extract valuable information from them.
Onuma’s viewpoint about technology is that it should be simple, “if we don’t keep it simple, we can’t solve the problem,” he said. A solution should be like an online travel website where you book an airline flight. You ask a question, it gives you an answer.
Onuma has created the BIM Model Server which embodies cloud computing, BIM and GIS, facilities management and other data in real time. It is fast and simple, and allows numbers of people to access the information simultaneously.
He took the audience through the virtual design of a building in Hong Kong, where everyone in the room could click on a link on his site and begin adding design elements. This brainstorming way of designing and pulling in information is called a BIMStorm. What the audience did with Onuma in one hour was a quick example of what an organization would typically accomplish in a day or several days of working together on a real project.
He said the intersection of GIS and BIM is “where it explodes.” Multiple servers talk to each other, and with cloud computing you can create mashups. The building is in a city, the city is part of the world and that’s how it connects together.
January 5th, 2011 by Susan Smith
Autodesk is offering a free* 30-day online trial of AutoCAD LT 2011 software.
Autodesk offers the January ADN plugin of the month: DrawOrder by Layer for AutoCAD
GstarCAD announced the pre-release of GstarCAD 2011 version for public evaluation.
Thermafiber, Inc. and ARCAT have developed Autodesk Revit BIM objects for Thermafiber’s mineral wool insulation products. These objects are available for free download on the ARCAT site and are also accessible on Thermafiber’s website.
January 3rd, 2011 by Susan Smith
The annual Consumer Electronics Show (CES) 2011 in Las Vegas is expected to be “the year of the tablet,” as nearly every PC manufacturer will be unveiling an Android or Windows 7 tablet – with Android expected to be the winner.
December 6th, 2010 by Susan Smith
Autodesk CEO Carl Bass on infinite computing:
What will we see in terms of cost for infinite computing after it’s in place?
You have two things going on simultaneously: there is a steep curve in the declining price of computing – computing is the only asset that’s going down in price while everything else goes up. From the commercial perspective we’re shifting some of the costs from customers back to us. Generally, the people providing this today are not as compute-intensive – like Salesforce.com.
We’re affordably doing it; you can now try AutoCAD LT running off the cloud.
Right now the spot price for cloud computing is at 3 cents an hour.
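At that rate, the economics Bass describes are easy to check with back-of-the-envelope arithmetic. A short Python sketch (the job size below is a hypothetical example, not a figure from the interview):

```python
# Back-of-the-envelope cost of a cloud burst at the quoted spot price of
# 3 cents per instance-hour.
SPOT_PRICE_PER_HOUR = 0.03  # dollars, as quoted in the interview

def burst_cost(instances: int, hours: float,
               rate: float = SPOT_PRICE_PER_HOUR) -> float:
    """Total cost (in dollars, rounded to cents) of running
    `instances` machines for `hours` each."""
    return round(instances * hours * rate, 2)

# Hypothetical job: 500 instances for 2 hours each. The wall-clock time
# of a huge analysis collapses, yet the bill stays modest.
print(burst_cost(500, 2))  # 30.0
```

This is the "pay more for a faster answer" trade-off Bass mentions: the same 1,000 instance-hours cost the same whether spread over one machine for weeks or 500 machines for two hours.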
If I’ve got infinite computing available, when and where do I make the decision to use it?
We’re going to have a hybrid computing model. Because of the tablet, there is incredible local computing power, and you don’t need to be connected. You’ll continue to have local devices – and the cloud for compute-intensive jobs. We don’t build out our own cloud; for most workloads we are trying to use commoditized resources. If you need an answer within a short period of time you pay more; there are some models like this. What if people are able to solve problems they were never able to solve before?
We think the cloud is a choice. Some customers no longer want the local option, where they need their own power and resources; they want another deployment choice. Choice is available to all customers. Pricing models are changing; mobile devices are putting pressure on the market. The way we can use infinite computing is by offering different models for those who only need this software two hours a month.
I’m not sure it puts any fundamental pressure on pricing in general; what pressure it does introduce is offset by greater capability. The price of fundamental resources goes down while capabilities go far up.
What kind of delivery models will you see?
You’ll see electronic software downloads rather than boxes, some people deploying through streaming, and other services that exist purely in the cloud. You’ll have a variety. We’re looking at our subscription program as a way for people to get information on options.
What about Autodesk’s growth?
Our business without acquisitions is no better or worse than in other years; we had a 12–15% growth rate in 2010, and that can be changed by economic conditions and by acquisitions. We have factored in the idea of infinite computing, but at a low level.
Are you addressing multicore?
We have done a lot of multicore work on our products. Multicore works only when you’re doing a lot of the same thing, like sorting a lot of data items. Our studies show that accounts for only about 15 percent of what engineers do. That’s why the breakthrough is making the cloud available: we can run a larger analysis process across more iterations.
We have done some work at the foundation level; there are ways to do things in a multithreaded way. It’s a valuable technique, but not quite as valuable in general-purpose computing as you might think. We’re much more interested in what allows you to optimize an answer to a question.
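Bass’s point that only about 15 percent of the workload is data-parallel maps directly onto Amdahl’s law, which caps overall speedup no matter how many cores are added. A short Python sketch (the core counts are illustrative, not from the interview):

```python
# Amdahl's law: if only a fraction p of a workload can run in parallel,
# the overall speedup on n cores is 1 / ((1 - p) + p / n), and it can
# never exceed 1 / (1 - p) regardless of core count.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only `parallel_fraction` of the work scales."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

p = 0.15  # the ~15 percent figure Bass cites

print(round(amdahl_speedup(p, 4), 3))     # 1.127 on a quad-core machine
print(round(amdahl_speedup(p, 1000), 3))  # 1.176, near the 1/(1-p) ceiling
```

With p = 0.15 the ceiling is 1/0.85 ≈ 1.18x, which is why Bass argues the real win is not more cores on the desktop but cloud capacity for running many independent analysis iterations at once.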
What about the consumer market?
Our customers are mostly professionals; the top 1 percent of accounts generate 30 percent of our revenue, and the rest of our customers account for the remainder. Historically we haven’t done much with consumers. SketchBook Pro is way past 2 million downloads and has done amazingly well; it’s phenomenal in what it’s been able to do in terms of generating awareness. Selling SketchBook at $8.99 is not a way to build a profitable business, but it has done a great job of raising awareness and helping us understand what people are looking for. There is a greater influence of the consumer market flowing back into the professional market.
We need to pay attention to the consumer market and see what is going on, such as the community that gets created around Flickr, that social community around professionals. I don’t think our business will change to become a consumer business, although we have more people coming in at the entry stage as new users and students, a feeder population, and are getting people interested in design and math.
We need tools that everyone can take advantage of.
People are more interested in moving things to mobile devices. Open source was the end of an era – commoditization. There is still open source software out there successfully deployed in server-based environments, but most of our software doesn’t fall into that category.