Analysis of IFC 2×3 CV2.0 Import Certification

Introduction

Before I begin, I should explain for those who don’t know me that I am a passionate supporter of the IFC, COBie and BCF formats, commonly referred to collectively as Open BIM formats. These formats are developed and maintained by the international organisation buildingSMART.

As a company we are users of these formats for our daily work and I personally spend a lot of my time testing and understanding file exchange between a whole host of solutions. We are buildingSMART UK members and I personally sit on the Technical Group for buildingSMART UK. Over the past 7 months we have begun to see certification emerge for buildingSMART’s IFC 2×3 Coordination View 2.0. This certification is split into export and import. The available certification documentation for both the export and import can be found here (there is also a link on our Interoperability resource page).

Analysis

For me, robust certification is absolutely imperative if the standard is to be relied upon in daily project use. This was one of our primary drivers for joining buildingSMART UK.

For the purposes of this analysis I have looked solely at the import certification. The import results began to be published in September, with 6 solutions certified for import to date (in addition, 8 solutions have been certified for export). I originally chose to look at the documentation because I was keen to see how our own software performed and how other software performed against our authoring tool. Straightaway I noticed that trying to compare test results was not as straightforward as I had hoped.

Test files

The first thing that becomes clear is that each software solution has been tested with a different set of files. Of the 6 software solutions certified for import to date, only 12 files have been tested in all of them, out of a total of 59 files that appear in the test result documentation. There may of course be more files, but these are the ones identified in the test results released to date. The remaining 47 files appear to have been tested in various combinations, with some tested in just one solution. Here is a full breakdown of which files have been tested in which software:

[Image: IFC2x3 Coordination View 2.0 Analysis.xlsx (click to enlarge in a separate window)]
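
As an aside, for anyone who wants to reproduce this kind of cross-tabulation from the published result sheets, a minimal sketch is shown below. The solution and file names are placeholders I have invented for illustration, not the actual certification files.

```python
from collections import defaultdict

# Hypothetical (solution, test file) pairs transcribed from the published
# result sheets; the real certification documents use different file names.
tested = [
    ("Solution A", "wall-standard-case.ifc"),
    ("Solution A", "slab-openings.ifc"),
    ("Solution B", "wall-standard-case.ifc"),
    ("Solution B", "slab-openings.ifc"),
    ("Solution C", "wall-standard-case.ifc"),
]

solutions = {solution for solution, _ in tested}

# Build a mapping of test file -> set of solutions it was tested in.
coverage = defaultdict(set)
for solution, test_file in tested:
    coverage[test_file].add(solution)

tested_in_all = [f for f, sols in coverage.items() if sols == solutions]
print(f"{len(coverage)} files appear in the documentation")
print(f"{len(tested_in_all)} of them were tested in every solution")
```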

The big issue here is that there is no clarity about why some files have been tested and not others. This makes it difficult to make direct comparisons between the various software solutions. For me it is simple: all files should be tested in all solutions, and if there is a good reason not to test a file then this should be stated in the documentation. Alternatively there should be a separate document describing the testing regime, so that we can understand the approach to only testing certain files. This clarity and openness about the testing is important if users are to have confidence in the testing regime.

Here is the summary of each software:

[Image: IFC2x3 Coordination View 2.0 Analysis.xlsx (click to enlarge in a separate window)]

By my calculations, if all 59 files were used there would be a total of 1,190 concepts available for testing. It could be that the solutions simply don’t support the concepts in the files that have not been tested. The question is what confidence can be placed in these tests by the user? The analogy I would use is that it’s like having an MOT certificate where the brakes were never tested because the car passed 70% of the other tests! Would you trust that kind of test?

Which version?

Having looked through the first-page summary for each solution, I noted that there was no consistency in documenting which version was used for approval. Solibri’s test results, for example, don’t document which version was certified. Solibri have recently released a new version, so the question is whether version 8 or version 9 is the approved version. This may seem minor, and of course it’s easily fixed, but where testing has been undertaken it is important to provide this clarity.

Pass or fail?

So if we discount the above and accept there must be reasons for only testing certain files, then the next question is what did they pass and fail on? Each test comes with one or more summary sheets, and each file contains a number of concepts to be tested. Whether the software supports each piece of IFC functionality is indicated with three primary colours in the summary sheets: green (supported), yellow (restricted support) and red (not supported). A fourth colour, grey, is used to show where the import functionality is supported although it is not a native function of the software.

What we also need to understand is that the same unsupported functionality may be picked up multiple times in different tests. For example, ARCHICAD has 48 unsupported or restricted features, but these actually boil down to only 3 or 4 underlying issues that are picked up repeatedly in different files. So the headline numbers need to be read alongside the detail.
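
To illustrate the point, here is a minimal sketch of how raw occurrence counts can overstate the number of distinct problems. The concept descriptions are invented for illustration and are not taken from the actual test results.

```python
# Hypothetical findings: (test file, description of unsupported concept).
findings = [
    ("file_01.ifc", "Curtain wall imported as a single object"),
    ("file_07.ifc", "Curtain wall imported as a single object"),
    ("file_12.ifc", "Curtain wall imported as a single object"),
    ("file_03.ifc", "Mapped geometry (reused representations) not supported"),
    ("file_09.ifc", "Mapped geometry (reused representations) not supported"),
]

occurrences = len(findings)  # what a simple tally of the summary sheets gives
distinct_issues = len({description for _, description in findings})

print(f"{occurrences} flagged results, but only {distinct_issues} distinct issues")
```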

As already discussed, the number of concepts varies depending on which files have been used. The following results are an aggregated summary, along with our way of ranking the certifications (this ranking ignores Not Supported and Restricted functionality):

[Image: IFC2x3 Coordination View 2.0 Analysis.xlsx]

This percentage approach is a way of ranking software to provide an insight into how well a piece of software meets the standard. For me this provides a way to understand which solutions are most in line with the standard. Of course it’s probably more difficult for files to be imported into an authoring tool than into a solution that is based entirely on IFC, but it is still useful to see how well each meets the standard. The ranking could be configured differently, and perhaps made more scientific. However, having some form of ranking system would produce competition between vendors rather than simply providing a blanket approval.
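
For clarity, my reading of the ranking above is simply the share of tested concepts marked as fully supported. A minimal sketch of that calculation, using made-up tallies rather than the real certification figures, might look like the following (how the grey “supported but not native” cells should be counted is a separate choice I leave out here):

```python
# Hypothetical per-solution tallies of concept results from the summary sheets.
results = {
    "Solution A": {"supported": 180, "restricted": 12, "not_supported": 8},
    "Solution B": {"supported": 150, "restricted": 30, "not_supported": 20},
}

for name, counts in results.items():
    total = sum(counts.values())
    # Ranking = fully supported concepts as a share of all concepts tested;
    # restricted and not-supported results both count against the score.
    score = counts["supported"] / total * 100
    print(f"{name}: {score:.1f}% of tested concepts fully supported")
```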

Access to the files

The files that have been tested are shown as small thumbnail images within the documentation, indicating the type of geometry tested, but the files themselves are not available. For me these files should be published publicly on the buildingSMART website. This would help us all to understand better what is being tested, and would also help those developing new software, or improving existing software, to provide the best solutions possible. Some might say these should only be available to buildingSMART members, but don’t we all want the format to be adopted and to allow developers of new software to create solutions compatible with the test files?

Conclusion

I know from our own testing that testing IFC is no mean feat, as it’s a very complicated subject with many variables. I appreciate that buildingSMART is largely a volunteer organisation and therefore some compromises may need to be made in the testing procedures. But as a user of IFC it is of the utmost importance that file transfers between different systems can be relied upon. Hopefully providing an ‘outsider’s’ view of the certification process will allow buildingSMART to answer some of these queries, and I will update this blog piece if I get a response to the issues raised.

I personally am using the results to demand better of our own vendor. This is not to say our vendor is not committed to improvement; on the whole they are performing very well. In fact most of the issues flagged as not supported or restricted are not show-stoppers, but we should strive for continual improvement and slowly the minor issues will disappear. The more closely solutions are aligned with the standard, the easier it will be to understand and use IFC. This will also ultimately have the added benefit of helping to speed up the testing procedures.

I should conclude by saying that I strongly believe IFC exchange is pretty good with the vast majority of solutions (note: I won’t say all, because there are some solutions I have never personally tested). We know there are still issues, but they are fewer than the sceptics would make out. We continually report these issues to all software vendors and are hopeful that, by reporting them, they will be fixed in the very near future. We just now need to make sure that the certification process is open, clear and fit for purpose. This will drive up standards, which can only be a good thing for our industry.

Rob Jackson, Associate, Bond Bryan Architects



4 thoughts on “Analysis of IFC 2×3 CV2.0 Import Certification”

  1. Hi Rob

    Congratulations on this work, and the interesting table.

    But please take into account that – except for Nemetschek Scia – these are all CAD applications. Nemetschek Scia is an AEC application for structural engineering purposes (e.g. used by Aecom, Ramboll etc.), and structural engineers have no (or certainly much less) need for transforming all these elements into structural objects (e.g. concrete beams, steel columns, plates and walls etc.), so the % of supported concepts does not reflect daily practice.

    BTW … Nemetschek Scia (or rather, the program Scia Engineer) is actually the only CAE software with IFC2x3 certification. Maybe it’s a good idea to mention that in your article.

    But as I wrote before, this was certainly a good reading.

    Kind regards
    Rudi

  2. We are just starting out on our BIM journey and I had a look at the IFC2x3 certification tests to get some feel for how we could evaluate software options. As you have pointed out, it’s very frustrating as they don’t seem to compare like with like. Any ideas as to how we might evaluate software functionality (business case evaluation seems a lot more straightforward) before committing to any one option?

  3. I want to know the number of employees working at Bond Bryan Architects. Also, is the company a medium-sized or a large company?
