Software composition analysis

{{Use dmy dates|date=February 2023}}
{{Short description|Examining the embedded components of software}}
 
'''Software composition analysis''' (SCA) is a practice in the fields of [[information technology]] and [[software engineering]] for analyzing custom-built software applications to detect embedded open-source software components and determine whether they are up to date, contain security flaws, or have licensing requirements.<ref>
{{Cite journal
|last1=Prana|first1=Gede Artha Azriadi
|last2=Sharma|first2=Abhishek
|last3=Shar|first3=Lwin Khin
|last4=Foo|first4=Darius
|last5=Santosa|first5=Andrew E
|last6=Sharma|first6=Asankhaya
|last7=Lo|first7=David
|date=July 2021
|title= Out of sight, out of mind? How vulnerable dependencies affect open-source projects
|journal=Empirical Software Engineering
|volume=26
|issue=4
|pages=1–34
|article-number=59
|publisher=Springer
|doi=10.1007/s10664-021-09959-3
|s2cid=197679660
|url=https://ink.library.smu.edu.sg/sis_research/6048
}}</ref>
 
==Background==
It is a common [[software engineering]] practice to develop software by using different components.<ref>
{{Cite journal
|last1=Nierstrasz|first1=Oscar
|date=1995
|title= Research directions in software composition
|journal=ACM Computing Surveys (CSUR)
|volume=27
|issue=2
|doi=10.1145/210376.210389
|s2cid=17612128
|doi-access=free
|url=https://dl.acm.org/doi/pdf/10.1145/210376.210389
}}</ref> Using [[Component-based_software_engineering#Software_component|software components]] segments the complexity of larger elements into smaller pieces of code and increases flexibility by enabling easier reuse of components to address new requirements.<ref>
{{Cite book
|title= Object-oriented software composition
|pages=3–28
|publisher=Prentice Hall International (UK) Ltd.
|citeseerx=10.1.1.90.8174
}}</ref> The practice has widely expanded since the late 1990s with the popularization of [[Open-source_software|open-source software]] (OSS) to help speed up the software development process and reduce time to market.<ref>
{{Cite journal
|last1=De Hoon|first1=Michiel JL
|title= Open source clustering software
|journal=Bioinformatics
|volume=20
|issue=9
|pages=1453–1454
|doi=10.1093/bioinformatics/bth078
|publisher=Oxford University Press
|pmid=14871861
|bibcode=2004Bioin..20.1453D
|citeseerx=10.1.1.114.3335
}}</ref>
 
However, using [[open-source software]] introduces many risks for the software applications being developed. These risks can be organized into five categories:<ref>
{{Cite book
|last1=Duc Linh|first1=Nguyen
|last2=Duy Hung|first2=Phan
|last3=Dipe|first3=Vu Thu
|title=Proceedings of the 2019 8th International Conference on Software and Computer Applications
|chapter=Risk Management in Projects Based on Open-Source Software
|date=2019
|pages= 178–183
|doi=10.1145/3316615.3316648
|isbn=9781450365734
|s2cid=153314145
|chapter-url=https://dl.acm.org/doi/pdf/10.1145/3316615.3316648
}}</ref>
* OSS Version Control: risks of changes introduced by new versions
* Security: risks of vulnerabilities in components, known as [[Common Vulnerabilities and Exposures]] (CVEs)
* License: risks of [[Intellectual property|Intellectual Property]] (IP) legal requirements
* Development: risks of compatibility between existing codebase and [[Open-source_software|open-source software]]
* Support: risks of poor documentation and [[Obsolescence|obsolete]] software components
 
Shortly after the foundation of the [[Open Source Initiative]] in February 1998,<ref>{{cite web |url=http://opensource.org/history |title=History of the OSI |date=19 September 2006 |publisher=Opensource.org}}</ref> the risks associated with OSS were raised,<ref>
{{Cite journal
|last1=Payne|first1=Christian
|s2cid=8123076
|url=https://flosshub.org/sites/flosshub.org/files/Payne2002_ISJ12_SecurityOSS.pdf
}}</ref> and organizations tried to manage this using spreadsheets and documents to track all the open source components used by their developers.<ref>
{{Cite journal
|last1=Kaur|first1=Sumandeep
|pages=47–51
|url=http://csjournals.com/IJCSC/PDF11-2/8.%20Suman.pdf
}}</ref>
 
For organizations using open-source components extensively, there was a need to automate the analysis and management of open source risk. This resulted in a new category of software products called software composition analysis (SCA) tools.
SCA strives to detect all the third-party components in use within a software application, to help reduce the risks associated with security vulnerabilities, IP licensing requirements, and obsolescence of the components being used.
 
==Principle of operation==
 
SCA products typically work as follows:<ref>
{{Cite journal
|last1=Ombredanne|first1=Philippe
|issue=10
|pages=262–264
|publisher=IEEE
|doi=10.1109/MC.2020.3011082
|bibcode=2020Compr..53j.105O
|s2cid=222232127
|doi-access=free
|url=https://ieeexplore.ieee.org/document/9206429
}}</ref>
* An engine scans the software source code and the associated artifacts used to compile a software application.
* The engine identifies the OSS components and their versions, and usually stores this information in a database, creating a catalog of OSS in use in the scanned application.
* This catalog is then compared to databases referencing known security vulnerabilities for each component, the licensing requirements for using the component, and the historical versions of the component (a simplified sketch of this step follows the list).<ref>{{Cite web|url=https://insights.sei.cmu.edu/blog/10-types-of-application-security-testing-tools-when-and-how-to-use-them/|title=10 Types of Application Security Testing Tools: When and How to Use Them}}</ref> For security vulnerability detection, this comparison is typically made against known security vulnerabilities (CVEs) that are tracked in the [[National Vulnerability Database]] (NVD). Some products use an additional proprietary database of vulnerabilities.<ref>
{{Cite conference
|last1=Chen|first1=Yang
|last2=Santosa|first2=Andrew E
|last3=Yi|first3=Ang Ming
|last4=Sharma|first4=Abhishek
|last5=Sharma|first5=Asankhaya
|last6=Lo|first6=David
|date=2020
|title=A Machine Learning Approach for Vulnerability Curation
|conference=Proceedings of the 17th International Conference on Mining Software Repositories
|pages=32–42
|doi=10.1145/3379597.3387461
}}</ref> For [[Legal_governance,_risk_management,_and_compliance#Legal_compliance|IP / Legal Compliance]], SCA products extract and evaluate the type of licensing used for the OSS component.<ref>
{{Cite book
|last1=Duan|first1=Ruian
|last2=Bijlani|first2=Ashish
|last4=Kim|first4=Taesoo
|last5=Lee|first5=Wenke
|title=Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security
|chapter=Identifying Open-Source License Violation and 1-day Security Risk at Large Scale
|date=2017
|pages=2169–2185
|publisher=ACM
|isbn=9781450349468
|s2cid=7402387
|chapter-url=https://dl.acm.org/doi/pdf/10.1145/3133956.3134048
}}</ref> Versions of components are extracted from popular open source repositories such as [[GitHub]], [[Apache Maven|Maven]], [[Python Package Index|PyPI]], [[NuGet]], and many others.
* Modern SCA systems have incorporated advanced analysis techniques to improve accuracy and reduce false positives. Notable contributions include '''vulnerable method analysis''', which determines whether vulnerable methods identified in dependencies are actually reachable from the application code. This approach, pioneered by [[Asankhaya Sharma]] and colleagues, uses call graph analysis to trace execution paths from application entry points to vulnerability-specific sinks in third-party libraries.<ref>
{{Cite arxiv
|last1=Foo|first1=Darius
|last2=Yeo|first2=Jason
|last3=Xiao|first3=Hao
|last4=Sharma|first4=Asankhaya
|title=The Dynamics of Software Composition Analysis
|date=2019
|eprint=1909.00973
|class=cs.SE
}}</ref>
* '''Hybrid static-dynamic analysis''' techniques combine statically-constructed call graphs with dynamic instrumentation to improve the performance of false positive elimination. This modular approach addresses limitations of purely static analysis, which can introduce both false positives and false negatives on real-world projects.<ref>
{{Cite arxiv
|last1=Foo|first1=Darius
|last2=Yeo|first2=Jason
|last3=Xiao|first3=Hao
|last4=Sharma|first4=Asankhaya
|title=The Dynamics of Software Composition Analysis
|date=2019
|eprint=1909.00973
|class=cs.SE
}}</ref>
* '''Machine learning-based vulnerability curation''' automates the process of building and maintaining vulnerability databases by predicting the vulnerability-relatedness of data items from various sources such as bug tracking systems, commits, and mailing lists. These systems use self-training techniques to iteratively improve model quality and include deployment stability metrics to evaluate new models before production deployment.<ref>
{{Cite conference
|last1=Chen|first1=Yang
|last2=Santosa|first2=Andrew E
|last3=Yi|first3=Ang Ming
|last4=Sharma|first4=Abhishek
|last5=Sharma|first5=Asankhaya
|last6=Lo|first6=David
|date=2020
|title=A Machine Learning Approach for Vulnerability Curation
|conference=Proceedings of the 17th International Conference on Mining Software Repositories
|pages=32–42
|doi=10.1145/3379597.3387461
}}</ref>
* '''Natural language processing techniques''' for automated vulnerability identification analyze commit messages and bug reports to identify security-related issues that may not have been publicly disclosed. This approach uses machine learning classifiers trained on textual features extracted from development artifacts to discover previously unknown vulnerabilities in open-source libraries.<ref>
{{Cite conference
|last1=Zhou|first1=Yaqin
|last2=Sharma|first2=Asankhaya
|date=2017
|title=Automated identification of security issues from commit messages and bug reports
|conference=Proceedings of the 2017 11th Joint Meeting on Foundations of Software Engineering
|pages=914–919
|doi=10.1145/3106237.3106293
}}</ref>
* The results are then made available to end users in different digital formats. The content and format depend on the SCA product and may include guidance to evaluate and interpret the risk, as well as recommendations, especially concerning the legal requirements of open source components such as [[Copyleft#Strong_and_weak_copyleft|strong or weak copyleft]] licensing. The output may also contain a [[Software supply chain|Software Bill of Materials]] (SBOM) detailing all the open source components and associated attributes used in a software application.<ref>
{{Cite journal
|last1=Arora|first1=Arushi
|date=2022
|title= Strengthening the Security of Operational Technology: Understanding Contemporary Bill of Materials
|journal=JCIP the Journal of Critical Infrastructure Policy
|volume=3
|pages=111–135
|doi=10.18278/jcip.3.1.8
|url=https://www.jcip1.org/uploads/1/3/6/5/136597491/jcip_3.1_online.pdf#page=117
}}</ref>
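
The comparison step can be illustrated with a short, hypothetical sketch (here in Python); the component names, versions, vulnerability identifiers, and licenses below are invented, and the example does not reflect the internals of any particular SCA product. Real tools query live data sources such as the NVD and package registries.

<syntaxhighlight lang="python">
# Minimal illustration: match a catalog of detected OSS components against
# toy vulnerability and licence lists, and emit a simple SBOM-like report.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    version: str

# Catalog produced by the scanning engine (illustrative values only).
catalog = [Component("example-lib", "1.2.0"), Component("other-lib", "4.0.1")]

# Toy stand-ins for a vulnerability database and a licence database.
known_vulnerabilities = {("example-lib", "1.2.0"): ["CVE-0000-0001"]}
known_licenses = {"example-lib": "Apache-2.0", "other-lib": "GPL-3.0-only"}

def build_report(components):
    """Return an SBOM-like list of components with vulnerability and licence data."""
    return [{
        "name": c.name,
        "version": c.version,
        "vulnerabilities": known_vulnerabilities.get((c.name, c.version), []),
        "license": known_licenses.get(c.name, "unknown"),
    } for c in components]

for entry in build_report(catalog):
    print(entry)
</syntaxhighlight>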
 
==Advanced techniques==
 
Since the early 2010s, researchers have developed several advanced techniques to improve the accuracy and efficiency of SCA tools:
 
===Vulnerable method analysis===
Vulnerable method analysis addresses the problem of determining whether a vulnerability in a third-party library poses an actual risk to an application. Rather than simply detecting the presence of vulnerable libraries, this technique analyzes whether the specific vulnerable methods within those libraries are reachable from the application's execution paths. The approach involves constructing call graphs that map the relationships between application code and library methods, then determining if there exists a path from application entry points to vulnerability-specific sinks in the libraries.<ref>
{{Cite arxiv
|last1=Foo|first1=Darius
|last2=Yeo|first2=Jason
|last3=Xiao|first3=Hao
|last4=Sharma|first4=Asankhaya
|title=The Dynamics of Software Composition Analysis
|date=2019
|eprint=1909.00973
|class=cs.SE
}}</ref>
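
A minimal sketch of this reachability check is shown below, assuming an invented call graph and method names; real tools construct the call graph from the application's code or bytecode.

<syntaxhighlight lang="python">
# Given a call graph and a set of methods known to be vulnerable, check
# whether any vulnerable method is reachable from the application's entry
# points. The graph and names here are hypothetical.
from collections import deque

call_graph = {
    "app.main": ["app.parse", "lib.render"],
    "app.parse": ["lib.decode"],
    "lib.render": [],
    "lib.decode": ["lib.unsafe_eval"],
    "lib.unsafe_eval": [],
}
vulnerable_methods = {"lib.unsafe_eval"}
entry_points = ["app.main"]

def reachable_vulnerabilities(graph, entries, vulnerable):
    """Breadth-first search from entry points; return vulnerable methods actually reached."""
    seen, queue, hits = set(entries), deque(entries), set()
    while queue:
        method = queue.popleft()
        if method in vulnerable:
            hits.add(method)
        for callee in graph.get(method, []):
            if callee not in seen:
                seen.add(callee)
                queue.append(callee)
    return hits

print(reachable_vulnerabilities(call_graph, entry_points, vulnerable_methods))
# {'lib.unsafe_eval'} -> the vulnerability is reachable and therefore relevant
</syntaxhighlight>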
 
===Machine learning for vulnerability databases===
Traditional vulnerability databases rely on manual curation by security researchers, which can be time-intensive and may miss relevant vulnerabilities. Machine learning approaches automate this process by training models to predict whether data items from various sources (such as bug reports, commits, and mailing lists) are vulnerability-related. These systems implement complete pipelines from data collection through model training and prediction, with iterative improvement mechanisms that generate better models as new data becomes available.<ref>
{{Cite conference
|last1=Chen|first1=Yang
|last2=Santosa|first2=Andrew E
|last3=Yi|first3=Ang Ming
|last4=Sharma|first4=Abhishek
|last5=Sharma|first5=Asankhaya
|last6=Lo|first6=David
|date=2020
|title=A Machine Learning Approach for Vulnerability Curation
|conference=Proceedings of the 17th International Conference on Mining Software Repositories
|pages=32–42
|doi=10.1145/3379597.3387461
}}</ref>
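
A minimal sketch of a single self-training round follows, using invented commit messages, the scikit-learn library, and an arbitrary confidence threshold; production pipelines operate on much larger corpora and add stability checks before deployment.

<syntaxhighlight lang="python">
# Train a text classifier on a small labelled set, adopt high-confidence
# predictions on unlabelled texts as pseudo-labels, then retrain.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labelled = [
    ("fix buffer overflow in parser", 1),
    ("patch XSS in template rendering", 1),
    ("update README wording", 0),
    ("bump copyright year", 0),
]
unlabelled = ["harden input validation against injection", "refactor build script"]

texts = [t for t, _ in labelled]
labels = [y for _, y in labelled]

vectorizer = TfidfVectorizer()
clf = LogisticRegression()
clf.fit(vectorizer.fit_transform(texts), labels)

# One self-training round: keep only confident predictions as pseudo-labels.
probabilities = clf.predict_proba(vectorizer.transform(unlabelled))
for text, p in zip(unlabelled, probabilities):
    if max(p) > 0.8:  # confidence threshold, chosen arbitrarily for the sketch
        texts.append(text)
        labels.append(int(p[1] > 0.5))

clf.fit(vectorizer.fit_transform(texts), labels)  # retrain on the enlarged set
print(clf.predict(vectorizer.transform(["possible RCE via deserialization"])))
</syntaxhighlight>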
 
===Static analysis for library compatibility===
As SCA tools increasingly recommend library updates to address vulnerabilities, ensuring compatibility becomes critical. Advanced static analysis techniques can automatically detect [[API]] incompatibilities that would be introduced by library upgrades, enabling automated vulnerability remediation without breaking existing functionality. These lightweight analyses are designed to integrate into [[continuous integration]] and [[continuous delivery]] pipelines.<ref>
{{Cite conference
|last1=Foo|first1=Darius
|last2=Chua|first2=Hendy
|last3=Yeo|first3=Jason
|last4=Ang|first4=Ming Yi
|last5=Sharma|first5=Asankhaya
|date=2018
|title=Efficient static checking of library updates
|conference=Proceedings of the 2018 26th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering
|pages=791–796
|doi=10.1145/3236024.3275535
}}</ref>
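
A minimal sketch of such a compatibility check is given below, using hand-written stand-ins for the public APIs of two library versions; a real analysis would extract these descriptions from the libraries themselves.

<syntaxhighlight lang="python">
# Compare the public functions and parameter lists of two library versions
# and report removed functions or changed signatures that could break callers.
old_api = {
    "connect": ["host", "port"],
    "send": ["payload"],
    "close": [],
}
new_api = {
    "connect": ["host", "port", "timeout"],  # trailing addition, treated as non-breaking here
    "close": [],                             # unchanged
    # "send" was removed in the new version
}

def incompatibilities(old, new):
    """Return human-readable descriptions of breaking API changes."""
    issues = []
    for name, params in old.items():
        if name not in new:
            issues.append(f"removed function: {name}")
        elif new[name][: len(params)] != params:
            issues.append(f"changed signature: {name}{tuple(new[name])}")
    return issues

for issue in incompatibilities(old_api, new_api):
    print(issue)
# removed function: send
</syntaxhighlight>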
 
== Usage ==
As SCA impacts different functions in organizations, different teams may use the data depending on the organization's size and structure. The IT department will often use SCA for implementing and operationalizing the technology, with common stakeholders including the chief information officer (CIO), the chief technology officer (CTO), and the chief enterprise architect (EA).<ref name=SBM_1>{{cite web |title=Software bill of materials: Managing software cybersecurity risks |author1=Bailey, T. |author2=Greis, J. |author3=Watters, M. |author4=Welle, J. |url=https://www.mckinsey.com/capabilities/risk-and-resilience/our-insights/cybersecurity/software-bill-of-materials-managing-software-cybersecurity-risks |publisher=[[McKinsey & Company]] |date=19 September 2022 |access-date=6 January 2024}}</ref> Security and license data are often used by roles such as the chief information security officer (CISO) for security risks, and the chief IP / compliance officer for intellectual property risk management.<ref>{{cite book |last=Popp |first=Karl Michael |date=30 October 2019 |title=Best Practices for commercial use of open source software |url=https://books.google.com/books?id=w1a6DwAAQBAJ |publisher=BoD – Books on Demand |page=10 |isbn=9783750403093}}</ref>
 
Depending on the SCA product capabilities, it can be implemented directly within the [[integrated development environment]] (IDE) of the developers who use and integrate OSS components, or it can be implemented as a dedicated step in the [[Software_quality_control|software quality control]] process.<ref>
{{Cite book
|last1= Imtiaz|first1=Nasif
|last2=Thorn|first2=Seaver
|last3=Williams|first3=Laurie
|title=Proceedings of the 15th ACM / IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM)
|chapter=A comparative study of vulnerability reporting by software composition analysis tools
|date=October 2021
|pages=1–11
|publisher=ACM
|doi=10.1145/3475716.3475769
|arxiv=2108.12078
|isbn=9781450386654
|s2cid=237346987
|chapter-url=https://dl.acm.org/doi/abs/10.1145/3475716.3475769
}}</ref><ref>
{{Cite book
|last1=Sun|first1=Xiaohan
|last2=Cheng|first2=Yunchang
|last3=Qu|first3=Xiaojie
|last4=Li|first4=Hang
|title=2021 IEEE 4th Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC)
|chapter=Design and Implementation of Security Test Pipeline based on DevSecOps
|date=June 2021
|volume=4
|pages=532–535
|publisher=IEEE
|doi=10.1109/IMCEC51613.2021.9482270
|isbn=978-1-7281-8535-4
|url=https://ieeexplore.ieee.org/abstract/document/9482270
|s2cid=236193144
}}</ref>
 
SCA products, and particularly their capacity to generate an SBOM, are required in some countries such as the [[United States]] to enforce the security of software delivered to its agencies by vendors.<ref name=SB_1>{{cite journal |title=Software Bill of Materials Elements and Considerations |url=https://www.federalregister.gov/documents/2021/06/02/2021-11592/software-bill-of-materials-elements-and-considerations |journal=[[Federal Register]] |date=2 June 2021 |access-date=6 January 2024}}</ref>
 
Another common use case for SCA is technology [[due diligence]]. Prior to a [[Mergers and acquisitions|merger and acquisition]] (M&A) transaction, [[Independent advisory firm|advisory firms]] review the risks associated with the target firm's software.<ref>
{{Cite book
|last1=Serafini|first1=Daniele
|last2=Zacchiroli|first2=Stefano
|title=The 18th International Symposium on Open Collaboration
|chapter=Efficient Prior Publication Identification for Open Source Code
|date=September 2022
|volume=4
|pages=1–8
|publisher=ACM
|doi=10.1145/3555051.3555068
|arxiv=2207.11057
|isbn=9781450398459
|s2cid=251018650
|chapter-url=https://dl.acm.org/doi/abs/10.1145/3555051.3555068
}}</ref>
 
== Strengths ==
The automatic nature of SCA products is their primary strength: developers do not have to do extra manual work when using and integrating OSS components.<ref>
{{Cite book
|last1=Chen|first1=Yang
|last2=Santosa|first2=Andrew E
|last3=Sharma|first3=Asankhaya
|last4=Lo|first4=David
|title=Proceedings of the ACM/IEEE 42nd International Conference on Software Engineering: Software Engineering in Practice
|chapter=Automated identification of libraries from vulnerability data
|date=September 2020
|pages=90–99
|doi=10.1145/3377813.3381360
|isbn=9781450371230
|s2cid=211167417
|url=https://ink.library.smu.edu.sg/sis_research/5501
|chapter-url=https://dl.acm.org/doi/pdf/10.1145/3377813.3381360
}}</ref> The automation also applies to indirect references to other OSS components within code and artifacts.<ref>
{{Cite book
|last1=Kengo Oka|first1=Dennis
|chapter= Software Composition Analysis in the Automotive Industry
|title=Building Secure Cars
|date=2021
|pages=91–110
|publisher=Wiley
|doi=10.1002/9781119710783.ch6
|isbn=9781119710783
|s2cid=233582862
}}</ref>
 
Modern SCA implementations have significantly improved accuracy through advanced analysis techniques. Vulnerable method analysis reduces false positives by determining actual reachability of vulnerable code paths, while machine learning approaches for vulnerability curation help maintain more comprehensive and up-to-date vulnerability databases. These advances address many traditional limitations of metadata-only approaches.<ref>
{{Cite arxiv
|last1=Foo|first1=Darius
|last2=Yeo|first2=Jason
|last3=Xiao|first3=Hao
|last4=Sharma|first4=Asankhaya
|title=The Dynamics of Software Composition Analysis
|date=2019
|eprint=1909.00973
|class=cs.SE
}}</ref>
 
== Weaknesses ==
Conversely, some key weaknesses of current SCA products may include:
* Complex and labor-intensive deployment that can take months to become fully operational<ref>
{{Cite book
|last1=Rajapakse|first1=Roshan Namal
|last2=Zahedi|first2=Mansooreh
|last3=Babar|first3=Muhammad Ali
|title=Proceedings of the 15th ACM / IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM)
|chapter=An Empirical Analysis of Practitioners' Perspectives on Security Tool Integration into DevOps
|date=2021
|pages=1–12
|doi=10.1145/3475716.3475776
|arxiv=2107.02096
|isbn=9781450386654
|s2cid=235731939
|chapter-url=https://dl.acm.org/doi/pdf/10.1145/3475716.3475776
}}</ref>
* Each product uses its own proprietary database of OSS components that can vary dramatically in terms of size and coverage<ref>
{{Cite book
|last1=Imtiaz|first1=Nasif
|last2=Thorn|first2=Seaver
|last3=Williams|first3=Laurie
|title=Proceedings of the 15th ACM / IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM)
|chapter=A comparative study of vulnerability reporting by software composition analysis tools
|date=2021
|pages=1–11
|doi=10.1145/3475716.3475769
|arxiv=2108.12078
|isbn=9781450386654
|s2cid=237346987
|chapter-url=https://dl.acm.org/doi/pdf/10.1145/3475716.3475769
}}</ref>
* Vulnerability data limited to vulnerabilities officially reported in the NVD (which can be months after the vulnerability was originally discovered)<ref>{{Cite web|url=https://owasp.org/www-community/Component_Analysis|title=Component Analysis|website=owasp.org}}</ref>
* Lack of automated guidance on actions to take based on SCA reports and data<ref>
{{Cite book
|last1=Foo|first1=Darius
|last2=Chua|first2=Hendy
|last4=Ang|first4=Ming Yi
|last5=Sharma|first5=Asankhaya
|title=Proceedings of the 2018 26th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering
|chapter=Efficient static checking of library updates
|date=2018
|pages=791–796
|doi=10.1145/3236024.3275535
|isbn=9781450355735
|s2cid=53079466
|chapter-url=https://dl.acm.org/doi/pdf/10.1145/3236024.3275535
}}</ref>
* Lack of guidance on the legal requirements of OSS licenses that are detected<ref>
{{Cite web
|last1=Millar|first1=Stuart
|date=November 2017
|title= Vulnerability Detection in Open Source Software: The Cure and the Cause
|publisher=Queen's University Belfast
|url=https://pureadmin.qub.ac.uk/ws/portalfiles/portal/128394396/SMillar_13616005_VulnerabilityDetectionInOSS.pdf
}}</ref>

==See also==
* [[Security testing]]
* [[Open-source_software|Open-source software]]
* [[Common Vulnerabilities and Exposures]]
* [[Open-source license]]
* [[Software intelligence]]
* [[Asankhaya Sharma]]
* [[Static program analysis]]
* [[Call graph]]
 
==References==
{{reflist}}
 
[[Category:Information technology governance]]
[[Category:Software]]