From 1422b720c42e9c74b91f1fdb7111d51f13ef2166 Mon Sep 17 00:00:00 2001 From: Packit Date: Sep 23 2020 13:25:39 +0000 Subject: liblognorm-2.0.5 base --- diff --git a/AUTHORS b/AUTHORS new file mode 100644 index 0000000..56f04a7 --- /dev/null +++ b/AUTHORS @@ -0,0 +1 @@ +Rainer Gerhards , Adiscon GmbH diff --git a/COPYING b/COPYING new file mode 100644 index 0000000..621f6cd --- /dev/null +++ b/COPYING @@ -0,0 +1,490 @@ + + GNU LESSER GENERAL PUBLIC LICENSE + Version 2.1, February 1999 + + Copyright (C) 1991, 1999 Free Software Foundation, Inc. + 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + Everyone is permitted to copy and distribute verbatim copies + of this license document, but changing it is not allowed. + +[This is the first released version of the Lesser GPL. It also counts + as the successor of the GNU Library Public License, version 2, hence + the version number 2.1.] + + Preamble + + The licenses for most software are designed to take away your +freedom to share and change it. By contrast, the GNU General Public +Licenses are intended to guarantee your freedom to share and change +free software--to make sure the software is free for all its users. + + This license, the Lesser General Public License, applies to some +specially designated software packages--typically libraries--of the +Free Software Foundation and other authors who decide to use it. You +can use it too, but we suggest you first think carefully about whether +this license or the ordinary General Public License is the better +strategy to use in any particular case, based on the explanations below. + + When we speak of free software, we are referring to freedom of use, +not price. Our General Public Licenses are designed to make sure that +you have the freedom to distribute copies of free software (and charge +for this service if you wish); that you receive source code or can get +it if you want it; that you can change the software and use pieces of +it in new free programs; and that you are informed that you can do +these things. + + To protect your rights, we need to make restrictions that forbid +distributors to deny you these rights or to ask you to surrender these +rights. These restrictions translate to certain responsibilities for +you if you distribute copies of the library or if you modify it. + + For example, if you distribute copies of the library, whether gratis +or for a fee, you must give the recipients all the rights that we gave +you. You must make sure that they, too, receive or can get the source +code. If you link other code with the library, you must provide +complete object files to the recipients, so that they can relink them +with the library after making changes to the library and recompiling +it. And you must show them these terms so they know their rights. + + We protect your rights with a two-step method: (1) we copyright the +library, and (2) we offer you this license, which gives you legal +permission to copy, distribute and/or modify the library. + + To protect each distributor, we want to make it very clear that +there is no warranty for the free library. Also, if the library is +modified by someone else and passed on, the recipients should know +that what they have is not the original version, so that the original +author's reputation will not be affected by problems that might be +introduced by others. + + Finally, software patents pose a constant threat to the existence of +any free program. 
We wish to make sure that a company cannot +effectively restrict the users of a free program by obtaining a +restrictive license from a patent holder. Therefore, we insist that +any patent license obtained for a version of the library must be +consistent with the full freedom of use specified in this license. + + Most GNU software, including some libraries, is covered by the +ordinary GNU General Public License. This license, the GNU Lesser +General Public License, applies to certain designated libraries, and +is quite different from the ordinary General Public License. We use +this license for certain libraries in order to permit linking those +libraries into non-free programs. + + When a program is linked with a library, whether statically or using +a shared library, the combination of the two is legally speaking a +combined work, a derivative of the original library. The ordinary +General Public License therefore permits such linking only if the +entire combination fits its criteria of freedom. The Lesser General +Public License permits more lax criteria for linking other code with +the library. + + We call this license the "Lesser" General Public License because it +does Less to protect the user's freedom than the ordinary General +Public License. It also provides other free software developers Less +of an advantage over competing non-free programs. These disadvantages +are the reason we use the ordinary General Public License for many +libraries. However, the Lesser license provides advantages in certain +special circumstances. + + For example, on rare occasions, there may be a special need to +encourage the widest possible use of a certain library, so that it becomes +a de-facto standard. To achieve this, non-free programs must be +allowed to use the library. A more frequent case is that a free +library does the same job as widely used non-free libraries. In this +case, there is little to gain by limiting the free library to free +software only, so we use the Lesser General Public License. + + In other cases, permission to use a particular library in non-free +programs enables a greater number of people to use a large body of +free software. For example, permission to use the GNU C Library in +non-free programs enables many more people to use the whole GNU +operating system, as well as its variant, the GNU/Linux operating +system. + + Although the Lesser General Public License is Less protective of the +users' freedom, it does ensure that the user of a program that is +linked with the Library has the freedom and the wherewithal to run +that program using a modified version of the Library. + + The precise terms and conditions for copying, distribution and +modification follow. Pay close attention to the difference between a +"work based on the library" and a "work that uses the library". The +former contains code derived from the library, whereas the latter must +be combined with the library in order to run. + + GNU LESSER GENERAL PUBLIC LICENSE + TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION + + 0. This License Agreement applies to any software library or other +program which contains a notice placed by the copyright holder or +other authorized party saying it may be distributed under the terms of +this Lesser General Public License (also called "this License"). +Each licensee is addressed as "you". 
+ + A "library" means a collection of software functions and/or data +prepared so as to be conveniently linked with application programs +(which use some of those functions and data) to form executables. + + The "Library", below, refers to any such software library or work +which has been distributed under these terms. A "work based on the +Library" means either the Library or any derivative work under +copyright law: that is to say, a work containing the Library or a +portion of it, either verbatim or with modifications and/or translated +straightforwardly into another language. (Hereinafter, translation is +included without limitation in the term "modification".) + + "Source code" for a work means the preferred form of the work for +making modifications to it. For a library, complete source code means +all the source code for all modules it contains, plus any associated +interface definition files, plus the scripts used to control compilation +and installation of the library. + + Activities other than copying, distribution and modification are not +covered by this License; they are outside its scope. The act of +running a program using the Library is not restricted, and output from +such a program is covered only if its contents constitute a work based +on the Library (independent of the use of the Library in a tool for +writing it). Whether that is true depends on what the Library does +and what the program that uses the Library does. + + 1. You may copy and distribute verbatim copies of the Library's +complete source code as you receive it, in any medium, provided that +you conspicuously and appropriately publish on each copy an +appropriate copyright notice and disclaimer of warranty; keep intact +all the notices that refer to this License and to the absence of any +warranty; and distribute a copy of this License along with the +Library. + + You may charge a fee for the physical act of transferring a copy, +and you may at your option offer warranty protection in exchange for a +fee. + + 2. You may modify your copy or copies of the Library or any portion +of it, thus forming a work based on the Library, and copy and +distribute such modifications or work under the terms of Section 1 +above, provided that you also meet all of these conditions: + + a) The modified work must itself be a software library. + + b) You must cause the files modified to carry prominent notices + stating that you changed the files and the date of any change. + + c) You must cause the whole of the work to be licensed at no + charge to all third parties under the terms of this License. + + d) If a facility in the modified Library refers to a function or a + table of data to be supplied by an application program that uses + the facility, other than as an argument passed when the facility + is invoked, then you must make a good faith effort to ensure that, + in the event an application does not supply such function or + table, the facility still operates, and performs whatever part of + its purpose remains meaningful. + + (For example, a function in a library to compute square roots has + a purpose that is entirely well-defined independent of the + application. Therefore, Subsection 2d requires that any + application-supplied function or table used by this function must + be optional: if the application does not supply it, the square + root function must still compute square roots.) + +These requirements apply to the modified work as a whole. 
If +identifiable sections of that work are not derived from the Library, +and can be reasonably considered independent and separate works in +themselves, then this License, and its terms, do not apply to those +sections when you distribute them as separate works. But when you +distribute the same sections as part of a whole which is a work based +on the Library, the distribution of the whole must be on the terms of +this License, whose permissions for other licensees extend to the +entire whole, and thus to each and every part regardless of who wrote +it. + +Thus, it is not the intent of this section to claim rights or contest +your rights to work written entirely by you; rather, the intent is to +exercise the right to control the distribution of derivative or +collective works based on the Library. + +In addition, mere aggregation of another work not based on the Library +with the Library (or with a work based on the Library) on a volume of +a storage or distribution medium does not bring the other work under +the scope of this License. + + 3. You may opt to apply the terms of the ordinary GNU General Public +License instead of this License to a given copy of the Library. To do +this, you must alter all the notices that refer to this License, so +that they refer to the ordinary GNU General Public License, version 2, +instead of to this License. (If a newer version than version 2 of the +ordinary GNU General Public License has appeared, then you can specify +that version instead if you wish.) Do not make any other change in +these notices. + + Once this change is made in a given copy, it is irreversible for +that copy, so the ordinary GNU General Public License applies to all +subsequent copies and derivative works made from that copy. + + This option is useful when you wish to copy part of the code of +the Library into a program that is not a library. + + 4. You may copy and distribute the Library (or a portion or +derivative of it, under Section 2) in object code or executable form +under the terms of Sections 1 and 2 above provided that you accompany +it with the complete corresponding machine-readable source code, which +must be distributed under the terms of Sections 1 and 2 above on a +medium customarily used for software interchange. + + If distribution of object code is made by offering access to copy +from a designated place, then offering equivalent access to copy the +source code from the same place satisfies the requirement to +distribute the source code, even though third parties are not +compelled to copy the source along with the object code. + + 5. A program that contains no derivative of any portion of the +Library, but is designed to work with the Library by being compiled or +linked with it, is called a "work that uses the Library". Such a +work, in isolation, is not a derivative work of the Library, and +therefore falls outside the scope of this License. + + However, linking a "work that uses the Library" with the Library +creates an executable that is a derivative of the Library (because it +contains portions of the Library), rather than a "work that uses the +library". The executable is therefore covered by this License. +Section 6 states terms for distribution of such executables. + + When a "work that uses the Library" uses material from a header file +that is part of the Library, the object code for the work may be a +derivative work of the Library even though the source code is not. 
+Whether this is true is especially significant if the work can be +linked without the Library, or if the work is itself a library. The +threshold for this to be true is not precisely defined by law. + + If such an object file uses only numerical parameters, data +structure layouts and accessors, and small macros and small inline +functions (ten lines or less in length), then the use of the object +file is unrestricted, regardless of whether it is legally a derivative +work. (Executables containing this object code plus portions of the +Library will still fall under Section 6.) + + Otherwise, if the work is a derivative of the Library, you may +distribute the object code for the work under the terms of Section 6. +Any executables containing that work also fall under Section 6, +whether or not they are linked directly with the Library itself. + + 6. As an exception to the Sections above, you may also combine or +link a "work that uses the Library" with the Library to produce a +work containing portions of the Library, and distribute that work +under terms of your choice, provided that the terms permit +modification of the work for the customer's own use and reverse +engineering for debugging such modifications. + + You must give prominent notice with each copy of the work that the +Library is used in it and that the Library and its use are covered by +this License. You must supply a copy of this License. If the work +during execution displays copyright notices, you must include the +copyright notice for the Library among them, as well as a reference +directing the user to the copy of this License. Also, you must do one +of these things: + + a) Accompany the work with the complete corresponding + machine-readable source code for the Library including whatever + changes were used in the work (which must be distributed under + Sections 1 and 2 above); and, if the work is an executable linked + with the Library, with the complete machine-readable "work that + uses the Library", as object code and/or source code, so that the + user can modify the Library and then relink to produce a modified + executable containing the modified Library. (It is understood + that the user who changes the contents of definitions files in the + Library will not necessarily be able to recompile the application + to use the modified definitions.) + + b) Use a suitable shared library mechanism for linking with the + Library. A suitable mechanism is one that (1) uses at run time a + copy of the library already present on the user's computer system, + rather than copying library functions into the executable, and (2) + will operate properly with a modified version of the library, if + the user installs one, as long as the modified version is + interface-compatible with the version that the work was made with. + + c) Accompany the work with a written offer, valid for at + least three years, to give the same user the materials + specified in Subsection 6a, above, for a charge no more + than the cost of performing this distribution. + + d) If distribution of the work is made by offering access to copy + from a designated place, offer equivalent access to copy the above + specified materials from the same place. + + e) Verify that the user has already received a copy of these + materials or that you have already sent this user a copy. + + For an executable, the required form of the "work that uses the +Library" must include any data and utility programs needed for +reproducing the executable from it. 
However, as a special exception, +the materials to be distributed need not include anything that is +normally distributed (in either source or binary form) with the major +components (compiler, kernel, and so on) of the operating system on +which the executable runs, unless that component itself accompanies +the executable. + + It may happen that this requirement contradicts the license +restrictions of other proprietary libraries that do not normally +accompany the operating system. Such a contradiction means you cannot +use both them and the Library together in an executable that you +distribute. + + 7. You may place library facilities that are a work based on the +Library side-by-side in a single library together with other library +facilities not covered by this License, and distribute such a combined +library, provided that the separate distribution of the work based on +the Library and of the other library facilities is otherwise +permitted, and provided that you do these two things: + + a) Accompany the combined library with a copy of the same work + based on the Library, uncombined with any other library + facilities. This must be distributed under the terms of the + Sections above. + + b) Give prominent notice with the combined library of the fact + that part of it is a work based on the Library, and explaining + where to find the accompanying uncombined form of the same work. + + 8. You may not copy, modify, sublicense, link with, or distribute +the Library except as expressly provided under this License. Any +attempt otherwise to copy, modify, sublicense, link with, or +distribute the Library is void, and will automatically terminate your +rights under this License. However, parties who have received copies, +or rights, from you under this License will not have their licenses +terminated so long as such parties remain in full compliance. + + 9. You are not required to accept this License, since you have not +signed it. However, nothing else grants you permission to modify or +distribute the Library or its derivative works. These actions are +prohibited by law if you do not accept this License. Therefore, by +modifying or distributing the Library (or any work based on the +Library), you indicate your acceptance of this License to do so, and +all its terms and conditions for copying, distributing or modifying +the Library or works based on it. + + 10. Each time you redistribute the Library (or any work based on the +Library), the recipient automatically receives a license from the +original licensor to copy, distribute, link with or modify the Library +subject to these terms and conditions. You may not impose any further +restrictions on the recipients' exercise of the rights granted herein. +You are not responsible for enforcing compliance by third parties with +this License. + + 11. If, as a consequence of a court judgment or allegation of patent +infringement or for any other reason (not limited to patent issues), +conditions are imposed on you (whether by court order, agreement or +otherwise) that contradict the conditions of this License, they do not +excuse you from the conditions of this License. If you cannot +distribute so as to satisfy simultaneously your obligations under this +License and any other pertinent obligations, then as a consequence you +may not distribute the Library at all. 
For example, if a patent +license would not permit royalty-free redistribution of the Library by +all those who receive copies directly or indirectly through you, then +the only way you could satisfy both it and this License would be to +refrain entirely from distribution of the Library. + +If any portion of this section is held invalid or unenforceable under any +particular circumstance, the balance of the section is intended to apply, +and the section as a whole is intended to apply in other circumstances. + +It is not the purpose of this section to induce you to infringe any +patents or other property right claims or to contest validity of any +such claims; this section has the sole purpose of protecting the +integrity of the free software distribution system which is +implemented by public license practices. Many people have made +generous contributions to the wide range of software distributed +through that system in reliance on consistent application of that +system; it is up to the author/donor to decide if he or she is willing +to distribute software through any other system and a licensee cannot +impose that choice. + +This section is intended to make thoroughly clear what is believed to +be a consequence of the rest of this License. + + 12. If the distribution and/or use of the Library is restricted in +certain countries either by patents or by copyrighted interfaces, the +original copyright holder who places the Library under this License may add +an explicit geographical distribution limitation excluding those countries, +so that distribution is permitted only in or among countries not thus +excluded. In such case, this License incorporates the limitation as if +written in the body of this License. + + 13. The Free Software Foundation may publish revised and/or new +versions of the Lesser General Public License from time to time. +Such new versions will be similar in spirit to the present version, +but may differ in detail to address new problems or concerns. + +Each version is given a distinguishing version number. If the Library +specifies a version number of this License which applies to it and +"any later version", you have the option of following the terms and +conditions either of that version or of any later version published by +the Free Software Foundation. If the Library does not specify a +license version number, you may choose any version ever published by +the Free Software Foundation. + + 14. If you wish to incorporate parts of the Library into other free +programs whose distribution conditions are incompatible with these, +write to the author to ask for permission. For software which is +copyrighted by the Free Software Foundation, write to the Free +Software Foundation; we sometimes make exceptions for this. Our +decision will be guided by the two goals of preserving the free status +of all derivatives of our free software and of promoting the sharing +and reuse of software generally. + + NO WARRANTY + + 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO +WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. +EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR +OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY +KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR +PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE +LIBRARY IS WITH YOU. 
SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME +THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. + + 16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN +WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY +AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU +FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR +CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE +LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING +RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A +FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF +SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH +DAMAGES. + + END OF TERMS AND CONDITIONS + + How to Apply These Terms to Your New Libraries + + If you develop a new library, and you want it to be of the greatest +possible use to the public, we recommend making it free software that +everyone can redistribute and change. You can do so by permitting +redistribution under these terms (or, alternatively, under the terms of the +ordinary General Public License). + + To apply these terms, attach the following notices to the library. It is +safest to attach them to the start of each source file to most effectively +convey the exclusion of warranty; and each file should have at least the +"copyright" line and a pointer to where the full notice is found. + + liblognorm, a fast samples-based log normalization library + Copyright (C) 2010 Rainer Gerhards + + This library is free software; you can redistribute it and/or + modify it under the terms of the GNU Lesser General Public + License as published by the Free Software Foundation; either + version 2.1 of the License, or (at your option) any later version. + + This library is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + Lesser General Public License for more details. + + You should have received a copy of the GNU Lesser General Public + License along with this library; if not, write to the Free Software + Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + diff --git a/COPYING.ASL20 b/COPYING.ASL20 new file mode 100644 index 0000000..9d78c8b --- /dev/null +++ b/COPYING.ASL20 @@ -0,0 +1,50 @@ +Apache License + +Version 2.0, January 2004 + +http://www.apache.org/licenses/ + +TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + +1. Definitions. + +"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. + +"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. + +"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. + +"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. 
+ +"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. + +"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. + +"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). + +"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. + +"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." + +"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. + +2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. + +3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. + +4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: + + 1. You must give any other recipients of the Work or Derivative Works a copy of this License; and + 2. You must cause any modified files to carry prominent notices stating that You changed the files; and + 3. You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and + 4. If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. + +5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. + +6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. + +7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. + +8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
+
+9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
diff --git a/ChangeLog b/ChangeLog
new file mode 100644
index 0000000..7b077eb
--- /dev/null
+++ b/ChangeLog
@@ -0,0 +1,241 @@
+----------------------------------------------------------------------
+Version 2.0.5, 2018-04-26
+- bugfix: es_str2cstr leak in string-to v1 parser
+  Thanks to Harshvardhan Shrivastava for the patch.
+- make "make check" "succeed" on Solaris 10
+  actually, we just ignore the CI failures so that OpenCSW can build
+  new packages. The problems actually exist on that platform, but
+  testing has shown they always existed. We currently lack the time
+  to really fix this, plus we never had any bug report on Solaris
+  (I assume nobody uses it on Solaris 10). However, that issue is a
+  blocker to making new rsyslog versions available on OpenCSW for
+  Solaris 10, so we go the dirty way of pretending there is no
+  problem. Note: the issue was originally not seen, as the failing
+  tests were added later on. So the problem was always there,
+  just not visible.
+- some mostly cosmetic fixes detected by Coverity Scan
+  e.g. a memory leak if and only if the system was completely out of memory
+----------------------------------------------------------------------
+Version 2.0.4, 2017-10-04
+- added support for native JSON number formats
+  supported by parsers: number, float, hex
+- added support for creating unix timestamps
+  supported by parsers: date-rfc3164, date-rfc5424
+- fixed build problems on Solaris
+  ... but there still seem to be some code issues, manifested in
+  testbench failures. So use with care!
+----------------------------------------------------------------------
+Version 2.0.3, 2017-03-22
+- add ability to load a rulebase from a string
+  introduces a new API:
+  int ln_loadSamplesFromString(ln_ctx ctx, const char *string);
+  closes https://github.com/rsyslog/liblognorm/issues/239
+- bugfix: string parser did not correctly parse a word at the end of a line
+- bugfix: literal parser does not always store the value if a name is specified
+  if
+  rule=:%{"type":"literal", "text":"a", "name":"var"}%
+  is used and a matching message is provided, the variable var is not persisted.
+  see also http://lists.adiscon.net/pipermail/rsyslog/2016-December/043985.html
+----------------------------------------------------------------------
+Version 2.0.2, 2016-11-15
+- bugfix: no error was emitted on invalid "annotate" line
+- "annotate": permit inline comments
+- fix a problem with cross-compilation
+  see also: https://github.com/rsyslog/liblognorm/pull/221
+  Thanks to Luca Boccassi for the patch
+- testbench: add test for "annotate" functionality
+- bugfix: abort in literal path compaction when using the "alternative" parser
+  When using the "alternative" parser, literal nodes could be created with
+  a reference count greater than one. This is valid. However, literal path
+  compaction did not consider this case, and so "merged" these nodes, which
+  led to pdag corruption and quickly to a segfault.
+  closes https://github.com/rsyslog/liblognorm/issues/220
+  closes https://github.com/rsyslog/liblognorm/issues/153
+- bugfix: lognormalizer could loop due to an incorrect data type
+  This also caused the testbench to fail on some platforms.
+  Thanks to Michael Biebl for this fix.
+- fix misleading compiler warning
+  Thanks to Michael Biebl for this fix.
+- testbench: add test for "annotate" functionality
+----------------------------------------------------------------------
+Version 2.0.1, 2016-08-01
+- fix public headers, which invalidly contained a strndup() definition
+  Thanks to Michael Biebl for this fix.
+- fix some issues in pkgconfig file
+  Thanks to Michael Biebl for this fix.
+- enhance build system to natively support systems with older
+  autoconf versions and/or missing autoconf-archive. In this case we
+  gracefully degrade functionality, but the build is still possible.
+  Among others, this enables builds on CentOS 5.
+----------------------------------------------------------------------
+Version 2.0.0, 2016-07-21
+- completely rewritten, much feature-enhanced version
+- requires libfastjson instead of json-c
+- big improvements to testbench runs, especially on travis
+  among others, the static analyzer is now run and the testbench throws
+  an error if the static analyzer (via clang) is not clean
+- lognormalizer tool can now handle lines larger than 10k characters
+  Thanks to Janmejay Singh for the patch
+----------------------------------------------------------------------
+Version 1.1.3, 2015-??-?? [no official release]
+- make work on Solaris
+- check for runaway rules.
+  A runaway rule is one that has unmatched percent signs and thus
+  is not terminated properly at its end. This also means we no longer
+  accept "rule=" at the first column of a continuation line, which is
+  no problem (see doc for more information).
+- fix: process last line if it misses the terminating LF
+  This problem occurs with the very last line of a rulebase (at EOF).
+  If it is not properly terminated (LF missing), it is silently ignored.
+  Previous versions obviously did process lines in this case. While
+  technically this is invalid input, we can't rule out that such rulebases
+  exist. For example, they do in the rsyslog testbench, which made
+  us aware of the problem (see https://github.com/rsyslog/rsyslog/issues/489).
+  I think the proper way of addressing this is to process such lines without
+  termination, as many other tools do as well.
+  closes https://github.com/rsyslog/liblognorm/issues/135
+----------------------------------------------------------------------
+Version 1.1.2, 2015-07-20
+- permit newline inside parser definition
+- new parser "cisco-interface-spec"
+- new parser "json" to process json parts of the message
+- new parser "mac48" to process mac layer addresses
+- new parser "name-value-list" (currently unofficial, experimental)
+- some parsers did incorrectly report success when an error occurred
+  This was caused by inconsistencies between various macros. We have
+  changed the parser-generation macros to match the semantics of the
+  broader CHKN/CHKR macros and also restructured/simplified the
+  parser generation macros.
+  closes https://github.com/rsyslog/liblognorm/issues/41
+- call "rest" parser only if nothing else matches.
+  Versions prior to 1.1.2 did execute "rest" during regular parser
+  processing, and thus parser matches have been more or less random.
+  With 1.1.2 this is now always the last parser called. This may cause
+  problems with existing rulesets, HOWEVER, adding any other rule or
+  changing the load order would also have caused problems, so there
+  really is no compatibility to preserve.
+  see also:
+  http://blog.gerhards.net/2015/04/liblognorms-rest-parser-now-more-useful.html
+- new API to support error callbacks
+  This permits callers to forward messages in regard to e.g. wrong rule
+  bases to their users, which is very useful and actually missing in the
+  previous code base. So far, we only have a few error messages.
+  However, we will review the code and add more. The important part is
+  that callers can begin to use the new API and thus will benefit when
+  we add more error messages.
+- testbench is now enabled by default
+- bugfix: misaddressing on some constant values
+  see also https://github.com/rsyslog/liblognorm/pull/67
+  Thanks to github user ontholerian for the patch
+- bugfix: add missing function prototypes
+  This could potentially lead to problems on some platforms,
+  especially those with 64 bit pointers.
+----------------------------------------------------------------------
+Version 1.1.1, 2015-03-09
+- fixed library version numbering
+  Thanks to Tomas Heinreich for reporting the problem.
+- added new parser syntaxes
+  Thanks to Janmejay Singh for implementing most of them.
+- bugfix: function ln_parseFieldDescr() returns a wrong state value
+  due to an uninitialized variable. This can also lead to invalidly
+  returning no sample node where one would have to be created.
+----------------------------------------------------------------------
+Version 1.1.0, 2015-01-08
+- added regular expression support
+  use this feature with great care, as it thrashes performance
+  Thanks to Janmejay Singh for implementing this feature.
+- fix build problem when --enable-debug was set
+  closes: https://github.com/rsyslog/liblognorm/issues/5
+----------------------------------------------------------------------
+Version 1.0.1, 2014-04-11
+- improved doc (via RST/Sphinx)
+- bugfix: unparsed fields were copied incorrectly from a non-terminated
+  string. Thanks to Josh Blum for the fix.
+- bugfix: mandatory tag did not work in lognormalizer
+----------------------------------------------------------------------
+Version 1.0.0, 2013-11-28
+- WARNING: this version has an incompatible interface and older programs
+  will not compile with it.
+  For details see http://www.liblognorm.com/news/on-liblognorm-1-0-0/
+- libestr is not used any more in interface functions. Traditional
+  C strings are used instead.
Internally, libestr is still used, but
+  scheduled for removal.
+- libee is not used any more. JSON-C is used for object handling
+  instead. Parsers and formatters are now part of liblognorm.
+- added new field type "rest", which simply sinks everything up to the
+  end of the string.
+- added support for gluing two fields together, without a literal
+  between them. It allows for constructs like:
+  %volume:number%%unit:word%
+  which matches the string "1000Kbps"
+- Fix incorrect merging of trees with empty literal at end
+  Thanks to Pavel Levshin for the patch
+- this version has survived many bugfixes
+----------------------------------------------------------------------
+================================================================================
+The versions below are liblognorm0, which has a different API
+================================================================================
+----------------------------------------------------------------------
+Version 0.3.7, 2013-07-17
+- added support to load single samples
+  Thanks to John Hopper for the patch
+----------------------------------------------------------------------
+Version 0.3.6, 2013-03-22
+- bugfix: uninitialized variable could lead to a rulebase load error
+----------------------------------------------------------------------
+Version 0.3.5 (rgerhards), 2012-09-18
+- renamed "normalizer" tool to "lognormalizer" to solve name clashes
+  Thanks to the Fedora folks for pointing this out.
+----------------------------------------------------------------------
+Version 0.3.4 (rgerhards), 2012-04-16
+- bugfix: normalizer tool had a memory leak
+  Thanks to Brian Know for alerting me and helping to debug
+----------------------------------------------------------------------
+Version 0.3.3 (rgerhards), 2012-02-06
+- required header file was not installed, resulting in a compile error
+  closes: http://bugzilla.adiscon.com/show_bug.cgi?id=307
+  Thanks to Andreis Vinogradovs for alerting us to this bug.
+----------------------------------------------------------------------
+Version 0.3.2 (rgerhards), 2011-11-21
+- added rfc5424 parser (requires libee >= 0.3.2)
+- added "-" to serve as a name for filler fields.
Value is extracted,
+  but no field is written
+- special handling for iptables log via %iptables% parser added
+  (currently experimental pending practical verification)
+- normalizer tool on its way to a full-blown stand-alone tool
+- support for annotations added, for the time being see
+  http://blog.gerhards.net/2011/11/log-annotation-with-liblognorm.html
+----------------------------------------------------------------------
+Version 0.3.1 (rgerhards), 2011-04-18
+- added -t option to normalizer so that only messages with a
+  specified tag will be output
+- bugfix: abort if a tag was assigned to a message without any
+  fields parsed out (uncommon scenario)
+- bugfix: mem leak on parse tree destruct -- associated tags were
+  not deleted
+- bugfix: potential abort in normalizer due to misaddressing in debug
+  message generation
+----------------------------------------------------------------------
+Version 0.3.0 (rgerhards), 2011-04-06
+- support for message classification via tags added
+- bugfix: partial messages were invalidly matched
+  closes: http://bugzilla.adiscon.com/show_bug.cgi?id=247
+----------------------------------------------------------------------
+Version 0.2.0 (rgerhards), 2011-04-01
+- added -E option to normalizer tool, permits specifying data for
+  encoders
+- support for new libee parsers:
+  * Time12hr
+  * Time24hr
+  * ISODate
+  * QuotedString
+- support for csv encoding added
+- added -p option to normalizer tool (output only correctly parsed
+  entries)
+- bugfix: segfault if a parse tree prefix had exactly the buffer size,
+  in which case it was invalidly assumed that an external buffer had
+  been allocated
+----------------------------------------------------------------------
+Version 0.1.0 (rgerhards), 2010-12-09
+Initial public release.
diff --git a/INSTALL b/INSTALL
new file mode 100644
index 0000000..2099840
--- /dev/null
+++ b/INSTALL
@@ -0,0 +1,370 @@
+Installation Instructions
+*************************
+
+Copyright (C) 1994-1996, 1999-2002, 2004-2013 Free Software Foundation,
+Inc.
+
+   Copying and distribution of this file, with or without modification,
+are permitted in any medium without royalty provided the copyright
+notice and this notice are preserved. This file is offered as-is,
+without warranty of any kind.
+
+Basic Installation
+==================
+
+   Briefly, the shell command `./configure && make && make install'
+should configure, build, and install this package. The following
+more-detailed instructions are generic; see the `README' file for
+instructions specific to this package. Some packages provide this
+`INSTALL' file but do not implement all of the features documented
+below. The lack of an optional feature in a given package is not
+necessarily a bug. More recommendations for GNU packages can be found
+in *note Makefile Conventions: (standards)Makefile Conventions.
+
+   The `configure' shell script attempts to guess correct values for
+various system-dependent variables used during compilation. It uses
+those values to create a `Makefile' in each directory of the package.
+It may also create one or more `.h' files containing system-dependent
+definitions. Finally, it creates a shell script `config.status' that
+you can run in the future to recreate the current configuration, and a
+file `config.log' containing compiler output (useful mainly for
+debugging `configure').
+ + It can also use an optional file (typically called `config.cache' +and enabled with `--cache-file=config.cache' or simply `-C') that saves +the results of its tests to speed up reconfiguring. Caching is +disabled by default to prevent problems with accidental use of stale +cache files. + + If you need to do unusual things to compile the package, please try +to figure out how `configure' could check whether to do them, and mail +diffs or instructions to the address given in the `README' so they can +be considered for the next release. If you are using the cache, and at +some point `config.cache' contains results you don't want to keep, you +may remove or edit it. + + The file `configure.ac' (or `configure.in') is used to create +`configure' by a program called `autoconf'. You need `configure.ac' if +you want to change it or regenerate `configure' using a newer version +of `autoconf'. + + The simplest way to compile this package is: + + 1. `cd' to the directory containing the package's source code and type + `./configure' to configure the package for your system. + + Running `configure' might take a while. While running, it prints + some messages telling which features it is checking for. + + 2. Type `make' to compile the package. + + 3. Optionally, type `make check' to run any self-tests that come with + the package, generally using the just-built uninstalled binaries. + + 4. Type `make install' to install the programs and any data files and + documentation. When installing into a prefix owned by root, it is + recommended that the package be configured and built as a regular + user, and only the `make install' phase executed with root + privileges. + + 5. Optionally, type `make installcheck' to repeat any self-tests, but + this time using the binaries in their final installed location. + This target does not install anything. Running this target as a + regular user, particularly if the prior `make install' required + root privileges, verifies that the installation completed + correctly. + + 6. You can remove the program binaries and object files from the + source code directory by typing `make clean'. To also remove the + files that `configure' created (so you can compile the package for + a different kind of computer), type `make distclean'. There is + also a `make maintainer-clean' target, but that is intended mainly + for the package's developers. If you use it, you may have to get + all sorts of other programs in order to regenerate files that came + with the distribution. + + 7. Often, you can also type `make uninstall' to remove the installed + files again. In practice, not all packages have tested that + uninstallation works correctly, even though it is required by the + GNU Coding Standards. + + 8. Some packages, particularly those that use Automake, provide `make + distcheck', which can by used by developers to test that all other + targets like `make install' and `make uninstall' work correctly. + This target is generally not run by end users. + +Compilers and Options +===================== + + Some systems require unusual options for compilation or linking that +the `configure' script does not know about. Run `./configure --help' +for details on some of the pertinent environment variables. + + You can give `configure' initial values for configuration parameters +by setting variables in the command line or in the environment. Here +is an example: + + ./configure CC=c99 CFLAGS=-g LIBS=-lposix + + *Note Defining Variables::, for more details. 
+ +Compiling For Multiple Architectures +==================================== + + You can compile the package for more than one kind of computer at the +same time, by placing the object files for each architecture in their +own directory. To do this, you can use GNU `make'. `cd' to the +directory where you want the object files and executables to go and run +the `configure' script. `configure' automatically checks for the +source code in the directory that `configure' is in and in `..'. This +is known as a "VPATH" build. + + With a non-GNU `make', it is safer to compile the package for one +architecture at a time in the source code directory. After you have +installed the package for one architecture, use `make distclean' before +reconfiguring for another architecture. + + On MacOS X 10.5 and later systems, you can create libraries and +executables that work on multiple system types--known as "fat" or +"universal" binaries--by specifying multiple `-arch' options to the +compiler but only a single `-arch' option to the preprocessor. Like +this: + + ./configure CC="gcc -arch i386 -arch x86_64 -arch ppc -arch ppc64" \ + CXX="g++ -arch i386 -arch x86_64 -arch ppc -arch ppc64" \ + CPP="gcc -E" CXXCPP="g++ -E" + + This is not guaranteed to produce working output in all cases, you +may have to build one architecture at a time and combine the results +using the `lipo' tool if you have problems. + +Installation Names +================== + + By default, `make install' installs the package's commands under +`/usr/local/bin', include files under `/usr/local/include', etc. You +can specify an installation prefix other than `/usr/local' by giving +`configure' the option `--prefix=PREFIX', where PREFIX must be an +absolute file name. + + You can specify separate installation prefixes for +architecture-specific files and architecture-independent files. If you +pass the option `--exec-prefix=PREFIX' to `configure', the package uses +PREFIX as the prefix for installing programs and libraries. +Documentation and other data files still use the regular prefix. + + In addition, if you use an unusual directory layout you can give +options like `--bindir=DIR' to specify different values for particular +kinds of files. Run `configure --help' for a list of the directories +you can set and what kinds of files go in them. In general, the +default for these options is expressed in terms of `${prefix}', so that +specifying just `--prefix' will affect all of the other directory +specifications that were not explicitly provided. + + The most portable way to affect installation locations is to pass the +correct locations to `configure'; however, many packages provide one or +both of the following shortcuts of passing variable assignments to the +`make install' command line to change installation locations without +having to reconfigure or recompile. + + The first method involves providing an override variable for each +affected directory. For example, `make install +prefix=/alternate/directory' will choose an alternate location for all +directory configuration variables that were expressed in terms of +`${prefix}'. Any directories that were specified during `configure', +but not in terms of `${prefix}', must each be overridden at install +time for the entire installation to be relocated. The approach of +makefile variable overrides for each directory variable is required by +the GNU Coding Standards, and ideally causes no recompilation. 
+However, some platforms have known limitations with the semantics of +shared libraries that end up requiring recompilation when using this +method, particularly noticeable in packages that use GNU Libtool. + + The second method involves providing the `DESTDIR' variable. For +example, `make install DESTDIR=/alternate/directory' will prepend +`/alternate/directory' before all installation names. The approach of +`DESTDIR' overrides is not required by the GNU Coding Standards, and +does not work on platforms that have drive letters. On the other hand, +it does better at avoiding recompilation issues, and works well even +when some directory options were not specified in terms of `${prefix}' +at `configure' time. + +Optional Features +================= + + If the package supports it, you can cause programs to be installed +with an extra prefix or suffix on their names by giving `configure' the +option `--program-prefix=PREFIX' or `--program-suffix=SUFFIX'. + + Some packages pay attention to `--enable-FEATURE' options to +`configure', where FEATURE indicates an optional part of the package. +They may also pay attention to `--with-PACKAGE' options, where PACKAGE +is something like `gnu-as' or `x' (for the X Window System). The +`README' should mention any `--enable-' and `--with-' options that the +package recognizes. + + For packages that use the X Window System, `configure' can usually +find the X include and library files automatically, but if it doesn't, +you can use the `configure' options `--x-includes=DIR' and +`--x-libraries=DIR' to specify their locations. + + Some packages offer the ability to configure how verbose the +execution of `make' will be. For these packages, running `./configure +--enable-silent-rules' sets the default to minimal output, which can be +overridden with `make V=1'; while running `./configure +--disable-silent-rules' sets the default to verbose, which can be +overridden with `make V=0'. + +Particular systems +================== + + On HP-UX, the default C compiler is not ANSI C compatible. If GNU +CC is not installed, it is recommended to use the following options in +order to use an ANSI C compiler: + + ./configure CC="cc -Ae -D_XOPEN_SOURCE=500" + +and if that doesn't work, install pre-built binaries of GCC for HP-UX. + + HP-UX `make' updates targets which have the same time stamps as +their prerequisites, which makes it generally unusable when shipped +generated files such as `configure' are involved. Use GNU `make' +instead. + + On OSF/1 a.k.a. Tru64, some versions of the default C compiler cannot +parse its `<wchar.h>' header file. The option `-nodtk' can be used as +a workaround. If GNU CC is not installed, it is therefore recommended +to try + + ./configure CC="cc" + +and if that doesn't work, try + + ./configure CC="cc -nodtk" + + On Solaris, don't put `/usr/ucb' early in your `PATH'. This +directory contains several dysfunctional programs; working variants of +these programs are available in `/usr/bin'. So, if you need `/usr/ucb' +in your `PATH', put it _after_ `/usr/bin'. + + On Haiku, software installed for all users goes in `/boot/common', +not `/usr/local'. It is recommended to use the following options: + + ./configure --prefix=/boot/common + +Specifying the System Type +========================== + + There may be some features `configure' cannot figure out +automatically, but needs to determine by the type of machine the package +will run on.
Usually, assuming the package is built to be run on the +_same_ architectures, `configure' can figure that out, but if it prints +a message saying it cannot guess the machine type, give it the +`--build=TYPE' option. TYPE can either be a short name for the system +type, such as `sun4', or a canonical name which has the form: + + CPU-COMPANY-SYSTEM + +where SYSTEM can have one of these forms: + + OS + KERNEL-OS + + See the file `config.sub' for the possible values of each field. If +`config.sub' isn't included in this package, then this package doesn't +need to know the machine type. + + If you are _building_ compiler tools for cross-compiling, you should +use the option `--target=TYPE' to select the type of system they will +produce code for. + + If you want to _use_ a cross compiler, that generates code for a +platform different from the build platform, you should specify the +"host" platform (i.e., that on which the generated programs will +eventually be run) with `--host=TYPE'. + +Sharing Defaults +================ + + If you want to set default values for `configure' scripts to share, +you can create a site shell script called `config.site' that gives +default values for variables like `CC', `cache_file', and `prefix'. +`configure' looks for `PREFIX/share/config.site' if it exists, then +`PREFIX/etc/config.site' if it exists. Or, you can set the +`CONFIG_SITE' environment variable to the location of the site script. +A warning: not all `configure' scripts look for a site script. + +Defining Variables +================== + + Variables not defined in a site shell script can be set in the +environment passed to `configure'. However, some packages may run +configure again during the build, and the customized values of these +variables may be lost. In order to avoid this problem, you should set +them in the `configure' command line, using `VAR=value'. For example: + + ./configure CC=/usr/local2/bin/gcc + +causes the specified `gcc' to be used as the C compiler (unless it is +overridden in the site shell script). + +Unfortunately, this technique does not work for `CONFIG_SHELL' due to +an Autoconf limitation. Until the limitation is lifted, you can use +this workaround: + + CONFIG_SHELL=/bin/bash ./configure CONFIG_SHELL=/bin/bash + +`configure' Invocation +====================== + + `configure' recognizes the following options to control how it +operates. + +`--help' +`-h' + Print a summary of all of the options to `configure', and exit. + +`--help=short' +`--help=recursive' + Print a summary of the options unique to this package's + `configure', and exit. The `short' variant lists options used + only in the top level, while the `recursive' variant lists options + also present in any nested packages. + +`--version' +`-V' + Print the version of Autoconf used to generate the `configure' + script, and exit. + +`--cache-file=FILE' + Enable the cache: use and save the results of the tests in FILE, + traditionally `config.cache'. FILE defaults to `/dev/null' to + disable caching. + +`--config-cache' +`-C' + Alias for `--cache-file=config.cache'. + +`--quiet' +`--silent' +`-q' + Do not print messages saying which checks are being made. To + suppress all normal output, redirect it to `/dev/null' (any error + messages will still be shown). + +`--srcdir=DIR' + Look for the package's source code in directory DIR. Usually + `configure' can determine that directory automatically. + +`--prefix=DIR' + Use DIR as the installation prefix. 
*note Installation Names:: + for more details, including other options available for fine-tuning + the installation locations. + +`--no-create' +`-n' + Run the configure checks, but stop before creating any output + files. + +`configure' also accepts some other, not widely useful, options. Run +`configure --help' for more details. diff --git a/Makefile.am b/Makefile.am new file mode 100644 index 0000000..3711be1 --- /dev/null +++ b/Makefile.am @@ -0,0 +1,15 @@ +SUBDIRS = compat src tools +if ENABLE_DOCS + SUBDIRS += doc +endif + +EXTRA_DIST = rulebases \ + COPYING.ASL20 +pkgconfigdir = $(libdir)/pkgconfig +pkgconfig_DATA = lognorm.pc + +ACLOCAL_AMFLAGS = -I m4 + +if ENABLE_TESTBENCH + SUBDIRS += tests +endif diff --git a/Makefile.in b/Makefile.in new file mode 100644 index 0000000..800ef72 --- /dev/null +++ b/Makefile.in @@ -0,0 +1,893 @@ +# Makefile.in generated by automake 1.15 from Makefile.am. +# @configure_input@ + +# Copyright (C) 1994-2014 Free Software Foundation, Inc. + +# This Makefile.in is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY, to the extent permitted by law; without +# even the implied warranty of MERCHANTABILITY or FITNESS FOR A +# PARTICULAR PURPOSE. + +@SET_MAKE@ + +VPATH = @srcdir@ +am__is_gnu_make = { \ + if test -z '$(MAKELEVEL)'; then \ + false; \ + elif test -n '$(MAKE_HOST)'; then \ + true; \ + elif test -n '$(MAKE_VERSION)' && test -n '$(CURDIR)'; then \ + true; \ + else \ + false; \ + fi; \ +} +am__make_running_with_option = \ + case $${target_option-} in \ + ?) ;; \ + *) echo "am__make_running_with_option: internal error: invalid" \ + "target option '$${target_option-}' specified" >&2; \ + exit 1;; \ + esac; \ + has_opt=no; \ + sane_makeflags=$$MAKEFLAGS; \ + if $(am__is_gnu_make); then \ + sane_makeflags=$$MFLAGS; \ + else \ + case $$MAKEFLAGS in \ + *\\[\ \ ]*) \ + bs=\\; \ + sane_makeflags=`printf '%s\n' "$$MAKEFLAGS" \ + | sed "s/$$bs$$bs[$$bs $$bs ]*//g"`;; \ + esac; \ + fi; \ + skip_next=no; \ + strip_trailopt () \ + { \ + flg=`printf '%s\n' "$$flg" | sed "s/$$1.*$$//"`; \ + }; \ + for flg in $$sane_makeflags; do \ + test $$skip_next = yes && { skip_next=no; continue; }; \ + case $$flg in \ + *=*|--*) continue;; \ + -*I) strip_trailopt 'I'; skip_next=yes;; \ + -*I?*) strip_trailopt 'I';; \ + -*O) strip_trailopt 'O'; skip_next=yes;; \ + -*O?*) strip_trailopt 'O';; \ + -*l) strip_trailopt 'l'; skip_next=yes;; \ + -*l?*) strip_trailopt 'l';; \ + -[dEDm]) skip_next=yes;; \ + -[JT]) skip_next=yes;; \ + esac; \ + case $$flg in \ + *$$target_option*) has_opt=yes; break;; \ + esac; \ + done; \ + test $$has_opt = yes +am__make_dryrun = (target_option=n; $(am__make_running_with_option)) +am__make_keepgoing = (target_option=k; $(am__make_running_with_option)) +pkgdatadir = $(datadir)/@PACKAGE@ +pkgincludedir = $(includedir)/@PACKAGE@ +pkglibdir = $(libdir)/@PACKAGE@ +pkglibexecdir = $(libexecdir)/@PACKAGE@ +am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd +install_sh_DATA = $(install_sh) -c -m 644 +install_sh_PROGRAM = $(install_sh) -c +install_sh_SCRIPT = $(install_sh) -c +INSTALL_HEADER = $(INSTALL_DATA) +transform = $(program_transform_name) +NORMAL_INSTALL = : +PRE_INSTALL = : +POST_INSTALL = : +NORMAL_UNINSTALL = : +PRE_UNINSTALL = : +POST_UNINSTALL = : +build_triplet = @build@ +host_triplet = @host@ 
+@ENABLE_DOCS_TRUE@am__append_1 = doc +@ENABLE_TESTBENCH_TRUE@am__append_2 = tests +subdir = . +ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 +am__aclocal_m4_deps = $(top_srcdir)/m4/libtool.m4 \ + $(top_srcdir)/m4/ltoptions.m4 $(top_srcdir)/m4/ltsugar.m4 \ + $(top_srcdir)/m4/ltversion.m4 $(top_srcdir)/m4/lt~obsolete.m4 \ + $(top_srcdir)/m4/pkg.m4 $(top_srcdir)/configure.ac +am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \ + $(ACLOCAL_M4) +DIST_COMMON = $(srcdir)/Makefile.am $(top_srcdir)/configure \ + $(am__configure_deps) $(am__DIST_COMMON) +am__CONFIG_DISTCLEAN_FILES = config.status config.cache config.log \ + configure.lineno config.status.lineno +mkinstalldirs = $(install_sh) -d +CONFIG_HEADER = config.h +CONFIG_CLEAN_FILES = lognorm.pc +CONFIG_CLEAN_VPATH_FILES = +AM_V_P = $(am__v_P_@AM_V@) +am__v_P_ = $(am__v_P_@AM_DEFAULT_V@) +am__v_P_0 = false +am__v_P_1 = : +AM_V_GEN = $(am__v_GEN_@AM_V@) +am__v_GEN_ = $(am__v_GEN_@AM_DEFAULT_V@) +am__v_GEN_0 = @echo " GEN " $@; +am__v_GEN_1 = +AM_V_at = $(am__v_at_@AM_V@) +am__v_at_ = $(am__v_at_@AM_DEFAULT_V@) +am__v_at_0 = @ +am__v_at_1 = +SOURCES = +DIST_SOURCES = +RECURSIVE_TARGETS = all-recursive check-recursive cscopelist-recursive \ + ctags-recursive dvi-recursive html-recursive info-recursive \ + install-data-recursive install-dvi-recursive \ + install-exec-recursive install-html-recursive \ + install-info-recursive install-pdf-recursive \ + install-ps-recursive install-recursive installcheck-recursive \ + installdirs-recursive pdf-recursive ps-recursive \ + tags-recursive uninstall-recursive +am__can_run_installinfo = \ + case $$AM_UPDATE_INFO_DIR in \ + n|no|NO) false;; \ + *) (install-info --version) >/dev/null 2>&1;; \ + esac +am__vpath_adj_setup = srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`; +am__vpath_adj = case $$p in \ + $(srcdir)/*) f=`echo "$$p" | sed "s|^$$srcdirstrip/||"`;; \ + *) f=$$p;; \ + esac; +am__strip_dir = f=`echo $$p | sed -e 's|^.*/||'`; +am__install_max = 40 +am__nobase_strip_setup = \ + srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*|]/\\\\&/g'` +am__nobase_strip = \ + for p in $$list; do echo "$$p"; done | sed -e "s|$$srcdirstrip/||" +am__nobase_list = $(am__nobase_strip_setup); \ + for p in $$list; do echo "$$p $$p"; done | \ + sed "s| $$srcdirstrip/| |;"' / .*\//!s/ .*/ ./; s,\( .*\)/[^/]*$$,\1,' | \ + $(AWK) 'BEGIN { files["."] = "" } { files[$$2] = files[$$2] " " $$1; \ + if (++n[$$2] == $(am__install_max)) \ + { print $$2, files[$$2]; n[$$2] = 0; files[$$2] = "" } } \ + END { for (dir in files) print dir, files[dir] }' +am__base_list = \ + sed '$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;s/\n/ /g' | \ + sed '$$!N;$$!N;$$!N;$$!N;s/\n/ /g' +am__uninstall_files_from_dir = { \ + test -z "$$files" \ + || { test ! -d "$$dir" && test ! -f "$$dir" && test ! -r "$$dir"; } \ + || { echo " ( cd '$$dir' && rm -f" $$files ")"; \ + $(am__cd) "$$dir" && rm -f $$files; }; \ + } +am__installdirs = "$(DESTDIR)$(pkgconfigdir)" +DATA = $(pkgconfig_DATA) +RECURSIVE_CLEAN_TARGETS = mostlyclean-recursive clean-recursive \ + distclean-recursive maintainer-clean-recursive +am__recursive_targets = \ + $(RECURSIVE_TARGETS) \ + $(RECURSIVE_CLEAN_TARGETS) \ + $(am__extra_recursive_targets) +AM_RECURSIVE_TARGETS = $(am__recursive_targets:-recursive=) TAGS CTAGS \ + cscope distdir dist dist-all distcheck +am__tagged_files = $(HEADERS) $(SOURCES) $(TAGS_FILES) \ + $(LISP)config.h.in +# Read a list of newline-separated strings from the standard input, +# and print each of them once, without duplicates. 
Input order is +# *not* preserved. +am__uniquify_input = $(AWK) '\ + BEGIN { nonempty = 0; } \ + { items[$$0] = 1; nonempty = 1; } \ + END { if (nonempty) { for (i in items) print i; }; } \ +' +# Make sure the list of sources is unique. This is necessary because, +# e.g., the same source file might be shared among _SOURCES variables +# for different programs/libraries. +am__define_uniq_tagged_files = \ + list='$(am__tagged_files)'; \ + unique=`for i in $$list; do \ + if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \ + done | $(am__uniquify_input)` +ETAGS = etags +CTAGS = ctags +CSCOPE = cscope +DIST_SUBDIRS = compat src tools doc tests +am__DIST_COMMON = $(srcdir)/Makefile.in $(srcdir)/config.h.in \ + $(srcdir)/lognorm.pc.in AUTHORS COPYING ChangeLog INSTALL NEWS \ + README compile config.guess config.sub install-sh ltmain.sh \ + missing +DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST) +distdir = $(PACKAGE)-$(VERSION) +top_distdir = $(distdir) +am__remove_distdir = \ + if test -d "$(distdir)"; then \ + find "$(distdir)" -type d ! -perm -200 -exec chmod u+w {} ';' \ + && rm -rf "$(distdir)" \ + || { sleep 5 && rm -rf "$(distdir)"; }; \ + else :; fi +am__post_remove_distdir = $(am__remove_distdir) +am__relativize = \ + dir0=`pwd`; \ + sed_first='s,^\([^/]*\)/.*$$,\1,'; \ + sed_rest='s,^[^/]*/*,,'; \ + sed_last='s,^.*/\([^/]*\)$$,\1,'; \ + sed_butlast='s,/*[^/]*$$,,'; \ + while test -n "$$dir1"; do \ + first=`echo "$$dir1" | sed -e "$$sed_first"`; \ + if test "$$first" != "."; then \ + if test "$$first" = ".."; then \ + dir2=`echo "$$dir0" | sed -e "$$sed_last"`/"$$dir2"; \ + dir0=`echo "$$dir0" | sed -e "$$sed_butlast"`; \ + else \ + first2=`echo "$$dir2" | sed -e "$$sed_first"`; \ + if test "$$first2" = "$$first"; then \ + dir2=`echo "$$dir2" | sed -e "$$sed_rest"`; \ + else \ + dir2="../$$dir2"; \ + fi; \ + dir0="$$dir0"/"$$first"; \ + fi; \ + fi; \ + dir1=`echo "$$dir1" | sed -e "$$sed_rest"`; \ + done; \ + reldir="$$dir2" +DIST_ARCHIVES = $(distdir).tar.gz +GZIP_ENV = --best +DIST_TARGETS = dist-gzip +distuninstallcheck_listfiles = find . -type f -print +am__distuninstallcheck_listfiles = $(distuninstallcheck_listfiles) \ + | sed 's|^\./|$(prefix)/|' | grep -v '$(infodir)/dir$$' +distcleancheck_listfiles = find . 
-type f -print +ACLOCAL = @ACLOCAL@ +AMTAR = @AMTAR@ +AM_DEFAULT_VERBOSITY = @AM_DEFAULT_VERBOSITY@ +AR = @AR@ +AUTOCONF = @AUTOCONF@ +AUTOHEADER = @AUTOHEADER@ +AUTOMAKE = @AUTOMAKE@ +AWK = @AWK@ +CC = @CC@ +CCDEPMODE = @CCDEPMODE@ +CFLAGS = @CFLAGS@ +CPP = @CPP@ +CPPFLAGS = @CPPFLAGS@ +CYGPATH_W = @CYGPATH_W@ +DEFS = @DEFS@ +DEPDIR = @DEPDIR@ +DLLTOOL = @DLLTOOL@ +DSYMUTIL = @DSYMUTIL@ +DUMPBIN = @DUMPBIN@ +ECHO_C = @ECHO_C@ +ECHO_N = @ECHO_N@ +ECHO_T = @ECHO_T@ +EGREP = @EGREP@ +EXEEXT = @EXEEXT@ +FEATURE_REGEXP = @FEATURE_REGEXP@ +FGREP = @FGREP@ +GREP = @GREP@ +INSTALL = @INSTALL@ +INSTALL_DATA = @INSTALL_DATA@ +INSTALL_PROGRAM = @INSTALL_PROGRAM@ +INSTALL_SCRIPT = @INSTALL_SCRIPT@ +INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@ +JSON_C_CFLAGS = @JSON_C_CFLAGS@ +JSON_C_LIBS = @JSON_C_LIBS@ +LD = @LD@ +LDFLAGS = @LDFLAGS@ +LIBESTR_CFLAGS = @LIBESTR_CFLAGS@ +LIBESTR_LIBS = @LIBESTR_LIBS@ +LIBLOGNORM_CFLAGS = @LIBLOGNORM_CFLAGS@ +LIBLOGNORM_LIBS = @LIBLOGNORM_LIBS@ +LIBOBJS = @LIBOBJS@ +LIBS = @LIBS@ +LIBTOOL = @LIBTOOL@ +LIPO = @LIPO@ +LN_S = @LN_S@ +LTLIBOBJS = @LTLIBOBJS@ +LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAKEINFO = @MAKEINFO@ +MANIFEST_TOOL = @MANIFEST_TOOL@ +MKDIR_P = @MKDIR_P@ +NM = @NM@ +NMEDIT = @NMEDIT@ +OBJDUMP = @OBJDUMP@ +OBJEXT = @OBJEXT@ +OTOOL = @OTOOL@ +OTOOL64 = @OTOOL64@ +PACKAGE = @PACKAGE@ +PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@ +PACKAGE_NAME = @PACKAGE_NAME@ +PACKAGE_STRING = @PACKAGE_STRING@ +PACKAGE_TARNAME = @PACKAGE_TARNAME@ +PACKAGE_URL = @PACKAGE_URL@ +PACKAGE_VERSION = @PACKAGE_VERSION@ +PATH_SEPARATOR = @PATH_SEPARATOR@ +PCRE_CFLAGS = @PCRE_CFLAGS@ +PCRE_LIBS = @PCRE_LIBS@ +PKG_CONFIG = @PKG_CONFIG@ +PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@ +PKG_CONFIG_PATH = @PKG_CONFIG_PATH@ +RANLIB = @RANLIB@ +SED = @SED@ +SET_MAKE = @SET_MAKE@ +SHELL = @SHELL@ +SPHINXBUILD = @SPHINXBUILD@ +STRIP = @STRIP@ +VALGRIND = @VALGRIND@ +VERSION = @VERSION@ +WARN_CFLAGS = @WARN_CFLAGS@ +WARN_LDFLAGS = @WARN_LDFLAGS@ +WARN_SCANNERFLAGS = @WARN_SCANNERFLAGS@ +abs_builddir = @abs_builddir@ +abs_srcdir = @abs_srcdir@ +abs_top_builddir = @abs_top_builddir@ +abs_top_srcdir = @abs_top_srcdir@ +ac_ct_AR = @ac_ct_AR@ +ac_ct_CC = @ac_ct_CC@ +ac_ct_DUMPBIN = @ac_ct_DUMPBIN@ +am__include = @am__include@ +am__leading_dot = @am__leading_dot@ +am__quote = @am__quote@ +am__tar = @am__tar@ +am__untar = @am__untar@ +bindir = @bindir@ +build = @build@ +build_alias = @build_alias@ +build_cpu = @build_cpu@ +build_os = @build_os@ +build_vendor = @build_vendor@ +builddir = @builddir@ +datadir = @datadir@ +datarootdir = @datarootdir@ +docdir = @docdir@ +dvidir = @dvidir@ +exec_prefix = @exec_prefix@ +host = @host@ +host_alias = @host_alias@ +host_cpu = @host_cpu@ +host_os = @host_os@ +host_vendor = @host_vendor@ +htmldir = @htmldir@ +includedir = @includedir@ +infodir = @infodir@ +install_sh = @install_sh@ +libdir = @libdir@ +libexecdir = @libexecdir@ +localedir = @localedir@ +localstatedir = @localstatedir@ +mandir = @mandir@ +mkdir_p = @mkdir_p@ +oldincludedir = @oldincludedir@ +pdfdir = @pdfdir@ +pkg_config_libs_private = @pkg_config_libs_private@ +prefix = @prefix@ +program_transform_name = @program_transform_name@ +psdir = @psdir@ +runstatedir = @runstatedir@ +sbindir = @sbindir@ +sharedstatedir = @sharedstatedir@ +srcdir = @srcdir@ +sysconfdir = @sysconfdir@ +target_alias = @target_alias@ +top_build_prefix = @top_build_prefix@ +top_builddir = @top_builddir@ +top_srcdir = @top_srcdir@ +SUBDIRS = compat src tools $(am__append_1) $(am__append_2) +EXTRA_DIST = rulebases \ + COPYING.ASL20 + 
+pkgconfigdir = $(libdir)/pkgconfig +pkgconfig_DATA = lognorm.pc +ACLOCAL_AMFLAGS = -I m4 +all: config.h + $(MAKE) $(AM_MAKEFLAGS) all-recursive + +.SUFFIXES: +am--refresh: Makefile + @: +$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) + @for dep in $?; do \ + case '$(am__configure_deps)' in \ + *$$dep*) \ + echo ' cd $(srcdir) && $(AUTOMAKE) --gnu'; \ + $(am__cd) $(srcdir) && $(AUTOMAKE) --gnu \ + && exit 0; \ + exit 1;; \ + esac; \ + done; \ + echo ' cd $(top_srcdir) && $(AUTOMAKE) --gnu Makefile'; \ + $(am__cd) $(top_srcdir) && \ + $(AUTOMAKE) --gnu Makefile +Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status + @case '$?' in \ + *config.status*) \ + echo ' $(SHELL) ./config.status'; \ + $(SHELL) ./config.status;; \ + *) \ + echo ' cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe)'; \ + cd $(top_builddir) && $(SHELL) ./config.status $@ $(am__depfiles_maybe);; \ + esac; + +$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) + $(SHELL) ./config.status --recheck + +$(top_srcdir)/configure: $(am__configure_deps) + $(am__cd) $(srcdir) && $(AUTOCONF) +$(ACLOCAL_M4): $(am__aclocal_m4_deps) + $(am__cd) $(srcdir) && $(ACLOCAL) $(ACLOCAL_AMFLAGS) +$(am__aclocal_m4_deps): + +config.h: stamp-h1 + @test -f $@ || rm -f stamp-h1 + @test -f $@ || $(MAKE) $(AM_MAKEFLAGS) stamp-h1 + +stamp-h1: $(srcdir)/config.h.in $(top_builddir)/config.status + @rm -f stamp-h1 + cd $(top_builddir) && $(SHELL) ./config.status config.h +$(srcdir)/config.h.in: $(am__configure_deps) + ($(am__cd) $(top_srcdir) && $(AUTOHEADER)) + rm -f stamp-h1 + touch $@ + +distclean-hdr: + -rm -f config.h stamp-h1 +lognorm.pc: $(top_builddir)/config.status $(srcdir)/lognorm.pc.in + cd $(top_builddir) && $(SHELL) ./config.status $@ + +mostlyclean-libtool: + -rm -f *.lo + +clean-libtool: + -rm -rf .libs _libs + +distclean-libtool: + -rm -f libtool config.lt +install-pkgconfigDATA: $(pkgconfig_DATA) + @$(NORMAL_INSTALL) + @list='$(pkgconfig_DATA)'; test -n "$(pkgconfigdir)" || list=; \ + if test -n "$$list"; then \ + echo " $(MKDIR_P) '$(DESTDIR)$(pkgconfigdir)'"; \ + $(MKDIR_P) "$(DESTDIR)$(pkgconfigdir)" || exit 1; \ + fi; \ + for p in $$list; do \ + if test -f "$$p"; then d=; else d="$(srcdir)/"; fi; \ + echo "$$d$$p"; \ + done | $(am__base_list) | \ + while read files; do \ + echo " $(INSTALL_DATA) $$files '$(DESTDIR)$(pkgconfigdir)'"; \ + $(INSTALL_DATA) $$files "$(DESTDIR)$(pkgconfigdir)" || exit $$?; \ + done + +uninstall-pkgconfigDATA: + @$(NORMAL_UNINSTALL) + @list='$(pkgconfig_DATA)'; test -n "$(pkgconfigdir)" || list=; \ + files=`for p in $$list; do echo $$p; done | sed -e 's|^.*/||'`; \ + dir='$(DESTDIR)$(pkgconfigdir)'; $(am__uninstall_files_from_dir) + +# This directory's subdirectories are mostly independent; you can cd +# into them and run 'make' without going through this Makefile. +# To change the values of 'make' variables: instead of editing Makefiles, +# (1) if the variable is set in 'config.status', edit 'config.status' +# (which will cause the Makefiles to be regenerated when you run 'make'); +# (2) otherwise, pass the desired values on the 'make' command line. 
+$(am__recursive_targets): + @fail=; \ + if $(am__make_keepgoing); then \ + failcom='fail=yes'; \ + else \ + failcom='exit 1'; \ + fi; \ + dot_seen=no; \ + target=`echo $@ | sed s/-recursive//`; \ + case "$@" in \ + distclean-* | maintainer-clean-*) list='$(DIST_SUBDIRS)' ;; \ + *) list='$(SUBDIRS)' ;; \ + esac; \ + for subdir in $$list; do \ + echo "Making $$target in $$subdir"; \ + if test "$$subdir" = "."; then \ + dot_seen=yes; \ + local_target="$$target-am"; \ + else \ + local_target="$$target"; \ + fi; \ + ($(am__cd) $$subdir && $(MAKE) $(AM_MAKEFLAGS) $$local_target) \ + || eval $$failcom; \ + done; \ + if test "$$dot_seen" = "no"; then \ + $(MAKE) $(AM_MAKEFLAGS) "$$target-am" || exit 1; \ + fi; test -z "$$fail" + +ID: $(am__tagged_files) + $(am__define_uniq_tagged_files); mkid -fID $$unique +tags: tags-recursive +TAGS: tags + +tags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + set x; \ + here=`pwd`; \ + if ($(ETAGS) --etags-include --version) >/dev/null 2>&1; then \ + include_option=--etags-include; \ + empty_fix=.; \ + else \ + include_option=--include; \ + empty_fix=; \ + fi; \ + list='$(SUBDIRS)'; for subdir in $$list; do \ + if test "$$subdir" = .; then :; else \ + test ! -f $$subdir/TAGS || \ + set "$$@" "$$include_option=$$here/$$subdir/TAGS"; \ + fi; \ + done; \ + $(am__define_uniq_tagged_files); \ + shift; \ + if test -z "$(ETAGS_ARGS)$$*$$unique"; then :; else \ + test -n "$$unique" || unique=$$empty_fix; \ + if test $$# -gt 0; then \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + "$$@" $$unique; \ + else \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + $$unique; \ + fi; \ + fi +ctags: ctags-recursive + +CTAGS: ctags +ctags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + $(am__define_uniq_tagged_files); \ + test -z "$(CTAGS_ARGS)$$unique" \ + || $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \ + $$unique + +GTAGS: + here=`$(am__cd) $(top_builddir) && pwd` \ + && $(am__cd) $(top_srcdir) \ + && gtags -i $(GTAGS_ARGS) "$$here" +cscope: cscope.files + test ! -s cscope.files \ + || $(CSCOPE) -b -q $(AM_CSCOPEFLAGS) $(CSCOPEFLAGS) -i cscope.files $(CSCOPE_ARGS) +clean-cscope: + -rm -f cscope.files +cscope.files: clean-cscope cscopelist +cscopelist: cscopelist-recursive + +cscopelist-am: $(am__tagged_files) + list='$(am__tagged_files)'; \ + case "$(srcdir)" in \ + [\\/]* | ?:[\\/]*) sdir="$(srcdir)" ;; \ + *) sdir=$(subdir)/$(srcdir) ;; \ + esac; \ + for i in $$list; do \ + if test -f "$$i"; then \ + echo "$(subdir)/$$i"; \ + else \ + echo "$$sdir/$$i"; \ + fi; \ + done >> $(top_builddir)/cscope.files + +distclean-tags: + -rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags + -rm -f cscope.out cscope.in.out cscope.po.out cscope.files + +distdir: $(DISTFILES) + $(am__remove_distdir) + test -d "$(distdir)" || mkdir "$(distdir)" + @srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + list='$(DISTFILES)'; \ + dist_files=`for file in $$list; do echo $$file; done | \ + sed -e "s|^$$srcdirstrip/||;t" \ + -e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \ + case $$dist_files in \ + */*) $(MKDIR_P) `echo "$$dist_files" | \ + sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \ + sort -u` ;; \ + esac; \ + for file in $$dist_files; do \ + if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \ + if test -d $$d/$$file; then \ + dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \ + if test -d "$(distdir)/$$file"; then \ + find "$(distdir)/$$file" -type d ! 
-perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \ + cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \ + else \ + test -f "$(distdir)/$$file" \ + || cp -p $$d/$$file "$(distdir)/$$file" \ + || exit 1; \ + fi; \ + done + @list='$(DIST_SUBDIRS)'; for subdir in $$list; do \ + if test "$$subdir" = .; then :; else \ + $(am__make_dryrun) \ + || test -d "$(distdir)/$$subdir" \ + || $(MKDIR_P) "$(distdir)/$$subdir" \ + || exit 1; \ + dir1=$$subdir; dir2="$(distdir)/$$subdir"; \ + $(am__relativize); \ + new_distdir=$$reldir; \ + dir1=$$subdir; dir2="$(top_distdir)"; \ + $(am__relativize); \ + new_top_distdir=$$reldir; \ + echo " (cd $$subdir && $(MAKE) $(AM_MAKEFLAGS) top_distdir="$$new_top_distdir" distdir="$$new_distdir" \\"; \ + echo " am__remove_distdir=: am__skip_length_check=: am__skip_mode_fix=: distdir)"; \ + ($(am__cd) $$subdir && \ + $(MAKE) $(AM_MAKEFLAGS) \ + top_distdir="$$new_top_distdir" \ + distdir="$$new_distdir" \ + am__remove_distdir=: \ + am__skip_length_check=: \ + am__skip_mode_fix=: \ + distdir) \ + || exit 1; \ + fi; \ + done + -test -n "$(am__skip_mode_fix)" \ + || find "$(distdir)" -type d ! -perm -755 \ + -exec chmod u+rwx,go+rx {} \; -o \ + ! -type d ! -perm -444 -links 1 -exec chmod a+r {} \; -o \ + ! -type d ! -perm -400 -exec chmod a+r {} \; -o \ + ! -type d ! -perm -444 -exec $(install_sh) -c -m a+r {} {} \; \ + || chmod -R a+r "$(distdir)" +dist-gzip: distdir + tardir=$(distdir) && $(am__tar) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).tar.gz + $(am__post_remove_distdir) + +dist-bzip2: distdir + tardir=$(distdir) && $(am__tar) | BZIP2=$${BZIP2--9} bzip2 -c >$(distdir).tar.bz2 + $(am__post_remove_distdir) + +dist-lzip: distdir + tardir=$(distdir) && $(am__tar) | lzip -c $${LZIP_OPT--9} >$(distdir).tar.lz + $(am__post_remove_distdir) + +dist-xz: distdir + tardir=$(distdir) && $(am__tar) | XZ_OPT=$${XZ_OPT--e} xz -c >$(distdir).tar.xz + $(am__post_remove_distdir) + +dist-tarZ: distdir + @echo WARNING: "Support for distribution archives compressed with" \ + "legacy program 'compress' is deprecated." >&2 + @echo WARNING: "It will be removed altogether in Automake 2.0" >&2 + tardir=$(distdir) && $(am__tar) | compress -c >$(distdir).tar.Z + $(am__post_remove_distdir) + +dist-shar: distdir + @echo WARNING: "Support for shar distribution archives is" \ + "deprecated." >&2 + @echo WARNING: "It will be removed altogether in Automake 2.0" >&2 + shar $(distdir) | GZIP=$(GZIP_ENV) gzip -c >$(distdir).shar.gz + $(am__post_remove_distdir) + +dist-zip: distdir + -rm -f $(distdir).zip + zip -rq $(distdir).zip $(distdir) + $(am__post_remove_distdir) + +dist dist-all: + $(MAKE) $(AM_MAKEFLAGS) $(DIST_TARGETS) am__post_remove_distdir='@:' + $(am__post_remove_distdir) + +# This target untars the dist file and tries a VPATH configuration. Then +# it guarantees that the distribution is self-contained by making another +# tarfile. 
+distcheck: dist + case '$(DIST_ARCHIVES)' in \ + *.tar.gz*) \ + GZIP=$(GZIP_ENV) gzip -dc $(distdir).tar.gz | $(am__untar) ;;\ + *.tar.bz2*) \ + bzip2 -dc $(distdir).tar.bz2 | $(am__untar) ;;\ + *.tar.lz*) \ + lzip -dc $(distdir).tar.lz | $(am__untar) ;;\ + *.tar.xz*) \ + xz -dc $(distdir).tar.xz | $(am__untar) ;;\ + *.tar.Z*) \ + uncompress -c $(distdir).tar.Z | $(am__untar) ;;\ + *.shar.gz*) \ + GZIP=$(GZIP_ENV) gzip -dc $(distdir).shar.gz | unshar ;;\ + *.zip*) \ + unzip $(distdir).zip ;;\ + esac + chmod -R a-w $(distdir) + chmod u+w $(distdir) + mkdir $(distdir)/_build $(distdir)/_build/sub $(distdir)/_inst + chmod a-w $(distdir) + test -d $(distdir)/_build || exit 0; \ + dc_install_base=`$(am__cd) $(distdir)/_inst && pwd | sed -e 's,^[^:\\/]:[\\/],/,'` \ + && dc_destdir="$${TMPDIR-/tmp}/am-dc-$$$$/" \ + && am__cwd=`pwd` \ + && $(am__cd) $(distdir)/_build/sub \ + && ../../configure \ + $(AM_DISTCHECK_CONFIGURE_FLAGS) \ + $(DISTCHECK_CONFIGURE_FLAGS) \ + --srcdir=../.. --prefix="$$dc_install_base" \ + && $(MAKE) $(AM_MAKEFLAGS) \ + && $(MAKE) $(AM_MAKEFLAGS) dvi \ + && $(MAKE) $(AM_MAKEFLAGS) check \ + && $(MAKE) $(AM_MAKEFLAGS) install \ + && $(MAKE) $(AM_MAKEFLAGS) installcheck \ + && $(MAKE) $(AM_MAKEFLAGS) uninstall \ + && $(MAKE) $(AM_MAKEFLAGS) distuninstallcheck_dir="$$dc_install_base" \ + distuninstallcheck \ + && chmod -R a-w "$$dc_install_base" \ + && ({ \ + (cd ../.. && umask 077 && mkdir "$$dc_destdir") \ + && $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" install \ + && $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" uninstall \ + && $(MAKE) $(AM_MAKEFLAGS) DESTDIR="$$dc_destdir" \ + distuninstallcheck_dir="$$dc_destdir" distuninstallcheck; \ + } || { rm -rf "$$dc_destdir"; exit 1; }) \ + && rm -rf "$$dc_destdir" \ + && $(MAKE) $(AM_MAKEFLAGS) dist \ + && rm -rf $(DIST_ARCHIVES) \ + && $(MAKE) $(AM_MAKEFLAGS) distcleancheck \ + && cd "$$am__cwd" \ + || exit 1 + $(am__post_remove_distdir) + @(echo "$(distdir) archives ready for distribution: "; \ + list='$(DIST_ARCHIVES)'; for i in $$list; do echo $$i; done) | \ + sed -e 1h -e 1s/./=/g -e 1p -e 1x -e '$$p' -e '$$x' +distuninstallcheck: + @test -n '$(distuninstallcheck_dir)' || { \ + echo 'ERROR: trying to run $@ with an empty' \ + '$$(distuninstallcheck_dir)' >&2; \ + exit 1; \ + }; \ + $(am__cd) '$(distuninstallcheck_dir)' || { \ + echo 'ERROR: cannot chdir into $(distuninstallcheck_dir)' >&2; \ + exit 1; \ + }; \ + test `$(am__distuninstallcheck_listfiles) | wc -l` -eq 0 \ + || { echo "ERROR: files left after uninstall:" ; \ + if test -n "$(DESTDIR)"; then \ + echo " (check DESTDIR support)"; \ + fi ; \ + $(distuninstallcheck_listfiles) ; \ + exit 1; } >&2 +distcleancheck: distclean + @if test '$(srcdir)' = . 
; then \ + echo "ERROR: distcleancheck can only run from a VPATH build" ; \ + exit 1 ; \ + fi + @test `$(distcleancheck_listfiles) | wc -l` -eq 0 \ + || { echo "ERROR: files left in build directory after distclean:" ; \ + $(distcleancheck_listfiles) ; \ + exit 1; } >&2 +check-am: all-am +check: check-recursive +all-am: Makefile $(DATA) config.h +installdirs: installdirs-recursive +installdirs-am: + for dir in "$(DESTDIR)$(pkgconfigdir)"; do \ + test -z "$$dir" || $(MKDIR_P) "$$dir"; \ + done +install: install-recursive +install-exec: install-exec-recursive +install-data: install-data-recursive +uninstall: uninstall-recursive + +install-am: all-am + @$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am + +installcheck: installcheck-recursive +install-strip: + if test -z '$(STRIP)'; then \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + install; \ + else \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \ + fi +mostlyclean-generic: + +clean-generic: + +distclean-generic: + -test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES) + -test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES) + +maintainer-clean-generic: + @echo "This command is intended for maintainers to use" + @echo "it deletes files that may require special tools to rebuild." +clean: clean-recursive + +clean-am: clean-generic clean-libtool mostlyclean-am + +distclean: distclean-recursive + -rm -f $(am__CONFIG_DISTCLEAN_FILES) + -rm -f Makefile +distclean-am: clean-am distclean-generic distclean-hdr \ + distclean-libtool distclean-tags + +dvi: dvi-recursive + +dvi-am: + +html: html-recursive + +html-am: + +info: info-recursive + +info-am: + +install-data-am: install-pkgconfigDATA + +install-dvi: install-dvi-recursive + +install-dvi-am: + +install-exec-am: + +install-html: install-html-recursive + +install-html-am: + +install-info: install-info-recursive + +install-info-am: + +install-man: + +install-pdf: install-pdf-recursive + +install-pdf-am: + +install-ps: install-ps-recursive + +install-ps-am: + +installcheck-am: + +maintainer-clean: maintainer-clean-recursive + -rm -f $(am__CONFIG_DISTCLEAN_FILES) + -rm -rf $(top_srcdir)/autom4te.cache + -rm -f Makefile +maintainer-clean-am: distclean-am maintainer-clean-generic + +mostlyclean: mostlyclean-recursive + +mostlyclean-am: mostlyclean-generic mostlyclean-libtool + +pdf: pdf-recursive + +pdf-am: + +ps: ps-recursive + +ps-am: + +uninstall-am: uninstall-pkgconfigDATA + +.MAKE: $(am__recursive_targets) all install-am install-strip + +.PHONY: $(am__recursive_targets) CTAGS GTAGS TAGS all all-am \ + am--refresh check check-am clean clean-cscope clean-generic \ + clean-libtool cscope cscopelist-am ctags ctags-am dist \ + dist-all dist-bzip2 dist-gzip dist-lzip dist-shar dist-tarZ \ + dist-xz dist-zip distcheck distclean distclean-generic \ + distclean-hdr distclean-libtool distclean-tags distcleancheck \ + distdir distuninstallcheck dvi dvi-am html html-am info \ + info-am install install-am install-data install-data-am \ + install-dvi install-dvi-am install-exec install-exec-am \ + install-html install-html-am install-info install-info-am \ + install-man install-pdf install-pdf-am install-pkgconfigDATA \ + install-ps install-ps-am install-strip installcheck \ + installcheck-am installdirs 
installdirs-am maintainer-clean \ + maintainer-clean-generic mostlyclean mostlyclean-generic \ + mostlyclean-libtool pdf pdf-am ps ps-am tags tags-am uninstall \ + uninstall-am uninstall-pkgconfigDATA + +.PRECIOUS: Makefile + + +# Tell versions [3.59,3.63) of GNU make to not export all variables. +# Otherwise a system limit (for SysV at least) may be exceeded. +.NOEXPORT: diff --git a/NEWS b/NEWS new file mode 100644 index 0000000..1b14f65 --- /dev/null +++ b/NEWS @@ -0,0 +1 @@ +This file has been superseded by ChangeLog. Please see that file instead. diff --git a/README b/README new file mode 100644 index 0000000..c4342a7 --- /dev/null +++ b/README @@ -0,0 +1,39 @@ +Liblognorm is a fast, sample-based normalization library. + +More information on liblognorm can be found at + http://www.liblognorm.com + +Liblognorm has evolved over several years and was initially meant to be used primarily with +the Mitre CEE effort. Consequently, the initial version of liblognorm (0.x) +uses the libee CEE support library in its API. + +As time evolved, the initial CEE schema underwent considerable change. Even +worse, Mitre lost funding for CEE. While the CEE ideas survived as part +of the Red Hat-driven "Project Lumberjack", the data structures became greatly +simplified and JSON-based. That effectively made libee obsolete (and also +in parts libestr, which was specifically written to support CEE's +initial requirement of embedded NUL chars in strings). + +In 2013, Pavel Levshin converted liblognorm to native JSON, which helped +improve performance and simplicity for many client applications. +Unfortunately, this change broke interface compatibility (and there was +no way to avoid that, obviously...). + +In 2015, most parts of liblognorm were redesigned and rewritten as part +of Rainer Gerhards' master thesis. For full technical details of how +liblognorm operates, and why it is so fast, please have a look at + +https://www.researchgate.net/publication/310545144_Efficient_Normalization_of_IT_Log_Messages_under_Realtime_Conditions + +The current library is the result of that effort. Application developers +are encouraged to switch to this version, as it provides the benefit of +a simpler API. This version is now being tracked by the git master branch. + +However, if you need to stick to the old API, there is a git branch +liblognorm0, which contains the previous version of the library. This +branch is also maintained for important bug fixes, so it is safe to use. + +We recommend that packagers create packages for both liblognorm0 and +liblognorm1. Note that liblognorm's development packages cannot +coexist on the same system, as the PKGCONFIG system would get into +trouble. Adiscon's own packages follow this scheme. diff --git a/aclocal.m4 b/aclocal.m4 new file mode 100644 index 0000000..915a344 --- /dev/null +++ b/aclocal.m4 @@ -0,0 +1,2048 @@ +# generated automatically by aclocal 1.15 -*- Autoconf -*- + +# Copyright (C) 1996-2014 Free Software Foundation, Inc. + +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY, to the extent permitted by law; without +# even the implied warranty of MERCHANTABILITY or FITNESS FOR A +# PARTICULAR PURPOSE.
+ +m4_ifndef([AC_CONFIG_MACRO_DIRS], [m4_defun([_AM_CONFIG_MACRO_DIRS], [])m4_defun([AC_CONFIG_MACRO_DIRS], [_AM_CONFIG_MACRO_DIRS($@)])]) +m4_ifndef([AC_AUTOCONF_VERSION], + [m4_copy([m4_PACKAGE_VERSION], [AC_AUTOCONF_VERSION])])dnl +m4_if(m4_defn([AC_AUTOCONF_VERSION]), [2.69],, +[m4_warning([this file was generated for autoconf 2.69. +You have another version of autoconf. It may work, but is not guaranteed to. +If you have problems, you may need to regenerate the build system entirely. +To do so, use the procedure documented by the package, typically 'autoreconf'.])]) + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_append_compile_flags.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_APPEND_COMPILE_FLAGS([FLAG1 FLAG2 ...], [FLAGS-VARIABLE], [EXTRA-FLAGS]) +# +# DESCRIPTION +# +# For every FLAG1, FLAG2 it is checked whether the compiler works with the +# flag. If it does, the flag is added FLAGS-VARIABLE +# +# If FLAGS-VARIABLE is not specified, the current language's flags (e.g. +# CFLAGS) is used. During the check the flag is always added to the +# current language's flags. +# +# If EXTRA-FLAGS is defined, it is added to the current language's default +# flags (e.g. CFLAGS) when the check is done. The check is thus made with +# the flags: "CFLAGS EXTRA-FLAGS FLAG". This can for example be used to +# force the compiler to issue an error when a bad flag is given. +# +# NOTE: This macro depends on the AX_APPEND_FLAG and +# AX_CHECK_COMPILE_FLAG. Please keep this macro in sync with +# AX_APPEND_LINK_FLAGS. +# +# LICENSE +# +# Copyright (c) 2011 Maarten Bosmans +# +# This program is free software: you can redistribute it and/or modify it +# under the terms of the GNU General Public License as published by the +# Free Software Foundation, either version 3 of the License, or (at your +# option) any later version. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General +# Public License for more details. +# +# You should have received a copy of the GNU General Public License along +# with this program. If not, see . +# +# As a special exception, the respective Autoconf Macro's copyright owner +# gives unlimited permission to copy, distribute and modify the configure +# scripts that are the output of Autoconf when processing the Macro. You +# need not follow the terms of the GNU General Public License when using +# or distributing such scripts, even though portions of the text of the +# Macro appear in them. The GNU General Public License (GPL) does govern +# all other use of the material that constitutes the Autoconf Macro. +# +# This special exception to the GPL applies to versions of the Autoconf +# Macro released by the Autoconf Archive. When you make and distribute a +# modified version of the Autoconf Macro, you may extend this special +# exception to the GPL to apply to your modified version as well. 
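+# +# As a minimal illustration (the flag list and variable name below are +# only an example), a `configure.ac' might call: +# +#   AX_APPEND_COMPILE_FLAGS([-Wall -Wextra -pipe], [WARN_CFLAGS]) +# +# Each listed flag is checked against the current compiler and is +# appended to WARN_CFLAGS only if the compiler accepts it.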
+ +#serial 4 + +AC_DEFUN([AX_APPEND_COMPILE_FLAGS], +[AX_REQUIRE_DEFINED([AX_CHECK_COMPILE_FLAG]) +AX_REQUIRE_DEFINED([AX_APPEND_FLAG]) +for flag in $1; do + AX_CHECK_COMPILE_FLAG([$flag], [AX_APPEND_FLAG([$flag], [$2])], [], [$3]) +done +])dnl AX_APPEND_COMPILE_FLAGS + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_append_flag.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_APPEND_FLAG(FLAG, [FLAGS-VARIABLE]) +# +# DESCRIPTION +# +# FLAG is appended to the FLAGS-VARIABLE shell variable, with a space +# added in between. +# +# If FLAGS-VARIABLE is not specified, the current language's flags (e.g. +# CFLAGS) is used. FLAGS-VARIABLE is not changed if it already contains +# FLAG. If FLAGS-VARIABLE is unset in the shell, it is set to exactly +# FLAG. +# +# NOTE: Implementation based on AX_CFLAGS_GCC_OPTION. +# +# LICENSE +# +# Copyright (c) 2008 Guido U. Draheim +# Copyright (c) 2011 Maarten Bosmans +# +# This program is free software: you can redistribute it and/or modify it +# under the terms of the GNU General Public License as published by the +# Free Software Foundation, either version 3 of the License, or (at your +# option) any later version. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General +# Public License for more details. +# +# You should have received a copy of the GNU General Public License along +# with this program. If not, see . +# +# As a special exception, the respective Autoconf Macro's copyright owner +# gives unlimited permission to copy, distribute and modify the configure +# scripts that are the output of Autoconf when processing the Macro. You +# need not follow the terms of the GNU General Public License when using +# or distributing such scripts, even though portions of the text of the +# Macro appear in them. The GNU General Public License (GPL) does govern +# all other use of the material that constitutes the Autoconf Macro. +# +# This special exception to the GPL applies to versions of the Autoconf +# Macro released by the Autoconf Archive. When you make and distribute a +# modified version of the Autoconf Macro, you may extend this special +# exception to the GPL to apply to your modified version as well. + +#serial 6 + +AC_DEFUN([AX_APPEND_FLAG], +[dnl +AC_PREREQ(2.64)dnl for _AC_LANG_PREFIX and AS_VAR_SET_IF +AS_VAR_PUSHDEF([FLAGS], [m4_default($2,_AC_LANG_PREFIX[FLAGS])]) +AS_VAR_SET_IF(FLAGS,[ + AS_CASE([" AS_VAR_GET(FLAGS) "], + [*" $1 "*], [AC_RUN_LOG([: FLAGS already contains $1])], + [ + AS_VAR_APPEND(FLAGS,[" $1"]) + AC_RUN_LOG([: FLAGS="$FLAGS"]) + ]) + ], + [ + AS_VAR_SET(FLAGS,[$1]) + AC_RUN_LOG([: FLAGS="$FLAGS"]) + ]) +AS_VAR_POPDEF([FLAGS])dnl +])dnl AX_APPEND_FLAG + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_append_link_flags.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_APPEND_LINK_FLAGS([FLAG1 FLAG2 ...], [FLAGS-VARIABLE], [EXTRA-FLAGS]) +# +# DESCRIPTION +# +# For every FLAG1, FLAG2 it is checked whether the linker works with the +# flag. If it does, the flag is added FLAGS-VARIABLE +# +# If FLAGS-VARIABLE is not specified, the linker's flags (LDFLAGS) is +# used. 
During the check the flag is always added to the linker's flags. +# +# If EXTRA-FLAGS is defined, it is added to the linker's default flags +# when the check is done. The check is thus made with the flags: "LDFLAGS +# EXTRA-FLAGS FLAG". This can for example be used to force the linker to +# issue an error when a bad flag is given. +# +# NOTE: This macro depends on the AX_APPEND_FLAG and AX_CHECK_LINK_FLAG. +# Please keep this macro in sync with AX_APPEND_COMPILE_FLAGS. +# +# LICENSE +# +# Copyright (c) 2011 Maarten Bosmans +# +# This program is free software: you can redistribute it and/or modify it +# under the terms of the GNU General Public License as published by the +# Free Software Foundation, either version 3 of the License, or (at your +# option) any later version. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General +# Public License for more details. +# +# You should have received a copy of the GNU General Public License along +# with this program. If not, see . +# +# As a special exception, the respective Autoconf Macro's copyright owner +# gives unlimited permission to copy, distribute and modify the configure +# scripts that are the output of Autoconf when processing the Macro. You +# need not follow the terms of the GNU General Public License when using +# or distributing such scripts, even though portions of the text of the +# Macro appear in them. The GNU General Public License (GPL) does govern +# all other use of the material that constitutes the Autoconf Macro. +# +# This special exception to the GPL applies to versions of the Autoconf +# Macro released by the Autoconf Archive. When you make and distribute a +# modified version of the Autoconf Macro, you may extend this special +# exception to the GPL to apply to your modified version as well. + +#serial 4 + +AC_DEFUN([AX_APPEND_LINK_FLAGS], +[AX_REQUIRE_DEFINED([AX_CHECK_LINK_FLAG]) +AX_REQUIRE_DEFINED([AX_APPEND_FLAG]) +for flag in $1; do + AX_CHECK_LINK_FLAG([$flag], [AX_APPEND_FLAG([$flag], [m4_default([$2], [LDFLAGS])])], [], [$3]) +done +])dnl AX_APPEND_LINK_FLAGS + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_check_compile_flag.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_CHECK_COMPILE_FLAG(FLAG, [ACTION-SUCCESS], [ACTION-FAILURE], [EXTRA-FLAGS], [INPUT]) +# +# DESCRIPTION +# +# Check whether the given FLAG works with the current language's compiler +# or gives an error. (Warnings, however, are ignored) +# +# ACTION-SUCCESS/ACTION-FAILURE are shell commands to execute on +# success/failure. +# +# If EXTRA-FLAGS is defined, it is added to the current language's default +# flags (e.g. CFLAGS) when the check is done. The check is thus made with +# the flags: "CFLAGS EXTRA-FLAGS FLAG". This can for example be used to +# force the compiler to issue an error when a bad flag is given. +# +# INPUT gives an alternative input source to AC_COMPILE_IFELSE. +# +# NOTE: Implementation based on AX_CFLAGS_GCC_OPTION. Please keep this +# macro in sync with AX_CHECK_{PREPROC,LINK}_FLAG. +# +# LICENSE +# +# Copyright (c) 2008 Guido U. 
Draheim +# Copyright (c) 2011 Maarten Bosmans +# +# This program is free software: you can redistribute it and/or modify it +# under the terms of the GNU General Public License as published by the +# Free Software Foundation, either version 3 of the License, or (at your +# option) any later version. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General +# Public License for more details. +# +# You should have received a copy of the GNU General Public License along +# with this program. If not, see . +# +# As a special exception, the respective Autoconf Macro's copyright owner +# gives unlimited permission to copy, distribute and modify the configure +# scripts that are the output of Autoconf when processing the Macro. You +# need not follow the terms of the GNU General Public License when using +# or distributing such scripts, even though portions of the text of the +# Macro appear in them. The GNU General Public License (GPL) does govern +# all other use of the material that constitutes the Autoconf Macro. +# +# This special exception to the GPL applies to versions of the Autoconf +# Macro released by the Autoconf Archive. When you make and distribute a +# modified version of the Autoconf Macro, you may extend this special +# exception to the GPL to apply to your modified version as well. + +#serial 4 + +AC_DEFUN([AX_CHECK_COMPILE_FLAG], +[AC_PREREQ(2.64)dnl for _AC_LANG_PREFIX and AS_VAR_IF +AS_VAR_PUSHDEF([CACHEVAR],[ax_cv_check_[]_AC_LANG_ABBREV[]flags_$4_$1])dnl +AC_CACHE_CHECK([whether _AC_LANG compiler accepts $1], CACHEVAR, [ + ax_check_save_flags=$[]_AC_LANG_PREFIX[]FLAGS + _AC_LANG_PREFIX[]FLAGS="$[]_AC_LANG_PREFIX[]FLAGS $4 $1" + AC_COMPILE_IFELSE([m4_default([$5],[AC_LANG_PROGRAM()])], + [AS_VAR_SET(CACHEVAR,[yes])], + [AS_VAR_SET(CACHEVAR,[no])]) + _AC_LANG_PREFIX[]FLAGS=$ax_check_save_flags]) +AS_VAR_IF(CACHEVAR,yes, + [m4_default([$2], :)], + [m4_default([$3], :)]) +AS_VAR_POPDEF([CACHEVAR])dnl +])dnl AX_CHECK_COMPILE_FLAGS + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_check_link_flag.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_CHECK_LINK_FLAG(FLAG, [ACTION-SUCCESS], [ACTION-FAILURE], [EXTRA-FLAGS], [INPUT]) +# +# DESCRIPTION +# +# Check whether the given FLAG works with the linker or gives an error. +# (Warnings, however, are ignored) +# +# ACTION-SUCCESS/ACTION-FAILURE are shell commands to execute on +# success/failure. +# +# If EXTRA-FLAGS is defined, it is added to the linker's default flags +# when the check is done. The check is thus made with the flags: "LDFLAGS +# EXTRA-FLAGS FLAG". This can for example be used to force the linker to +# issue an error when a bad flag is given. +# +# INPUT gives an alternative input source to AC_LINK_IFELSE. +# +# NOTE: Implementation based on AX_CFLAGS_GCC_OPTION. Please keep this +# macro in sync with AX_CHECK_{PREPROC,COMPILE}_FLAG. +# +# LICENSE +# +# Copyright (c) 2008 Guido U. Draheim +# Copyright (c) 2011 Maarten Bosmans +# +# This program is free software: you can redistribute it and/or modify it +# under the terms of the GNU General Public License as published by the +# Free Software Foundation, either version 3 of the License, or (at your +# option) any later version. 
+# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General +# Public License for more details. +# +# You should have received a copy of the GNU General Public License along +# with this program. If not, see . +# +# As a special exception, the respective Autoconf Macro's copyright owner +# gives unlimited permission to copy, distribute and modify the configure +# scripts that are the output of Autoconf when processing the Macro. You +# need not follow the terms of the GNU General Public License when using +# or distributing such scripts, even though portions of the text of the +# Macro appear in them. The GNU General Public License (GPL) does govern +# all other use of the material that constitutes the Autoconf Macro. +# +# This special exception to the GPL applies to versions of the Autoconf +# Macro released by the Autoconf Archive. When you make and distribute a +# modified version of the Autoconf Macro, you may extend this special +# exception to the GPL to apply to your modified version as well. + +#serial 4 + +AC_DEFUN([AX_CHECK_LINK_FLAG], +[AC_PREREQ(2.64)dnl for _AC_LANG_PREFIX and AS_VAR_IF +AS_VAR_PUSHDEF([CACHEVAR],[ax_cv_check_ldflags_$4_$1])dnl +AC_CACHE_CHECK([whether the linker accepts $1], CACHEVAR, [ + ax_check_save_flags=$LDFLAGS + LDFLAGS="$LDFLAGS $4 $1" + AC_LINK_IFELSE([m4_default([$5],[AC_LANG_PROGRAM()])], + [AS_VAR_SET(CACHEVAR,[yes])], + [AS_VAR_SET(CACHEVAR,[no])]) + LDFLAGS=$ax_check_save_flags]) +AS_VAR_IF(CACHEVAR,yes, + [m4_default([$2], :)], + [m4_default([$3], :)]) +AS_VAR_POPDEF([CACHEVAR])dnl +])dnl AX_CHECK_LINK_FLAGS + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_compiler_flags.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_COMPILER_FLAGS([CFLAGS-VARIABLE], [LDFLAGS-VARIABLE], [IS-RELEASE], [EXTRA-BASE-CFLAGS], [EXTRA-YES-CFLAGS], [UNUSED], [UNUSED], [UNUSED], [EXTRA-BASE-LDFLAGS], [EXTRA-YES-LDFLAGS], [UNUSED], [UNUSED], [UNUSED]) +# +# DESCRIPTION +# +# Check for the presence of an --enable-compile-warnings option to +# configure, defaulting to "error" in normal operation, or "yes" if +# IS-RELEASE is equal to "yes". Return the value in the variable +# $ax_enable_compile_warnings. +# +# Depending on the value of --enable-compile-warnings, different compiler +# warnings are checked to see if they work with the current compiler and, +# if so, are appended to CFLAGS-VARIABLE and LDFLAGS-VARIABLE. This +# allows a consistent set of baseline compiler warnings to be used across +# a code base, irrespective of any warnings enabled locally by individual +# developers. By standardising the warnings used by all developers of a +# project, the project can commit to a zero-warnings policy, using -Werror +# to prevent compilation if new warnings are introduced. This makes +# catching bugs which are flagged by warnings a lot easier. +# +# By providing a consistent --enable-compile-warnings argument across all +# projects using this macro, continuous integration systems can easily be +# configured the same for all projects. Automated systems or build +# systems aimed at beginners may want to pass the --disable-Werror +# argument to unconditionally prevent warnings being fatal. 
+# +# --enable-compile-warnings can take the values: +# +# * no: Base compiler warnings only; not even -Wall. +# * yes: The above, plus a broad range of useful warnings. +# * error: The above, plus -Werror so that all warnings are fatal. +# Use --disable-Werror to override this and disable fatal +# warnings. +# +# The set of base and enabled flags can be augmented using the +# EXTRA-*-CFLAGS and EXTRA-*-LDFLAGS variables, which are tested and +# appended to the output variable if --enable-compile-warnings is not +# "no". Flags should not be disabled using these arguments, as the entire +# point of AX_COMPILER_FLAGS is to enforce a consistent set of useful +# compiler warnings on code, using warnings which have been chosen for low +# false positive rates. If a compiler emits false positives for a +# warning, a #pragma should be used in the code to disable the warning +# locally. See: +# +# https://gcc.gnu.org/onlinedocs/gcc-4.9.2/gcc/Diagnostic-Pragmas.html#Diagnostic-Pragmas +# +# The EXTRA-* variables should only be used to supply extra warning flags, +# and not general purpose compiler flags, as they are controlled by +# configure options such as --disable-Werror. +# +# IS-RELEASE can be used to disable -Werror when making a release, which +# is useful for those hairy moments when you just want to get the release +# done as quickly as possible. Set it to "yes" to disable -Werror. By +# default, it uses the value of $ax_is_release, so if you are using the +# AX_IS_RELEASE macro, there is no need to pass this parameter. For +# example: +# +# AX_IS_RELEASE([git-directory]) +# AX_COMPILER_FLAGS() +# +# CFLAGS-VARIABLE defaults to WARN_CFLAGS, and LDFLAGS-VARIABLE defaults +# to WARN_LDFLAGS. Both variables are AC_SUBST-ed by this macro, but must +# be manually added to the CFLAGS and LDFLAGS variables for each target in +# the code base. +# +# If C++ language support is enabled with AC_PROG_CXX, which must occur +# before this macro in configure.ac, warning flags for the C++ compiler +# are AC_SUBST-ed as WARN_CXXFLAGS, and must be manually added to the +# CXXFLAGS variables for each target in the code base. EXTRA-*-CFLAGS can +# be used to augment the base and enabled flags. +# +# Warning flags for g-ir-scanner (from GObject Introspection) are +# AC_SUBST-ed as WARN_SCANNERFLAGS. This variable must be manually added +# to the SCANNERFLAGS variable for each GIR target in the code base. If +# extra g-ir-scanner flags need to be enabled, the AX_COMPILER_FLAGS_GIR +# macro must be invoked manually. +# +# AX_COMPILER_FLAGS may add support for other tools in future, in addition +# to the compiler and linker. No extra EXTRA-* variables will be added +# for those tools, and all extra support will still use the single +# --enable-compile-warnings configure option. For finer grained control +# over the flags for individual tools, use AX_COMPILER_FLAGS_CFLAGS, +# AX_COMPILER_FLAGS_LDFLAGS and AX_COMPILER_FLAGS_* for new tools. +# +# The UNUSED variables date from a previous version of this macro, and are +# automatically appended to the preceding non-UNUSED variable. They should +# be left empty in new uses of the macro. +# +# LICENSE +# +# Copyright (c) 2014, 2015 Philip Withnall +# Copyright (c) 2015 David King +# +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. 
+ +#serial 13 + +# _AX_COMPILER_FLAGS_LANG([LANGNAME]) +m4_defun([_AX_COMPILER_FLAGS_LANG], +[m4_ifdef([_AX_COMPILER_FLAGS_LANG_]$1[_enabled], [], + [m4_define([_AX_COMPILER_FLAGS_LANG_]$1[_enabled], [])dnl + AX_REQUIRE_DEFINED([AX_COMPILER_FLAGS_]$1[FLAGS])])dnl +]) + +AC_DEFUN([AX_COMPILER_FLAGS],[ + # C support is enabled by default. + _AX_COMPILER_FLAGS_LANG([C]) + # Only enable C++ support if AC_PROG_CXX is called. The redefinition of + # AC_PROG_CXX is so that a fatal error is emitted if this macro is called + # before AC_PROG_CXX, which would otherwise cause no C++ warnings to be + # checked. + AC_PROVIDE_IFELSE([AC_PROG_CXX], + [_AX_COMPILER_FLAGS_LANG([CXX])], + [m4_define([AC_PROG_CXX], defn([AC_PROG_CXX])[_AX_COMPILER_FLAGS_LANG([CXX])])]) + AX_REQUIRE_DEFINED([AX_COMPILER_FLAGS_LDFLAGS]) + + # Default value for IS-RELEASE is $ax_is_release + ax_compiler_flags_is_release=m4_tolower(m4_normalize(ifelse([$3],, + [$ax_is_release], + [$3]))) + + AC_ARG_ENABLE([compile-warnings], + AS_HELP_STRING([--enable-compile-warnings=@<:@no/yes/error@:>@], + [Enable compiler warnings and errors]),, + [AS_IF([test "$ax_compiler_flags_is_release" = "yes"], + [enable_compile_warnings="yes"], + [enable_compile_warnings="error"])]) + AC_ARG_ENABLE([Werror], + AS_HELP_STRING([--disable-Werror], + [Unconditionally make all compiler warnings non-fatal]),, + [enable_Werror=maybe]) + + # Return the user's chosen warning level + AS_IF([test "$enable_Werror" = "no" -a \ + "$enable_compile_warnings" = "error"],[ + enable_compile_warnings="yes" + ]) + + ax_enable_compile_warnings=$enable_compile_warnings + + AX_COMPILER_FLAGS_CFLAGS([$1],[$ax_compiler_flags_is_release], + [$4],[$5 $6 $7 $8]) + m4_ifdef([_AX_COMPILER_FLAGS_LANG_CXX_enabled], + [AX_COMPILER_FLAGS_CXXFLAGS([WARN_CXXFLAGS], + [$ax_compiler_flags_is_release], + [$4],[$5 $6 $7 $8])]) + AX_COMPILER_FLAGS_LDFLAGS([$2],[$ax_compiler_flags_is_release], + [$9],[$10 $11 $12 $13]) + AX_COMPILER_FLAGS_GIR([WARN_SCANNERFLAGS],[$ax_compiler_flags_is_release]) +])dnl AX_COMPILER_FLAGS + +# ============================================================================ +# http://www.gnu.org/software/autoconf-archive/ax_compiler_flags_cflags.html +# ============================================================================ +# +# SYNOPSIS +# +# AX_COMPILER_FLAGS_CFLAGS([VARIABLE], [IS-RELEASE], [EXTRA-BASE-FLAGS], [EXTRA-YES-FLAGS]) +# +# DESCRIPTION +# +# Add warning flags for the C compiler to VARIABLE, which defaults to +# WARN_CFLAGS. VARIABLE is AC_SUBST-ed by this macro, but must be +# manually added to the CFLAGS variable for each target in the code base. +# +# This macro depends on the environment set up by AX_COMPILER_FLAGS. +# Specifically, it uses the value of $ax_enable_compile_warnings to decide +# which flags to enable. +# +# LICENSE +# +# Copyright (c) 2014, 2015 Philip Withnall +# +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. 
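+#
+# A minimal sketch of a stand-alone call following the SYNOPSIS above; the
+# extra "yes" flag shown is an assumption, not something this macro requires:
+#
+#   AX_COMPILER_FLAGS_CFLAGS([WARN_CFLAGS],[$ax_is_release],[],[-Wconversion])
+#
+# Note that $ax_enable_compile_warnings must already be set (normally by
+# AX_COMPILER_FLAGS) before the "yes"/"error" warning sets are applied.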
+ +#serial 11 + +AC_DEFUN([AX_COMPILER_FLAGS_CFLAGS],[ + AC_REQUIRE([AC_PROG_SED]) + AX_REQUIRE_DEFINED([AX_APPEND_COMPILE_FLAGS]) + AX_REQUIRE_DEFINED([AX_APPEND_FLAG]) + AX_REQUIRE_DEFINED([AX_CHECK_COMPILE_FLAG]) + + # Variable names + m4_define(ax_warn_cflags_variable, + [m4_normalize(ifelse([$1],,[WARN_CFLAGS],[$1]))]) + + AC_LANG_PUSH([C]) + + # Always pass -Werror=unknown-warning-option to get Clang to fail on bad + # flags, otherwise they are always appended to the warn_cflags variable, and + # Clang warns on them for every compilation unit. + # If this is passed to GCC, it will explode, so the flag must be enabled + # conditionally. + AX_CHECK_COMPILE_FLAG([-Werror=unknown-warning-option],[ + ax_compiler_flags_test="-Werror=unknown-warning-option" + ],[ + ax_compiler_flags_test="" + ]) + + # Base flags + AX_APPEND_COMPILE_FLAGS([ dnl + -fno-strict-aliasing dnl + $3 dnl + ],ax_warn_cflags_variable,[$ax_compiler_flags_test]) + + AS_IF([test "$ax_enable_compile_warnings" != "no"],[ + # "yes" flags + AX_APPEND_COMPILE_FLAGS([ dnl + -Wall dnl + -Wextra dnl + -Wundef dnl + -Wnested-externs dnl + -Wwrite-strings dnl + -Wpointer-arith dnl + -Wmissing-declarations dnl + -Wmissing-prototypes dnl + -Wstrict-prototypes dnl + -Wredundant-decls dnl + -Wno-unused-parameter dnl + -Wno-missing-field-initializers dnl + -Wdeclaration-after-statement dnl + -Wformat=2 dnl + -Wold-style-definition dnl + -Wcast-align dnl + -Wformat-nonliteral dnl + -Wformat-security dnl + -Wsign-compare dnl + -Wstrict-aliasing dnl + -Wshadow dnl + -Winline dnl + -Wpacked dnl + -Wmissing-format-attribute dnl + -Wmissing-noreturn dnl + -Winit-self dnl + -Wredundant-decls dnl + -Wmissing-include-dirs dnl + -Wunused-but-set-variable dnl + -Warray-bounds dnl + -Wimplicit-function-declaration dnl + -Wreturn-type dnl + -Wswitch-enum dnl + -Wswitch-default dnl + $4 dnl + $5 dnl + $6 dnl + $7 dnl + ],ax_warn_cflags_variable,[$ax_compiler_flags_test]) + ]) + AS_IF([test "$ax_enable_compile_warnings" = "error"],[ + # "error" flags; -Werror has to be appended unconditionally because + # it's not possible to test for + # + # suggest-attribute=format is disabled because it gives too many false + # positives + AX_APPEND_FLAG([-Werror],ax_warn_cflags_variable) + + AX_APPEND_COMPILE_FLAGS([ dnl + -Wno-suggest-attribute=format dnl + ],ax_warn_cflags_variable,[$ax_compiler_flags_test]) + ]) + + # In the flags below, when disabling specific flags, always add *both* + # -Wno-foo and -Wno-error=foo. This fixes the situation where (for example) + # we enable -Werror, disable a flag, and a build bot passes CFLAGS=-Wall, + # which effectively turns that flag back on again as an error. + for flag in $ax_warn_cflags_variable; do + AS_CASE([$flag], + [-Wno-*=*],[], + [-Wno-*],[ + AX_APPEND_COMPILE_FLAGS([-Wno-error=$(AS_ECHO([$flag]) | $SED 's/^-Wno-//')], + ax_warn_cflags_variable, + [$ax_compiler_flags_test]) + ]) + done + + AC_LANG_POP([C]) + + # Substitute the variables + AC_SUBST(ax_warn_cflags_variable) +])dnl AX_COMPILER_FLAGS + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_compiler_flags_gir.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_COMPILER_FLAGS_GIR([VARIABLE], [IS-RELEASE], [EXTRA-BASE-FLAGS], [EXTRA-YES-FLAGS]) +# +# DESCRIPTION +# +# Add warning flags for the g-ir-scanner (from GObject Introspection) to +# VARIABLE, which defaults to WARN_SCANNERFLAGS. 
VARIABLE is AC_SUBST-ed +# by this macro, but must be manually added to the SCANNERFLAGS variable +# for each GIR target in the code base. +# +# This macro depends on the environment set up by AX_COMPILER_FLAGS. +# Specifically, it uses the value of $ax_enable_compile_warnings to decide +# which flags to enable. +# +# LICENSE +# +# Copyright (c) 2015 Philip Withnall +# +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. + +#serial 4 + +AC_DEFUN([AX_COMPILER_FLAGS_GIR],[ + AX_REQUIRE_DEFINED([AX_APPEND_FLAG]) + + # Variable names + m4_define(ax_warn_scannerflags_variable, + [m4_normalize(ifelse([$1],,[WARN_SCANNERFLAGS],[$1]))]) + + # Base flags + AX_APPEND_FLAG([$3],ax_warn_scannerflags_variable) + + AS_IF([test "$ax_enable_compile_warnings" != "no"],[ + # "yes" flags + AX_APPEND_FLAG([ dnl + --warn-all dnl + $4 dnl + $5 dnl + $6 dnl + $7 dnl + ],ax_warn_scannerflags_variable) + ]) + AS_IF([test "$ax_enable_compile_warnings" = "error"],[ + # "error" flags + AX_APPEND_FLAG([ dnl + --warn-error dnl + ],ax_warn_scannerflags_variable) + ]) + + # Substitute the variables + AC_SUBST(ax_warn_scannerflags_variable) +])dnl AX_COMPILER_FLAGS + +# ============================================================================= +# http://www.gnu.org/software/autoconf-archive/ax_compiler_flags_ldflags.html +# ============================================================================= +# +# SYNOPSIS +# +# AX_COMPILER_FLAGS_LDFLAGS([VARIABLE], [IS-RELEASE], [EXTRA-BASE-FLAGS], [EXTRA-YES-FLAGS]) +# +# DESCRIPTION +# +# Add warning flags for the linker to VARIABLE, which defaults to +# WARN_LDFLAGS. VARIABLE is AC_SUBST-ed by this macro, but must be +# manually added to the LDFLAGS variable for each target in the code base. +# +# This macro depends on the environment set up by AX_COMPILER_FLAGS. +# Specifically, it uses the value of $ax_enable_compile_warnings to decide +# which flags to enable. +# +# LICENSE +# +# Copyright (c) 2014, 2015 Philip Withnall +# +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. + +#serial 5 + +AC_DEFUN([AX_COMPILER_FLAGS_LDFLAGS],[ + AX_REQUIRE_DEFINED([AX_APPEND_LINK_FLAGS]) + AX_REQUIRE_DEFINED([AX_APPEND_FLAG]) + AX_REQUIRE_DEFINED([AX_CHECK_COMPILE_FLAG]) + + # Variable names + m4_define(ax_warn_ldflags_variable, + [m4_normalize(ifelse([$1],,[WARN_LDFLAGS],[$1]))]) + + # Always pass -Werror=unknown-warning-option to get Clang to fail on bad + # flags, otherwise they are always appended to the warn_ldflags variable, + # and Clang warns on them for every compilation unit. + # If this is passed to GCC, it will explode, so the flag must be enabled + # conditionally. 
+ AX_CHECK_COMPILE_FLAG([-Werror=unknown-warning-option],[ + ax_compiler_flags_test="-Werror=unknown-warning-option" + ],[ + ax_compiler_flags_test="" + ]) + + # Base flags + AX_APPEND_LINK_FLAGS([ dnl + -Wl,--no-as-needed dnl + $3 dnl + ],ax_warn_ldflags_variable,[$ax_compiler_flags_test]) + + AS_IF([test "$ax_enable_compile_warnings" != "no"],[ + # "yes" flags + AX_APPEND_LINK_FLAGS([$4 $5 $6 $7], + ax_warn_ldflags_variable, + [$ax_compiler_flags_test]) + ]) + AS_IF([test "$ax_enable_compile_warnings" = "error"],[ + # "error" flags; -Werror has to be appended unconditionally because + # it's not possible to test for + # + # suggest-attribute=format is disabled because it gives too many false + # positives + AX_APPEND_LINK_FLAGS([ dnl + -Wl,--fatal-warnings dnl + ],ax_warn_ldflags_variable,[$ax_compiler_flags_test]) + ]) + + # Substitute the variables + AC_SUBST(ax_warn_ldflags_variable) +])dnl AX_COMPILER_FLAGS + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_is_release.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_IS_RELEASE(POLICY) +# +# DESCRIPTION +# +# Determine whether the code is being configured as a release, or from +# git. Set the ax_is_release variable to 'yes' or 'no'. +# +# If building a release version, it is recommended that the configure +# script disable compiler errors and debug features, by conditionalising +# them on the ax_is_release variable. If building from git, these +# features should be enabled. +# +# The POLICY parameter specifies how ax_is_release is determined. It can +# take the following values: +# +# * git-directory: ax_is_release will be 'no' if a '.git' directory exists +# * minor-version: ax_is_release will be 'no' if the minor version number +# in $PACKAGE_VERSION is odd; this assumes +# $PACKAGE_VERSION follows the 'major.minor.micro' scheme +# * micro-version: ax_is_release will be 'no' if the micro version number +# in $PACKAGE_VERSION is odd; this assumes +# $PACKAGE_VERSION follows the 'major.minor.micro' scheme +# * always: ax_is_release will always be 'yes' +# * never: ax_is_release will always be 'no' +# +# Other policies may be added in future. +# +# LICENSE +# +# Copyright (c) 2015 Philip Withnall +# +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. + +#serial 3 + +AC_DEFUN([AX_IS_RELEASE],[ + AC_BEFORE([AC_INIT],[$0]) + + m4_case([$1], + [git-directory],[ + # $is_release = (.git directory does not exist) + AS_IF([test -d .git],[ax_is_release=no],[ax_is_release=yes]) + ], + [minor-version],[ + # $is_release = ($minor_version is even) + minor_version=`echo "$PACKAGE_VERSION" | sed 's/[[^.]][[^.]]*.\([[^.]][[^.]]*\).*/\1/'` + AS_IF([test "$(( $minor_version % 2 ))" -ne 0], + [ax_is_release=no],[ax_is_release=yes]) + ], + [micro-version],[ + # $is_release = ($micro_version is even) + micro_version=`echo "$PACKAGE_VERSION" | sed 's/[[^.]]*\.[[^.]]*\.\([[^.]]*\).*/\1/'` + AS_IF([test "$(( $micro_version % 2 ))" -ne 0], + [ax_is_release=no],[ax_is_release=yes]) + ], + [always],[ax_is_release=yes], + [never],[ax_is_release=no], + [ + AC_MSG_ERROR([Invalid policy. 
Valid policies: git-directory, minor-version.]) + ]) +]) + +# =========================================================================== +# http://www.gnu.org/software/autoconf-archive/ax_require_defined.html +# =========================================================================== +# +# SYNOPSIS +# +# AX_REQUIRE_DEFINED(MACRO) +# +# DESCRIPTION +# +# AX_REQUIRE_DEFINED is a simple helper for making sure other macros have +# been defined and thus are available for use. This avoids random issues +# where a macro isn't expanded. Instead the configure script emits a +# non-fatal: +# +# ./configure: line 1673: AX_CFLAGS_WARN_ALL: command not found +# +# It's like AC_REQUIRE except it doesn't expand the required macro. +# +# Here's an example: +# +# AX_REQUIRE_DEFINED([AX_CHECK_LINK_FLAG]) +# +# LICENSE +# +# Copyright (c) 2014 Mike Frysinger +# +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. + +#serial 1 + +AC_DEFUN([AX_REQUIRE_DEFINED], [dnl + m4_ifndef([$1], [m4_fatal([macro ]$1[ is not defined; is a m4 file missing?])]) +])dnl AX_REQUIRE_DEFINED + +# Copyright (C) 2002-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_AUTOMAKE_VERSION(VERSION) +# ---------------------------- +# Automake X.Y traces this macro to ensure aclocal.m4 has been +# generated from the m4 files accompanying Automake X.Y. +# (This private macro should not be called outside this file.) +AC_DEFUN([AM_AUTOMAKE_VERSION], +[am__api_version='1.15' +dnl Some users find AM_AUTOMAKE_VERSION and mistake it for a way to +dnl require some minimum version. Point them to the right macro. +m4_if([$1], [1.15], [], + [AC_FATAL([Do not call $0, use AM_INIT_AUTOMAKE([$1]).])])dnl +]) + +# _AM_AUTOCONF_VERSION(VERSION) +# ----------------------------- +# aclocal traces this macro to find the Autoconf version. +# This is a private macro too. Using m4_define simplifies +# the logic in aclocal, which can simply ignore this definition. +m4_define([_AM_AUTOCONF_VERSION], []) + +# AM_SET_CURRENT_AUTOMAKE_VERSION +# ------------------------------- +# Call AM_AUTOMAKE_VERSION and AM_AUTOMAKE_VERSION so they can be traced. +# This function is AC_REQUIREd by AM_INIT_AUTOMAKE. +AC_DEFUN([AM_SET_CURRENT_AUTOMAKE_VERSION], +[AM_AUTOMAKE_VERSION([1.15])dnl +m4_ifndef([AC_AUTOCONF_VERSION], + [m4_copy([m4_PACKAGE_VERSION], [AC_AUTOCONF_VERSION])])dnl +_AM_AUTOCONF_VERSION(m4_defn([AC_AUTOCONF_VERSION]))]) + +# AM_AUX_DIR_EXPAND -*- Autoconf -*- + +# Copyright (C) 2001-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# For projects using AC_CONFIG_AUX_DIR([foo]), Autoconf sets +# $ac_aux_dir to '$srcdir/foo'. In other projects, it is set to +# '$srcdir', '$srcdir/..', or '$srcdir/../..'. +# +# Of course, Automake must honor this variable whenever it calls a +# tool from the auxiliary directory. The problem is that $srcdir (and +# therefore $ac_aux_dir as well) can be either absolute or relative, +# depending on how configure is run. 
This is pretty annoying, since
+# it makes $ac_aux_dir quite unusable in subdirectories: in the top
+# source directory, any form will work fine, but in subdirectories a
+# relative path needs to be adjusted first.
+#
+# $ac_aux_dir/missing
+#    fails when called from a subdirectory if $ac_aux_dir is relative
+# $top_srcdir/$ac_aux_dir/missing
+#    fails if $ac_aux_dir is absolute,
+#    fails when called from a subdirectory in a VPATH build with
+#          a relative $ac_aux_dir
+#
+# The reason for the latter failure is that $top_srcdir and $ac_aux_dir
+# are both prefixed by $srcdir.  In an in-source build this is usually
+# harmless because $srcdir is '.', but things will break when you
+# start a VPATH build or use an absolute $srcdir.
+#
+# So we could use something similar to $top_srcdir/$ac_aux_dir/missing,
+# iff we strip the leading $srcdir from $ac_aux_dir.  That would be:
+#   am_aux_dir='\$(top_srcdir)/'`expr "$ac_aux_dir" : "$srcdir//*\(.*\)"`
+# and then we would define $MISSING as
+#   MISSING="\${SHELL} $am_aux_dir/missing"
+# This will work as long as MISSING is not called from configure, because
+# unfortunately $(top_srcdir) has no meaning in configure.
+# However there are other variables, like CC, which are often used in
+# configure, and could therefore not use this "fixed" $ac_aux_dir.
+#
+# Another solution, used here, is to always expand $ac_aux_dir to an
+# absolute PATH.  The drawback is that using absolute paths prevents a
+# configured tree from being moved without reconfiguration.
+
+AC_DEFUN([AM_AUX_DIR_EXPAND],
+[AC_REQUIRE([AC_CONFIG_AUX_DIR_DEFAULT])dnl
+# Expand $ac_aux_dir to an absolute path.
+am_aux_dir=`cd "$ac_aux_dir" && pwd`
+])
+
+# AM_CONDITIONAL                                            -*- Autoconf -*-
+
+# Copyright (C) 1997-2014 Free Software Foundation, Inc.
+#
+# This file is free software; the Free Software Foundation
+# gives unlimited permission to copy and/or distribute it,
+# with or without modifications, as long as this notice is preserved.
+
+# AM_CONDITIONAL(NAME, SHELL-CONDITION)
+# -------------------------------------
+# Define a conditional.
+AC_DEFUN([AM_CONDITIONAL],
+[AC_PREREQ([2.52])dnl
+ m4_if([$1], [TRUE],  [AC_FATAL([$0: invalid condition: $1])],
+       [$1], [FALSE], [AC_FATAL([$0: invalid condition: $1])])dnl
+AC_SUBST([$1_TRUE])dnl
+AC_SUBST([$1_FALSE])dnl
+_AM_SUBST_NOTMAKE([$1_TRUE])dnl
+_AM_SUBST_NOTMAKE([$1_FALSE])dnl
+m4_define([_AM_COND_VALUE_$1], [$2])dnl
+if $2; then
+  $1_TRUE=
+  $1_FALSE='#'
+else
+  $1_TRUE='#'
+  $1_FALSE=
+fi
+AC_CONFIG_COMMANDS_PRE(
+[if test -z "${$1_TRUE}" && test -z "${$1_FALSE}"; then
+  AC_MSG_ERROR([[conditional "$1" was never defined.
+Usually this means the macro was only invoked conditionally.]])
+fi])])
+
+# Copyright (C) 1999-2014 Free Software Foundation, Inc.
+#
+# This file is free software; the Free Software Foundation
+# gives unlimited permission to copy and/or distribute it,
+# with or without modifications, as long as this notice is preserved.
+
+
+# There are a few dirty hacks below to avoid letting 'AC_PROG_CC' be
+# written in clear, in which case automake, when reading aclocal.m4,
+# will think it sees a *use*, and therefore will trigger all its
+# C support machinery.  Also note that it means that autoscan, seeing
+# CC etc. in the Makefile, will ask for an AC_PROG_CC use...
+
+
+# _AM_DEPENDENCIES(NAME)
+# ----------------------
+# See how the compiler implements dependency checking.
+# NAME is "CC", "CXX", "OBJC", "OBJCXX", "UPC", or "GCJ".
+# We try a few techniques and use that to set a single cache variable.
+# +# We don't AC_REQUIRE the corresponding AC_PROG_CC since the latter was +# modified to invoke _AM_DEPENDENCIES(CC); we would have a circular +# dependency, and given that the user is not expected to run this macro, +# just rely on AC_PROG_CC. +AC_DEFUN([_AM_DEPENDENCIES], +[AC_REQUIRE([AM_SET_DEPDIR])dnl +AC_REQUIRE([AM_OUTPUT_DEPENDENCY_COMMANDS])dnl +AC_REQUIRE([AM_MAKE_INCLUDE])dnl +AC_REQUIRE([AM_DEP_TRACK])dnl + +m4_if([$1], [CC], [depcc="$CC" am_compiler_list=], + [$1], [CXX], [depcc="$CXX" am_compiler_list=], + [$1], [OBJC], [depcc="$OBJC" am_compiler_list='gcc3 gcc'], + [$1], [OBJCXX], [depcc="$OBJCXX" am_compiler_list='gcc3 gcc'], + [$1], [UPC], [depcc="$UPC" am_compiler_list=], + [$1], [GCJ], [depcc="$GCJ" am_compiler_list='gcc3 gcc'], + [depcc="$$1" am_compiler_list=]) + +AC_CACHE_CHECK([dependency style of $depcc], + [am_cv_$1_dependencies_compiler_type], +[if test -z "$AMDEP_TRUE" && test -f "$am_depcomp"; then + # We make a subdir and do the tests there. Otherwise we can end up + # making bogus files that we don't know about and never remove. For + # instance it was reported that on HP-UX the gcc test will end up + # making a dummy file named 'D' -- because '-MD' means "put the output + # in D". + rm -rf conftest.dir + mkdir conftest.dir + # Copy depcomp to subdir because otherwise we won't find it if we're + # using a relative directory. + cp "$am_depcomp" conftest.dir + cd conftest.dir + # We will build objects and dependencies in a subdirectory because + # it helps to detect inapplicable dependency modes. For instance + # both Tru64's cc and ICC support -MD to output dependencies as a + # side effect of compilation, but ICC will put the dependencies in + # the current directory while Tru64 will put them in the object + # directory. + mkdir sub + + am_cv_$1_dependencies_compiler_type=none + if test "$am_compiler_list" = ""; then + am_compiler_list=`sed -n ['s/^#*\([a-zA-Z0-9]*\))$/\1/p'] < ./depcomp` + fi + am__universal=false + m4_case([$1], [CC], + [case " $depcc " in #( + *\ -arch\ *\ -arch\ *) am__universal=true ;; + esac], + [CXX], + [case " $depcc " in #( + *\ -arch\ *\ -arch\ *) am__universal=true ;; + esac]) + + for depmode in $am_compiler_list; do + # Setup a source with many dependencies, because some compilers + # like to wrap large dependency lists on column 80 (with \), and + # we should not choose a depcomp mode which is confused by this. + # + # We need to recreate these files for each test, as the compiler may + # overwrite some of them when testing with obscure command lines. + # This happens at least with the AIX C compiler. + : > sub/conftest.c + for i in 1 2 3 4 5 6; do + echo '#include "conftst'$i'.h"' >> sub/conftest.c + # Using ": > sub/conftst$i.h" creates only sub/conftst1.h with + # Solaris 10 /bin/sh. + echo '/* dummy */' > sub/conftst$i.h + done + echo "${am__include} ${am__quote}sub/conftest.Po${am__quote}" > confmf + + # We check with '-c' and '-o' for the sake of the "dashmstdout" + # mode. It turns out that the SunPro C++ compiler does not properly + # handle '-M -o', and we need to detect this. Also, some Intel + # versions had trouble with output in subdirs. + am__obj=sub/conftest.${OBJEXT-o} + am__minus_obj="-o $am__obj" + case $depmode in + gcc) + # This depmode causes a compiler race in universal mode. + test "$am__universal" = false || continue + ;; + nosideeffect) + # After this tag, mechanisms are not by side-effect, so they'll + # only be used when explicitly requested. 
+ if test "x$enable_dependency_tracking" = xyes; then + continue + else + break + fi + ;; + msvc7 | msvc7msys | msvisualcpp | msvcmsys) + # This compiler won't grok '-c -o', but also, the minuso test has + # not run yet. These depmodes are late enough in the game, and + # so weak that their functioning should not be impacted. + am__obj=conftest.${OBJEXT-o} + am__minus_obj= + ;; + none) break ;; + esac + if depmode=$depmode \ + source=sub/conftest.c object=$am__obj \ + depfile=sub/conftest.Po tmpdepfile=sub/conftest.TPo \ + $SHELL ./depcomp $depcc -c $am__minus_obj sub/conftest.c \ + >/dev/null 2>conftest.err && + grep sub/conftst1.h sub/conftest.Po > /dev/null 2>&1 && + grep sub/conftst6.h sub/conftest.Po > /dev/null 2>&1 && + grep $am__obj sub/conftest.Po > /dev/null 2>&1 && + ${MAKE-make} -s -f confmf > /dev/null 2>&1; then + # icc doesn't choke on unknown options, it will just issue warnings + # or remarks (even with -Werror). So we grep stderr for any message + # that says an option was ignored or not supported. + # When given -MP, icc 7.0 and 7.1 complain thusly: + # icc: Command line warning: ignoring option '-M'; no argument required + # The diagnosis changed in icc 8.0: + # icc: Command line remark: option '-MP' not supported + if (grep 'ignoring option' conftest.err || + grep 'not supported' conftest.err) >/dev/null 2>&1; then :; else + am_cv_$1_dependencies_compiler_type=$depmode + break + fi + fi + done + + cd .. + rm -rf conftest.dir +else + am_cv_$1_dependencies_compiler_type=none +fi +]) +AC_SUBST([$1DEPMODE], [depmode=$am_cv_$1_dependencies_compiler_type]) +AM_CONDITIONAL([am__fastdep$1], [ + test "x$enable_dependency_tracking" != xno \ + && test "$am_cv_$1_dependencies_compiler_type" = gcc3]) +]) + + +# AM_SET_DEPDIR +# ------------- +# Choose a directory name for dependency files. +# This macro is AC_REQUIREd in _AM_DEPENDENCIES. +AC_DEFUN([AM_SET_DEPDIR], +[AC_REQUIRE([AM_SET_LEADING_DOT])dnl +AC_SUBST([DEPDIR], ["${am__leading_dot}deps"])dnl +]) + + +# AM_DEP_TRACK +# ------------ +AC_DEFUN([AM_DEP_TRACK], +[AC_ARG_ENABLE([dependency-tracking], [dnl +AS_HELP_STRING( + [--enable-dependency-tracking], + [do not reject slow dependency extractors]) +AS_HELP_STRING( + [--disable-dependency-tracking], + [speeds up one-time build])]) +if test "x$enable_dependency_tracking" != xno; then + am_depcomp="$ac_aux_dir/depcomp" + AMDEPBACKSLASH='\' + am__nodep='_no' +fi +AM_CONDITIONAL([AMDEP], [test "x$enable_dependency_tracking" != xno]) +AC_SUBST([AMDEPBACKSLASH])dnl +_AM_SUBST_NOTMAKE([AMDEPBACKSLASH])dnl +AC_SUBST([am__nodep])dnl +_AM_SUBST_NOTMAKE([am__nodep])dnl +]) + +# Generate code to set up dependency tracking. -*- Autoconf -*- + +# Copyright (C) 1999-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + + +# _AM_OUTPUT_DEPENDENCY_COMMANDS +# ------------------------------ +AC_DEFUN([_AM_OUTPUT_DEPENDENCY_COMMANDS], +[{ + # Older Autoconf quotes --file arguments for eval, but not when files + # are listed without --file. Let's play safe and only enable the eval + # if we detect the quoting. + case $CONFIG_FILES in + *\'*) eval set x "$CONFIG_FILES" ;; + *) set x $CONFIG_FILES ;; + esac + shift + for mf + do + # Strip MF so we end up with the name of the file. + mf=`echo "$mf" | sed -e 's/:.*$//'` + # Check whether this is an Automake generated Makefile or not. 
+ # We used to match only the files named 'Makefile.in', but + # some people rename them; so instead we look at the file content. + # Grep'ing the first line is not enough: some people post-process + # each Makefile.in and add a new line on top of each file to say so. + # Grep'ing the whole file is not good either: AIX grep has a line + # limit of 2048, but all sed's we know have understand at least 4000. + if sed -n 's,^#.*generated by automake.*,X,p' "$mf" | grep X >/dev/null 2>&1; then + dirpart=`AS_DIRNAME("$mf")` + else + continue + fi + # Extract the definition of DEPDIR, am__include, and am__quote + # from the Makefile without running 'make'. + DEPDIR=`sed -n 's/^DEPDIR = //p' < "$mf"` + test -z "$DEPDIR" && continue + am__include=`sed -n 's/^am__include = //p' < "$mf"` + test -z "$am__include" && continue + am__quote=`sed -n 's/^am__quote = //p' < "$mf"` + # Find all dependency output files, they are included files with + # $(DEPDIR) in their names. We invoke sed twice because it is the + # simplest approach to changing $(DEPDIR) to its actual value in the + # expansion. + for file in `sed -n " + s/^$am__include $am__quote\(.*(DEPDIR).*\)$am__quote"'$/\1/p' <"$mf" | \ + sed -e 's/\$(DEPDIR)/'"$DEPDIR"'/g'`; do + # Make sure the directory exists. + test -f "$dirpart/$file" && continue + fdir=`AS_DIRNAME(["$file"])` + AS_MKDIR_P([$dirpart/$fdir]) + # echo "creating $dirpart/$file" + echo '# dummy' > "$dirpart/$file" + done + done +} +])# _AM_OUTPUT_DEPENDENCY_COMMANDS + + +# AM_OUTPUT_DEPENDENCY_COMMANDS +# ----------------------------- +# This macro should only be invoked once -- use via AC_REQUIRE. +# +# This code is only required when automatic dependency tracking +# is enabled. FIXME. This creates each '.P' file that we will +# need in order to bootstrap the dependency handling code. +AC_DEFUN([AM_OUTPUT_DEPENDENCY_COMMANDS], +[AC_CONFIG_COMMANDS([depfiles], + [test x"$AMDEP_TRUE" != x"" || _AM_OUTPUT_DEPENDENCY_COMMANDS], + [AMDEP_TRUE="$AMDEP_TRUE" ac_aux_dir="$ac_aux_dir"]) +]) + +# Do all the work for Automake. -*- Autoconf -*- + +# Copyright (C) 1996-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# This macro actually does too much. Some checks are only needed if +# your package does certain things. But this isn't really a big deal. + +dnl Redefine AC_PROG_CC to automatically invoke _AM_PROG_CC_C_O. +m4_define([AC_PROG_CC], +m4_defn([AC_PROG_CC]) +[_AM_PROG_CC_C_O +]) + +# AM_INIT_AUTOMAKE(PACKAGE, VERSION, [NO-DEFINE]) +# AM_INIT_AUTOMAKE([OPTIONS]) +# ----------------------------------------------- +# The call with PACKAGE and VERSION arguments is the old style +# call (pre autoconf-2.50), which is being phased out. PACKAGE +# and VERSION should now be passed to AC_INIT and removed from +# the call to AM_INIT_AUTOMAKE. +# We support both call styles for the transition. After +# the next Automake release, Autoconf can make the AC_INIT +# arguments mandatory, and then we can depend on a new Autoconf +# release and drop the old call support. +AC_DEFUN([AM_INIT_AUTOMAKE], +[AC_PREREQ([2.65])dnl +dnl Autoconf wants to disallow AM_ names. We explicitly allow +dnl the ones we care about. 
+m4_pattern_allow([^AM_[A-Z]+FLAGS$])dnl +AC_REQUIRE([AM_SET_CURRENT_AUTOMAKE_VERSION])dnl +AC_REQUIRE([AC_PROG_INSTALL])dnl +if test "`cd $srcdir && pwd`" != "`pwd`"; then + # Use -I$(srcdir) only when $(srcdir) != ., so that make's output + # is not polluted with repeated "-I." + AC_SUBST([am__isrc], [' -I$(srcdir)'])_AM_SUBST_NOTMAKE([am__isrc])dnl + # test to see if srcdir already configured + if test -f $srcdir/config.status; then + AC_MSG_ERROR([source directory already configured; run "make distclean" there first]) + fi +fi + +# test whether we have cygpath +if test -z "$CYGPATH_W"; then + if (cygpath --version) >/dev/null 2>/dev/null; then + CYGPATH_W='cygpath -w' + else + CYGPATH_W=echo + fi +fi +AC_SUBST([CYGPATH_W]) + +# Define the identity of the package. +dnl Distinguish between old-style and new-style calls. +m4_ifval([$2], +[AC_DIAGNOSE([obsolete], + [$0: two- and three-arguments forms are deprecated.]) +m4_ifval([$3], [_AM_SET_OPTION([no-define])])dnl + AC_SUBST([PACKAGE], [$1])dnl + AC_SUBST([VERSION], [$2])], +[_AM_SET_OPTIONS([$1])dnl +dnl Diagnose old-style AC_INIT with new-style AM_AUTOMAKE_INIT. +m4_if( + m4_ifdef([AC_PACKAGE_NAME], [ok]):m4_ifdef([AC_PACKAGE_VERSION], [ok]), + [ok:ok],, + [m4_fatal([AC_INIT should be called with package and version arguments])])dnl + AC_SUBST([PACKAGE], ['AC_PACKAGE_TARNAME'])dnl + AC_SUBST([VERSION], ['AC_PACKAGE_VERSION'])])dnl + +_AM_IF_OPTION([no-define],, +[AC_DEFINE_UNQUOTED([PACKAGE], ["$PACKAGE"], [Name of package]) + AC_DEFINE_UNQUOTED([VERSION], ["$VERSION"], [Version number of package])])dnl + +# Some tools Automake needs. +AC_REQUIRE([AM_SANITY_CHECK])dnl +AC_REQUIRE([AC_ARG_PROGRAM])dnl +AM_MISSING_PROG([ACLOCAL], [aclocal-${am__api_version}]) +AM_MISSING_PROG([AUTOCONF], [autoconf]) +AM_MISSING_PROG([AUTOMAKE], [automake-${am__api_version}]) +AM_MISSING_PROG([AUTOHEADER], [autoheader]) +AM_MISSING_PROG([MAKEINFO], [makeinfo]) +AC_REQUIRE([AM_PROG_INSTALL_SH])dnl +AC_REQUIRE([AM_PROG_INSTALL_STRIP])dnl +AC_REQUIRE([AC_PROG_MKDIR_P])dnl +# For better backward compatibility. To be removed once Automake 1.9.x +# dies out for good. For more background, see: +# +# +AC_SUBST([mkdir_p], ['$(MKDIR_P)']) +# We need awk for the "check" target (and possibly the TAP driver). The +# system "awk" is bad on some platforms. +AC_REQUIRE([AC_PROG_AWK])dnl +AC_REQUIRE([AC_PROG_MAKE_SET])dnl +AC_REQUIRE([AM_SET_LEADING_DOT])dnl +_AM_IF_OPTION([tar-ustar], [_AM_PROG_TAR([ustar])], + [_AM_IF_OPTION([tar-pax], [_AM_PROG_TAR([pax])], + [_AM_PROG_TAR([v7])])]) +_AM_IF_OPTION([no-dependencies],, +[AC_PROVIDE_IFELSE([AC_PROG_CC], + [_AM_DEPENDENCIES([CC])], + [m4_define([AC_PROG_CC], + m4_defn([AC_PROG_CC])[_AM_DEPENDENCIES([CC])])])dnl +AC_PROVIDE_IFELSE([AC_PROG_CXX], + [_AM_DEPENDENCIES([CXX])], + [m4_define([AC_PROG_CXX], + m4_defn([AC_PROG_CXX])[_AM_DEPENDENCIES([CXX])])])dnl +AC_PROVIDE_IFELSE([AC_PROG_OBJC], + [_AM_DEPENDENCIES([OBJC])], + [m4_define([AC_PROG_OBJC], + m4_defn([AC_PROG_OBJC])[_AM_DEPENDENCIES([OBJC])])])dnl +AC_PROVIDE_IFELSE([AC_PROG_OBJCXX], + [_AM_DEPENDENCIES([OBJCXX])], + [m4_define([AC_PROG_OBJCXX], + m4_defn([AC_PROG_OBJCXX])[_AM_DEPENDENCIES([OBJCXX])])])dnl +]) +AC_REQUIRE([AM_SILENT_RULES])dnl +dnl The testsuite driver may need to know about EXEEXT, so add the +dnl 'am__EXEEXT' conditional if _AM_COMPILER_EXEEXT was seen. This +dnl macro is hooked onto _AC_COMPILER_EXEEXT early, see below. 
+AC_CONFIG_COMMANDS_PRE(dnl +[m4_provide_if([_AM_COMPILER_EXEEXT], + [AM_CONDITIONAL([am__EXEEXT], [test -n "$EXEEXT"])])])dnl + +# POSIX will say in a future version that running "rm -f" with no argument +# is OK; and we want to be able to make that assumption in our Makefile +# recipes. So use an aggressive probe to check that the usage we want is +# actually supported "in the wild" to an acceptable degree. +# See automake bug#10828. +# To make any issue more visible, cause the running configure to be aborted +# by default if the 'rm' program in use doesn't match our expectations; the +# user can still override this though. +if rm -f && rm -fr && rm -rf; then : OK; else + cat >&2 <<'END' +Oops! + +Your 'rm' program seems unable to run without file operands specified +on the command line, even when the '-f' option is present. This is contrary +to the behaviour of most rm programs out there, and not conforming with +the upcoming POSIX standard: + +Please tell bug-automake@gnu.org about your system, including the value +of your $PATH and any error possibly output before this message. This +can help us improve future automake versions. + +END + if test x"$ACCEPT_INFERIOR_RM_PROGRAM" = x"yes"; then + echo 'Configuration will proceed anyway, since you have set the' >&2 + echo 'ACCEPT_INFERIOR_RM_PROGRAM variable to "yes"' >&2 + echo >&2 + else + cat >&2 <<'END' +Aborting the configuration process, to ensure you take notice of the issue. + +You can download and install GNU coreutils to get an 'rm' implementation +that behaves properly: . + +If you want to complete the configuration process using your problematic +'rm' anyway, export the environment variable ACCEPT_INFERIOR_RM_PROGRAM +to "yes", and re-run configure. + +END + AC_MSG_ERROR([Your 'rm' program is bad, sorry.]) + fi +fi +dnl The trailing newline in this macro's definition is deliberate, for +dnl backward compatibility and to allow trailing 'dnl'-style comments +dnl after the AM_INIT_AUTOMAKE invocation. See automake bug#16841. +]) + +dnl Hook into '_AC_COMPILER_EXEEXT' early to learn its expansion. Do not +dnl add the conditional right here, as _AC_COMPILER_EXEEXT may be further +dnl mangled by Autoconf and run in a shell conditional statement. +m4_define([_AC_COMPILER_EXEEXT], +m4_defn([_AC_COMPILER_EXEEXT])[m4_provide([_AM_COMPILER_EXEEXT])]) + +# When config.status generates a header, we must update the stamp-h file. +# This file resides in the same directory as the config header +# that is generated. The stamp files are numbered to have different names. + +# Autoconf calls _AC_AM_CONFIG_HEADER_HOOK (when defined) in the +# loop where config.status creates the headers, so we can generate +# our stamp files there. +AC_DEFUN([_AC_AM_CONFIG_HEADER_HOOK], +[# Compute $1's index in $config_headers. +_am_arg=$1 +_am_stamp_count=1 +for _am_header in $config_headers :; do + case $_am_header in + $_am_arg | $_am_arg:* ) + break ;; + * ) + _am_stamp_count=`expr $_am_stamp_count + 1` ;; + esac +done +echo "timestamp for $_am_arg" >`AS_DIRNAME(["$_am_arg"])`/stamp-h[]$_am_stamp_count]) + +# Copyright (C) 2001-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_PROG_INSTALL_SH +# ------------------ +# Define $install_sh. 
+AC_DEFUN([AM_PROG_INSTALL_SH], +[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl +if test x"${install_sh+set}" != xset; then + case $am_aux_dir in + *\ * | *\ *) + install_sh="\${SHELL} '$am_aux_dir/install-sh'" ;; + *) + install_sh="\${SHELL} $am_aux_dir/install-sh" + esac +fi +AC_SUBST([install_sh])]) + +# Copyright (C) 2003-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# Check whether the underlying file-system supports filenames +# with a leading dot. For instance MS-DOS doesn't. +AC_DEFUN([AM_SET_LEADING_DOT], +[rm -rf .tst 2>/dev/null +mkdir .tst 2>/dev/null +if test -d .tst; then + am__leading_dot=. +else + am__leading_dot=_ +fi +rmdir .tst 2>/dev/null +AC_SUBST([am__leading_dot])]) + +# Check to see how 'make' treats includes. -*- Autoconf -*- + +# Copyright (C) 2001-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_MAKE_INCLUDE() +# ----------------- +# Check to see how make treats includes. +AC_DEFUN([AM_MAKE_INCLUDE], +[am_make=${MAKE-make} +cat > confinc << 'END' +am__doit: + @echo this is the am__doit target +.PHONY: am__doit +END +# If we don't find an include directive, just comment out the code. +AC_MSG_CHECKING([for style of include used by $am_make]) +am__include="#" +am__quote= +_am_result=none +# First try GNU make style include. +echo "include confinc" > confmf +# Ignore all kinds of additional output from 'make'. +case `$am_make -s -f confmf 2> /dev/null` in #( +*the\ am__doit\ target*) + am__include=include + am__quote= + _am_result=GNU + ;; +esac +# Now try BSD make style include. +if test "$am__include" = "#"; then + echo '.include "confinc"' > confmf + case `$am_make -s -f confmf 2> /dev/null` in #( + *the\ am__doit\ target*) + am__include=.include + am__quote="\"" + _am_result=BSD + ;; + esac +fi +AC_SUBST([am__include]) +AC_SUBST([am__quote]) +AC_MSG_RESULT([$_am_result]) +rm -f confinc confmf +]) + +# Fake the existence of programs that GNU maintainers use. -*- Autoconf -*- + +# Copyright (C) 1997-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_MISSING_PROG(NAME, PROGRAM) +# ------------------------------ +AC_DEFUN([AM_MISSING_PROG], +[AC_REQUIRE([AM_MISSING_HAS_RUN]) +$1=${$1-"${am_missing_run}$2"} +AC_SUBST($1)]) + +# AM_MISSING_HAS_RUN +# ------------------ +# Define MISSING if not defined so far and test if it is modern enough. +# If it is, set am_missing_run to use it, otherwise, to nothing. +AC_DEFUN([AM_MISSING_HAS_RUN], +[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl +AC_REQUIRE_AUX_FILE([missing])dnl +if test x"${MISSING+set}" != xset; then + case $am_aux_dir in + *\ * | *\ *) + MISSING="\${SHELL} \"$am_aux_dir/missing\"" ;; + *) + MISSING="\${SHELL} $am_aux_dir/missing" ;; + esac +fi +# Use eval to expand $SHELL +if eval "$MISSING --is-lightweight"; then + am_missing_run="$MISSING " +else + am_missing_run= + AC_MSG_WARN(['missing' script is too old or missing]) +fi +]) + +# Helper functions for option handling. -*- Autoconf -*- + +# Copyright (C) 2001-2014 Free Software Foundation, Inc. 
+# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# _AM_MANGLE_OPTION(NAME) +# ----------------------- +AC_DEFUN([_AM_MANGLE_OPTION], +[[_AM_OPTION_]m4_bpatsubst($1, [[^a-zA-Z0-9_]], [_])]) + +# _AM_SET_OPTION(NAME) +# -------------------- +# Set option NAME. Presently that only means defining a flag for this option. +AC_DEFUN([_AM_SET_OPTION], +[m4_define(_AM_MANGLE_OPTION([$1]), [1])]) + +# _AM_SET_OPTIONS(OPTIONS) +# ------------------------ +# OPTIONS is a space-separated list of Automake options. +AC_DEFUN([_AM_SET_OPTIONS], +[m4_foreach_w([_AM_Option], [$1], [_AM_SET_OPTION(_AM_Option)])]) + +# _AM_IF_OPTION(OPTION, IF-SET, [IF-NOT-SET]) +# ------------------------------------------- +# Execute IF-SET if OPTION is set, IF-NOT-SET otherwise. +AC_DEFUN([_AM_IF_OPTION], +[m4_ifset(_AM_MANGLE_OPTION([$1]), [$2], [$3])]) + +# Copyright (C) 1999-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# _AM_PROG_CC_C_O +# --------------- +# Like AC_PROG_CC_C_O, but changed for automake. We rewrite AC_PROG_CC +# to automatically call this. +AC_DEFUN([_AM_PROG_CC_C_O], +[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl +AC_REQUIRE_AUX_FILE([compile])dnl +AC_LANG_PUSH([C])dnl +AC_CACHE_CHECK( + [whether $CC understands -c and -o together], + [am_cv_prog_cc_c_o], + [AC_LANG_CONFTEST([AC_LANG_PROGRAM([])]) + # Make sure it works both with $CC and with simple cc. + # Following AC_PROG_CC_C_O, we do the test twice because some + # compilers refuse to overwrite an existing .o file with -o, + # though they will create one. + am_cv_prog_cc_c_o=yes + for am_i in 1 2; do + if AM_RUN_LOG([$CC -c conftest.$ac_ext -o conftest2.$ac_objext]) \ + && test -f conftest2.$ac_objext; then + : OK + else + am_cv_prog_cc_c_o=no + break + fi + done + rm -f core conftest* + unset am_i]) +if test "$am_cv_prog_cc_c_o" != yes; then + # Losing compiler, so override with the script. + # FIXME: It is wrong to rewrite CC. + # But if we don't then we get into trouble of one sort or another. + # A longer-term fix would be to have automake use am__CC in this case, + # and then we could set am__CC="\$(top_srcdir)/compile \$(CC)" + CC="$am_aux_dir/compile $CC" +fi +AC_LANG_POP([C])]) + +# For backward compatibility. +AC_DEFUN_ONCE([AM_PROG_CC_C_O], [AC_REQUIRE([AC_PROG_CC])]) + +# Copyright (C) 2001-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_RUN_LOG(COMMAND) +# ------------------- +# Run COMMAND, save the exit status in ac_status, and log it. +# (This has been adapted from Autoconf's _AC_RUN_LOG macro.) +AC_DEFUN([AM_RUN_LOG], +[{ echo "$as_me:$LINENO: $1" >&AS_MESSAGE_LOG_FD + ($1) >&AS_MESSAGE_LOG_FD 2>&AS_MESSAGE_LOG_FD + ac_status=$? + echo "$as_me:$LINENO: \$? = $ac_status" >&AS_MESSAGE_LOG_FD + (exit $ac_status); }]) + +# Check to make sure that the build environment is sane. -*- Autoconf -*- + +# Copyright (C) 1996-2014 Free Software Foundation, Inc. 
+# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_SANITY_CHECK +# --------------- +AC_DEFUN([AM_SANITY_CHECK], +[AC_MSG_CHECKING([whether build environment is sane]) +# Reject unsafe characters in $srcdir or the absolute working directory +# name. Accept space and tab only in the latter. +am_lf=' +' +case `pwd` in + *[[\\\"\#\$\&\'\`$am_lf]]*) + AC_MSG_ERROR([unsafe absolute working directory name]);; +esac +case $srcdir in + *[[\\\"\#\$\&\'\`$am_lf\ \ ]]*) + AC_MSG_ERROR([unsafe srcdir value: '$srcdir']);; +esac + +# Do 'set' in a subshell so we don't clobber the current shell's +# arguments. Must try -L first in case configure is actually a +# symlink; some systems play weird games with the mod time of symlinks +# (eg FreeBSD returns the mod time of the symlink's containing +# directory). +if ( + am_has_slept=no + for am_try in 1 2; do + echo "timestamp, slept: $am_has_slept" > conftest.file + set X `ls -Lt "$srcdir/configure" conftest.file 2> /dev/null` + if test "$[*]" = "X"; then + # -L didn't work. + set X `ls -t "$srcdir/configure" conftest.file` + fi + if test "$[*]" != "X $srcdir/configure conftest.file" \ + && test "$[*]" != "X conftest.file $srcdir/configure"; then + + # If neither matched, then we have a broken ls. This can happen + # if, for instance, CONFIG_SHELL is bash and it inherits a + # broken ls alias from the environment. This has actually + # happened. Such a system could not be considered "sane". + AC_MSG_ERROR([ls -t appears to fail. Make sure there is not a broken + alias in your environment]) + fi + if test "$[2]" = conftest.file || test $am_try -eq 2; then + break + fi + # Just in case. + sleep 1 + am_has_slept=yes + done + test "$[2]" = conftest.file + ) +then + # Ok. + : +else + AC_MSG_ERROR([newly created file is older than distributed files! +Check your system clock]) +fi +AC_MSG_RESULT([yes]) +# If we didn't sleep, we still need to ensure time stamps of config.status and +# generated files are strictly newer. +am_sleep_pid= +if grep 'slept: no' conftest.file >/dev/null 2>&1; then + ( sleep 1 ) & + am_sleep_pid=$! +fi +AC_CONFIG_COMMANDS_PRE( + [AC_MSG_CHECKING([that generated files are newer than configure]) + if test -n "$am_sleep_pid"; then + # Hide warnings about reused PIDs. + wait $am_sleep_pid 2>/dev/null + fi + AC_MSG_RESULT([done])]) +rm -f conftest.file +]) + +# Copyright (C) 2009-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_SILENT_RULES([DEFAULT]) +# -------------------------- +# Enable less verbose build rules; with the default set to DEFAULT +# ("yes" being less verbose, "no" or empty being verbose). +AC_DEFUN([AM_SILENT_RULES], +[AC_ARG_ENABLE([silent-rules], [dnl +AS_HELP_STRING( + [--enable-silent-rules], + [less verbose build output (undo: "make V=1")]) +AS_HELP_STRING( + [--disable-silent-rules], + [verbose build output (undo: "make V=0")])dnl +]) +case $enable_silent_rules in @%:@ ((( + yes) AM_DEFAULT_VERBOSITY=0;; + no) AM_DEFAULT_VERBOSITY=1;; + *) AM_DEFAULT_VERBOSITY=m4_if([$1], [yes], [0], [1]);; +esac +dnl +dnl A few 'make' implementations (e.g., NonStop OS and NextStep) +dnl do not support nested variable expansions. +dnl See automake bug#9928 and bug#10237. 
+am_make=${MAKE-make} +AC_CACHE_CHECK([whether $am_make supports nested variables], + [am_cv_make_support_nested_variables], + [if AS_ECHO([['TRUE=$(BAR$(V)) +BAR0=false +BAR1=true +V=1 +am__doit: + @$(TRUE) +.PHONY: am__doit']]) | $am_make -f - >/dev/null 2>&1; then + am_cv_make_support_nested_variables=yes +else + am_cv_make_support_nested_variables=no +fi]) +if test $am_cv_make_support_nested_variables = yes; then + dnl Using '$V' instead of '$(V)' breaks IRIX make. + AM_V='$(V)' + AM_DEFAULT_V='$(AM_DEFAULT_VERBOSITY)' +else + AM_V=$AM_DEFAULT_VERBOSITY + AM_DEFAULT_V=$AM_DEFAULT_VERBOSITY +fi +AC_SUBST([AM_V])dnl +AM_SUBST_NOTMAKE([AM_V])dnl +AC_SUBST([AM_DEFAULT_V])dnl +AM_SUBST_NOTMAKE([AM_DEFAULT_V])dnl +AC_SUBST([AM_DEFAULT_VERBOSITY])dnl +AM_BACKSLASH='\' +AC_SUBST([AM_BACKSLASH])dnl +_AM_SUBST_NOTMAKE([AM_BACKSLASH])dnl +]) + +# Copyright (C) 2001-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_PROG_INSTALL_STRIP +# --------------------- +# One issue with vendor 'install' (even GNU) is that you can't +# specify the program used to strip binaries. This is especially +# annoying in cross-compiling environments, where the build's strip +# is unlikely to handle the host's binaries. +# Fortunately install-sh will honor a STRIPPROG variable, so we +# always use install-sh in "make install-strip", and initialize +# STRIPPROG with the value of the STRIP variable (set by the user). +AC_DEFUN([AM_PROG_INSTALL_STRIP], +[AC_REQUIRE([AM_PROG_INSTALL_SH])dnl +# Installed binaries are usually stripped using 'strip' when the user +# run "make install-strip". However 'strip' might not be the right +# tool to use in cross-compilation environments, therefore Automake +# will honor the 'STRIP' environment variable to overrule this program. +dnl Don't test for $cross_compiling = yes, because it might be 'maybe'. +if test "$cross_compiling" != no; then + AC_CHECK_TOOL([STRIP], [strip], :) +fi +INSTALL_STRIP_PROGRAM="\$(install_sh) -c -s" +AC_SUBST([INSTALL_STRIP_PROGRAM])]) + +# Copyright (C) 2006-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# _AM_SUBST_NOTMAKE(VARIABLE) +# --------------------------- +# Prevent Automake from outputting VARIABLE = @VARIABLE@ in Makefile.in. +# This macro is traced by Automake. +AC_DEFUN([_AM_SUBST_NOTMAKE]) + +# AM_SUBST_NOTMAKE(VARIABLE) +# -------------------------- +# Public sister of _AM_SUBST_NOTMAKE. +AC_DEFUN([AM_SUBST_NOTMAKE], [_AM_SUBST_NOTMAKE($@)]) + +# Check how to create a tarball. -*- Autoconf -*- + +# Copyright (C) 2004-2014 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# _AM_PROG_TAR(FORMAT) +# -------------------- +# Check how to create a tarball in format FORMAT. +# FORMAT should be one of 'v7', 'ustar', or 'pax'. +# +# Substitute a variable $(am__tar) that is a command +# writing to stdout a FORMAT-tarball containing the directory +# $tardir. +# tardir=directory && $(am__tar) > result.tar +# +# Substitute a variable $(am__untar) that extract such +# a tarball read from stdin. 
+# $(am__untar) < result.tar +# +AC_DEFUN([_AM_PROG_TAR], +[# Always define AMTAR for backward compatibility. Yes, it's still used +# in the wild :-( We should find a proper way to deprecate it ... +AC_SUBST([AMTAR], ['$${TAR-tar}']) + +# We'll loop over all known methods to create a tar archive until one works. +_am_tools='gnutar m4_if([$1], [ustar], [plaintar]) pax cpio none' + +m4_if([$1], [v7], + [am__tar='$${TAR-tar} chof - "$$tardir"' am__untar='$${TAR-tar} xf -'], + + [m4_case([$1], + [ustar], + [# The POSIX 1988 'ustar' format is defined with fixed-size fields. + # There is notably a 21 bits limit for the UID and the GID. In fact, + # the 'pax' utility can hang on bigger UID/GID (see automake bug#8343 + # and bug#13588). + am_max_uid=2097151 # 2^21 - 1 + am_max_gid=$am_max_uid + # The $UID and $GID variables are not portable, so we need to resort + # to the POSIX-mandated id(1) utility. Errors in the 'id' calls + # below are definitely unexpected, so allow the users to see them + # (that is, avoid stderr redirection). + am_uid=`id -u || echo unknown` + am_gid=`id -g || echo unknown` + AC_MSG_CHECKING([whether UID '$am_uid' is supported by ustar format]) + if test $am_uid -le $am_max_uid; then + AC_MSG_RESULT([yes]) + else + AC_MSG_RESULT([no]) + _am_tools=none + fi + AC_MSG_CHECKING([whether GID '$am_gid' is supported by ustar format]) + if test $am_gid -le $am_max_gid; then + AC_MSG_RESULT([yes]) + else + AC_MSG_RESULT([no]) + _am_tools=none + fi], + + [pax], + [], + + [m4_fatal([Unknown tar format])]) + + AC_MSG_CHECKING([how to create a $1 tar archive]) + + # Go ahead even if we have the value already cached. We do so because we + # need to set the values for the 'am__tar' and 'am__untar' variables. + _am_tools=${am_cv_prog_tar_$1-$_am_tools} + + for _am_tool in $_am_tools; do + case $_am_tool in + gnutar) + for _am_tar in tar gnutar gtar; do + AM_RUN_LOG([$_am_tar --version]) && break + done + am__tar="$_am_tar --format=m4_if([$1], [pax], [posix], [$1]) -chf - "'"$$tardir"' + am__tar_="$_am_tar --format=m4_if([$1], [pax], [posix], [$1]) -chf - "'"$tardir"' + am__untar="$_am_tar -xf -" + ;; + plaintar) + # Must skip GNU tar: if it does not support --format= it doesn't create + # ustar tarball either. + (tar --version) >/dev/null 2>&1 && continue + am__tar='tar chf - "$$tardir"' + am__tar_='tar chf - "$tardir"' + am__untar='tar xf -' + ;; + pax) + am__tar='pax -L -x $1 -w "$$tardir"' + am__tar_='pax -L -x $1 -w "$tardir"' + am__untar='pax -r' + ;; + cpio) + am__tar='find "$$tardir" -print | cpio -o -H $1 -L' + am__tar_='find "$tardir" -print | cpio -o -H $1 -L' + am__untar='cpio -i -H $1 -d' + ;; + none) + am__tar=false + am__tar_=false + am__untar=false + ;; + esac + + # If the value was cached, stop now. We just wanted to have am__tar + # and am__untar set. + test -n "${am_cv_prog_tar_$1}" && break + + # tar/untar a dummy directory, and stop if the command works. 
+    rm -rf conftest.dir
+    mkdir conftest.dir
+    echo GrepMe > conftest.dir/file
+    AM_RUN_LOG([tardir=conftest.dir && eval $am__tar_ >conftest.tar])
+    rm -rf conftest.dir
+    if test -s conftest.tar; then
+      AM_RUN_LOG([$am__untar <conftest.tar])
+      AM_RUN_LOG([cat conftest.dir/file])
+      grep GrepMe conftest.dir/file >/dev/null 2>&1 && break
+    fi
+  done
+  rm -rf conftest.dir
+
+  AC_CACHE_VAL([am_cv_prog_tar_$1], [am_cv_prog_tar_$1=$_am_tool])
+  AC_MSG_RESULT([$am_cv_prog_tar_$1])])
+
+AC_SUBST([am__tar])
+AC_SUBST([am__untar])
+]) # _AM_PROG_TAR
+
+m4_include([m4/libtool.m4])
+m4_include([m4/ltoptions.m4])
+m4_include([m4/ltsugar.m4])
+m4_include([m4/ltversion.m4])
+m4_include([m4/lt~obsolete.m4])
+m4_include([m4/pkg.m4])
diff --git a/compat/Makefile.am b/compat/Makefile.am
new file mode 100644
index 0000000..010b4aa
--- /dev/null
+++ b/compat/Makefile.am
@@ -0,0 +1,6 @@
+noinst_LTLIBRARIES = compat.la
+
+compat_la_SOURCES = strndup.c
+compat_la_CPPFLAGS = -I$(top_srcdir) $(PTHREADS_CFLAGS) $(RSRT_CFLAGS)
+compat_la_LDFLAGS = -module -avoid-version
+compat_la_LIBADD = $(IMUDP_LIBS)
diff --git a/compat/Makefile.in b/compat/Makefile.in
new file mode 100644
index 0000000..eca9b22
--- /dev/null
+++ b/compat/Makefile.in
@@ -0,0 +1,614 @@
+# Makefile.in generated by automake 1.15 from Makefile.am.
+# @configure_input@
+
+# Copyright (C) 1994-2014 Free Software Foundation, Inc.
+
+# This Makefile.in is free software; the Free Software Foundation
+# gives unlimited permission to copy and/or distribute it,
+# with or without modifications, as long as this notice is preserved.
+
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY, to the extent permitted by law; without
+# even the implied warranty of MERCHANTABILITY or FITNESS FOR A
+# PARTICULAR PURPOSE.
+
+@SET_MAKE@
+
+VPATH = @srcdir@
+am__is_gnu_make = { \
+  if test -z '$(MAKELEVEL)'; then \
+    false; \
+  elif test -n '$(MAKE_HOST)'; then \
+    true; \
+  elif test -n '$(MAKE_VERSION)' && test -n '$(CURDIR)'; then \
+    true; \
+  else \
+    false; \
+  fi; \
+}
+am__make_running_with_option = \
+  case $${target_option-} in \
+      ?)
;; \ + *) echo "am__make_running_with_option: internal error: invalid" \ + "target option '$${target_option-}' specified" >&2; \ + exit 1;; \ + esac; \ + has_opt=no; \ + sane_makeflags=$$MAKEFLAGS; \ + if $(am__is_gnu_make); then \ + sane_makeflags=$$MFLAGS; \ + else \ + case $$MAKEFLAGS in \ + *\\[\ \ ]*) \ + bs=\\; \ + sane_makeflags=`printf '%s\n' "$$MAKEFLAGS" \ + | sed "s/$$bs$$bs[$$bs $$bs ]*//g"`;; \ + esac; \ + fi; \ + skip_next=no; \ + strip_trailopt () \ + { \ + flg=`printf '%s\n' "$$flg" | sed "s/$$1.*$$//"`; \ + }; \ + for flg in $$sane_makeflags; do \ + test $$skip_next = yes && { skip_next=no; continue; }; \ + case $$flg in \ + *=*|--*) continue;; \ + -*I) strip_trailopt 'I'; skip_next=yes;; \ + -*I?*) strip_trailopt 'I';; \ + -*O) strip_trailopt 'O'; skip_next=yes;; \ + -*O?*) strip_trailopt 'O';; \ + -*l) strip_trailopt 'l'; skip_next=yes;; \ + -*l?*) strip_trailopt 'l';; \ + -[dEDm]) skip_next=yes;; \ + -[JT]) skip_next=yes;; \ + esac; \ + case $$flg in \ + *$$target_option*) has_opt=yes; break;; \ + esac; \ + done; \ + test $$has_opt = yes +am__make_dryrun = (target_option=n; $(am__make_running_with_option)) +am__make_keepgoing = (target_option=k; $(am__make_running_with_option)) +pkgdatadir = $(datadir)/@PACKAGE@ +pkgincludedir = $(includedir)/@PACKAGE@ +pkglibdir = $(libdir)/@PACKAGE@ +pkglibexecdir = $(libexecdir)/@PACKAGE@ +am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd +install_sh_DATA = $(install_sh) -c -m 644 +install_sh_PROGRAM = $(install_sh) -c +install_sh_SCRIPT = $(install_sh) -c +INSTALL_HEADER = $(INSTALL_DATA) +transform = $(program_transform_name) +NORMAL_INSTALL = : +PRE_INSTALL = : +POST_INSTALL = : +NORMAL_UNINSTALL = : +PRE_UNINSTALL = : +POST_UNINSTALL = : +build_triplet = @build@ +host_triplet = @host@ +subdir = compat +ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 +am__aclocal_m4_deps = $(top_srcdir)/m4/libtool.m4 \ + $(top_srcdir)/m4/ltoptions.m4 $(top_srcdir)/m4/ltsugar.m4 \ + $(top_srcdir)/m4/ltversion.m4 $(top_srcdir)/m4/lt~obsolete.m4 \ + $(top_srcdir)/m4/pkg.m4 $(top_srcdir)/configure.ac +am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \ + $(ACLOCAL_M4) +DIST_COMMON = $(srcdir)/Makefile.am $(am__DIST_COMMON) +mkinstalldirs = $(install_sh) -d +CONFIG_HEADER = $(top_builddir)/config.h +CONFIG_CLEAN_FILES = +CONFIG_CLEAN_VPATH_FILES = +LTLIBRARIES = $(noinst_LTLIBRARIES) +compat_la_DEPENDENCIES = +am_compat_la_OBJECTS = compat_la-strndup.lo +compat_la_OBJECTS = $(am_compat_la_OBJECTS) +AM_V_lt = $(am__v_lt_@AM_V@) +am__v_lt_ = $(am__v_lt_@AM_DEFAULT_V@) +am__v_lt_0 = --silent +am__v_lt_1 = +compat_la_LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \ + $(compat_la_LDFLAGS) $(LDFLAGS) -o $@ +AM_V_P = $(am__v_P_@AM_V@) +am__v_P_ = $(am__v_P_@AM_DEFAULT_V@) +am__v_P_0 = false +am__v_P_1 = : +AM_V_GEN = $(am__v_GEN_@AM_V@) +am__v_GEN_ = $(am__v_GEN_@AM_DEFAULT_V@) +am__v_GEN_0 = @echo " GEN " $@; +am__v_GEN_1 = +AM_V_at = $(am__v_at_@AM_V@) +am__v_at_ = $(am__v_at_@AM_DEFAULT_V@) +am__v_at_0 = @ +am__v_at_1 = +DEFAULT_INCLUDES = -I.@am__isrc@ -I$(top_builddir) +depcomp = $(SHELL) $(top_srcdir)/depcomp +am__depfiles_maybe = depfiles +am__mv = mv -f +COMPILE = $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) \ + $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) +LTCOMPILE = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) \ + $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) \ + $(AM_CFLAGS) $(CFLAGS) 
+AM_V_CC = $(am__v_CC_@AM_V@) +am__v_CC_ = $(am__v_CC_@AM_DEFAULT_V@) +am__v_CC_0 = @echo " CC " $@; +am__v_CC_1 = +CCLD = $(CC) +LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \ + $(AM_LDFLAGS) $(LDFLAGS) -o $@ +AM_V_CCLD = $(am__v_CCLD_@AM_V@) +am__v_CCLD_ = $(am__v_CCLD_@AM_DEFAULT_V@) +am__v_CCLD_0 = @echo " CCLD " $@; +am__v_CCLD_1 = +SOURCES = $(compat_la_SOURCES) +DIST_SOURCES = $(compat_la_SOURCES) +am__can_run_installinfo = \ + case $$AM_UPDATE_INFO_DIR in \ + n|no|NO) false;; \ + *) (install-info --version) >/dev/null 2>&1;; \ + esac +am__tagged_files = $(HEADERS) $(SOURCES) $(TAGS_FILES) $(LISP) +# Read a list of newline-separated strings from the standard input, +# and print each of them once, without duplicates. Input order is +# *not* preserved. +am__uniquify_input = $(AWK) '\ + BEGIN { nonempty = 0; } \ + { items[$$0] = 1; nonempty = 1; } \ + END { if (nonempty) { for (i in items) print i; }; } \ +' +# Make sure the list of sources is unique. This is necessary because, +# e.g., the same source file might be shared among _SOURCES variables +# for different programs/libraries. +am__define_uniq_tagged_files = \ + list='$(am__tagged_files)'; \ + unique=`for i in $$list; do \ + if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \ + done | $(am__uniquify_input)` +ETAGS = etags +CTAGS = ctags +am__DIST_COMMON = $(srcdir)/Makefile.in $(top_srcdir)/depcomp +DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST) +ACLOCAL = @ACLOCAL@ +AMTAR = @AMTAR@ +AM_DEFAULT_VERBOSITY = @AM_DEFAULT_VERBOSITY@ +AR = @AR@ +AUTOCONF = @AUTOCONF@ +AUTOHEADER = @AUTOHEADER@ +AUTOMAKE = @AUTOMAKE@ +AWK = @AWK@ +CC = @CC@ +CCDEPMODE = @CCDEPMODE@ +CFLAGS = @CFLAGS@ +CPP = @CPP@ +CPPFLAGS = @CPPFLAGS@ +CYGPATH_W = @CYGPATH_W@ +DEFS = @DEFS@ +DEPDIR = @DEPDIR@ +DLLTOOL = @DLLTOOL@ +DSYMUTIL = @DSYMUTIL@ +DUMPBIN = @DUMPBIN@ +ECHO_C = @ECHO_C@ +ECHO_N = @ECHO_N@ +ECHO_T = @ECHO_T@ +EGREP = @EGREP@ +EXEEXT = @EXEEXT@ +FEATURE_REGEXP = @FEATURE_REGEXP@ +FGREP = @FGREP@ +GREP = @GREP@ +INSTALL = @INSTALL@ +INSTALL_DATA = @INSTALL_DATA@ +INSTALL_PROGRAM = @INSTALL_PROGRAM@ +INSTALL_SCRIPT = @INSTALL_SCRIPT@ +INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@ +JSON_C_CFLAGS = @JSON_C_CFLAGS@ +JSON_C_LIBS = @JSON_C_LIBS@ +LD = @LD@ +LDFLAGS = @LDFLAGS@ +LIBESTR_CFLAGS = @LIBESTR_CFLAGS@ +LIBESTR_LIBS = @LIBESTR_LIBS@ +LIBLOGNORM_CFLAGS = @LIBLOGNORM_CFLAGS@ +LIBLOGNORM_LIBS = @LIBLOGNORM_LIBS@ +LIBOBJS = @LIBOBJS@ +LIBS = @LIBS@ +LIBTOOL = @LIBTOOL@ +LIPO = @LIPO@ +LN_S = @LN_S@ +LTLIBOBJS = @LTLIBOBJS@ +LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAKEINFO = @MAKEINFO@ +MANIFEST_TOOL = @MANIFEST_TOOL@ +MKDIR_P = @MKDIR_P@ +NM = @NM@ +NMEDIT = @NMEDIT@ +OBJDUMP = @OBJDUMP@ +OBJEXT = @OBJEXT@ +OTOOL = @OTOOL@ +OTOOL64 = @OTOOL64@ +PACKAGE = @PACKAGE@ +PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@ +PACKAGE_NAME = @PACKAGE_NAME@ +PACKAGE_STRING = @PACKAGE_STRING@ +PACKAGE_TARNAME = @PACKAGE_TARNAME@ +PACKAGE_URL = @PACKAGE_URL@ +PACKAGE_VERSION = @PACKAGE_VERSION@ +PATH_SEPARATOR = @PATH_SEPARATOR@ +PCRE_CFLAGS = @PCRE_CFLAGS@ +PCRE_LIBS = @PCRE_LIBS@ +PKG_CONFIG = @PKG_CONFIG@ +PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@ +PKG_CONFIG_PATH = @PKG_CONFIG_PATH@ +RANLIB = @RANLIB@ +SED = @SED@ +SET_MAKE = @SET_MAKE@ +SHELL = @SHELL@ +SPHINXBUILD = @SPHINXBUILD@ +STRIP = @STRIP@ +VALGRIND = @VALGRIND@ +VERSION = @VERSION@ +WARN_CFLAGS = @WARN_CFLAGS@ +WARN_LDFLAGS = @WARN_LDFLAGS@ +WARN_SCANNERFLAGS = @WARN_SCANNERFLAGS@ +abs_builddir = 
@abs_builddir@ +abs_srcdir = @abs_srcdir@ +abs_top_builddir = @abs_top_builddir@ +abs_top_srcdir = @abs_top_srcdir@ +ac_ct_AR = @ac_ct_AR@ +ac_ct_CC = @ac_ct_CC@ +ac_ct_DUMPBIN = @ac_ct_DUMPBIN@ +am__include = @am__include@ +am__leading_dot = @am__leading_dot@ +am__quote = @am__quote@ +am__tar = @am__tar@ +am__untar = @am__untar@ +bindir = @bindir@ +build = @build@ +build_alias = @build_alias@ +build_cpu = @build_cpu@ +build_os = @build_os@ +build_vendor = @build_vendor@ +builddir = @builddir@ +datadir = @datadir@ +datarootdir = @datarootdir@ +docdir = @docdir@ +dvidir = @dvidir@ +exec_prefix = @exec_prefix@ +host = @host@ +host_alias = @host_alias@ +host_cpu = @host_cpu@ +host_os = @host_os@ +host_vendor = @host_vendor@ +htmldir = @htmldir@ +includedir = @includedir@ +infodir = @infodir@ +install_sh = @install_sh@ +libdir = @libdir@ +libexecdir = @libexecdir@ +localedir = @localedir@ +localstatedir = @localstatedir@ +mandir = @mandir@ +mkdir_p = @mkdir_p@ +oldincludedir = @oldincludedir@ +pdfdir = @pdfdir@ +pkg_config_libs_private = @pkg_config_libs_private@ +prefix = @prefix@ +program_transform_name = @program_transform_name@ +psdir = @psdir@ +runstatedir = @runstatedir@ +sbindir = @sbindir@ +sharedstatedir = @sharedstatedir@ +srcdir = @srcdir@ +sysconfdir = @sysconfdir@ +target_alias = @target_alias@ +top_build_prefix = @top_build_prefix@ +top_builddir = @top_builddir@ +top_srcdir = @top_srcdir@ +noinst_LTLIBRARIES = compat.la +compat_la_SOURCES = strndup.c +compat_la_CPPFLAGS = -I$(top_srcdir) $(PTHREADS_CFLAGS) $(RSRT_CFLAGS) +compat_la_LDFLAGS = -module -avoid-version +compat_la_LIBADD = $(IMUDP_LIBS) +all: all-am + +.SUFFIXES: +.SUFFIXES: .c .lo .o .obj +$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) + @for dep in $?; do \ + case '$(am__configure_deps)' in \ + *$$dep*) \ + ( cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh ) \ + && { if test -f $@; then exit 0; else break; fi; }; \ + exit 1;; \ + esac; \ + done; \ + echo ' cd $(top_srcdir) && $(AUTOMAKE) --gnu compat/Makefile'; \ + $(am__cd) $(top_srcdir) && \ + $(AUTOMAKE) --gnu compat/Makefile +Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status + @case '$?' 
in \ + *config.status*) \ + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \ + *) \ + echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \ + cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \ + esac; + +$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh + +$(top_srcdir)/configure: $(am__configure_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(ACLOCAL_M4): $(am__aclocal_m4_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(am__aclocal_m4_deps): + +clean-noinstLTLIBRARIES: + -test -z "$(noinst_LTLIBRARIES)" || rm -f $(noinst_LTLIBRARIES) + @list='$(noinst_LTLIBRARIES)'; \ + locs=`for p in $$list; do echo $$p; done | \ + sed 's|^[^/]*$$|.|; s|/[^/]*$$||; s|$$|/so_locations|' | \ + sort -u`; \ + test -z "$$locs" || { \ + echo rm -f $${locs}; \ + rm -f $${locs}; \ + } + +compat.la: $(compat_la_OBJECTS) $(compat_la_DEPENDENCIES) $(EXTRA_compat_la_DEPENDENCIES) + $(AM_V_CCLD)$(compat_la_LINK) $(compat_la_OBJECTS) $(compat_la_LIBADD) $(LIBS) + +mostlyclean-compile: + -rm -f *.$(OBJEXT) + +distclean-compile: + -rm -f *.tab.c + +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/compat_la-strndup.Plo@am__quote@ + +.c.o: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(COMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $< +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(COMPILE) -c -o $@ $< + +.c.obj: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(COMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ `$(CYGPATH_W) '$<'` +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(COMPILE) -c -o $@ `$(CYGPATH_W) '$<'` + +.c.lo: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LTCOMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $< +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LTCOMPILE) -c -o $@ $< + +compat_la-strndup.lo: strndup.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(compat_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT compat_la-strndup.lo -MD -MP -MF $(DEPDIR)/compat_la-strndup.Tpo -c -o compat_la-strndup.lo `test -f 'strndup.c' || echo '$(srcdir)/'`strndup.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/compat_la-strndup.Tpo $(DEPDIR)/compat_la-strndup.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='strndup.c' object='compat_la-strndup.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) 
$(DEFAULT_INCLUDES) $(INCLUDES) $(compat_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o compat_la-strndup.lo `test -f 'strndup.c' || echo '$(srcdir)/'`strndup.c + +mostlyclean-libtool: + -rm -f *.lo + +clean-libtool: + -rm -rf .libs _libs + +ID: $(am__tagged_files) + $(am__define_uniq_tagged_files); mkid -fID $$unique +tags: tags-am +TAGS: tags + +tags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + set x; \ + here=`pwd`; \ + $(am__define_uniq_tagged_files); \ + shift; \ + if test -z "$(ETAGS_ARGS)$$*$$unique"; then :; else \ + test -n "$$unique" || unique=$$empty_fix; \ + if test $$# -gt 0; then \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + "$$@" $$unique; \ + else \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + $$unique; \ + fi; \ + fi +ctags: ctags-am + +CTAGS: ctags +ctags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + $(am__define_uniq_tagged_files); \ + test -z "$(CTAGS_ARGS)$$unique" \ + || $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \ + $$unique + +GTAGS: + here=`$(am__cd) $(top_builddir) && pwd` \ + && $(am__cd) $(top_srcdir) \ + && gtags -i $(GTAGS_ARGS) "$$here" +cscopelist: cscopelist-am + +cscopelist-am: $(am__tagged_files) + list='$(am__tagged_files)'; \ + case "$(srcdir)" in \ + [\\/]* | ?:[\\/]*) sdir="$(srcdir)" ;; \ + *) sdir=$(subdir)/$(srcdir) ;; \ + esac; \ + for i in $$list; do \ + if test -f "$$i"; then \ + echo "$(subdir)/$$i"; \ + else \ + echo "$$sdir/$$i"; \ + fi; \ + done >> $(top_builddir)/cscope.files + +distclean-tags: + -rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags + +distdir: $(DISTFILES) + @srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + list='$(DISTFILES)'; \ + dist_files=`for file in $$list; do echo $$file; done | \ + sed -e "s|^$$srcdirstrip/||;t" \ + -e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \ + case $$dist_files in \ + */*) $(MKDIR_P) `echo "$$dist_files" | \ + sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \ + sort -u` ;; \ + esac; \ + for file in $$dist_files; do \ + if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \ + if test -d $$d/$$file; then \ + dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \ + if test -d "$(distdir)/$$file"; then \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \ + cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \ + find "$(distdir)/$$file" -type d ! 
-perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \ + else \ + test -f "$(distdir)/$$file" \ + || cp -p $$d/$$file "$(distdir)/$$file" \ + || exit 1; \ + fi; \ + done +check-am: all-am +check: check-am +all-am: Makefile $(LTLIBRARIES) +installdirs: +install: install-am +install-exec: install-exec-am +install-data: install-data-am +uninstall: uninstall-am + +install-am: all-am + @$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am + +installcheck: installcheck-am +install-strip: + if test -z '$(STRIP)'; then \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + install; \ + else \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \ + fi +mostlyclean-generic: + +clean-generic: + +distclean-generic: + -test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES) + -test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES) + +maintainer-clean-generic: + @echo "This command is intended for maintainers to use" + @echo "it deletes files that may require special tools to rebuild." +clean: clean-am + +clean-am: clean-generic clean-libtool clean-noinstLTLIBRARIES \ + mostlyclean-am + +distclean: distclean-am + -rm -rf ./$(DEPDIR) + -rm -f Makefile +distclean-am: clean-am distclean-compile distclean-generic \ + distclean-tags + +dvi: dvi-am + +dvi-am: + +html: html-am + +html-am: + +info: info-am + +info-am: + +install-data-am: + +install-dvi: install-dvi-am + +install-dvi-am: + +install-exec-am: + +install-html: install-html-am + +install-html-am: + +install-info: install-info-am + +install-info-am: + +install-man: + +install-pdf: install-pdf-am + +install-pdf-am: + +install-ps: install-ps-am + +install-ps-am: + +installcheck-am: + +maintainer-clean: maintainer-clean-am + -rm -rf ./$(DEPDIR) + -rm -f Makefile +maintainer-clean-am: distclean-am maintainer-clean-generic + +mostlyclean: mostlyclean-am + +mostlyclean-am: mostlyclean-compile mostlyclean-generic \ + mostlyclean-libtool + +pdf: pdf-am + +pdf-am: + +ps: ps-am + +ps-am: + +uninstall-am: + +.MAKE: install-am install-strip + +.PHONY: CTAGS GTAGS TAGS all all-am check check-am clean clean-generic \ + clean-libtool clean-noinstLTLIBRARIES cscopelist-am ctags \ + ctags-am distclean distclean-compile distclean-generic \ + distclean-libtool distclean-tags distdir dvi dvi-am html \ + html-am info info-am install install-am install-data \ + install-data-am install-dvi install-dvi-am install-exec \ + install-exec-am install-html install-html-am install-info \ + install-info-am install-man install-pdf install-pdf-am \ + install-ps install-ps-am install-strip installcheck \ + installcheck-am installdirs maintainer-clean \ + maintainer-clean-generic mostlyclean mostlyclean-compile \ + mostlyclean-generic mostlyclean-libtool pdf pdf-am ps ps-am \ + tags tags-am uninstall uninstall-am + +.PRECIOUS: Makefile + + +# Tell versions [3.59,3.63) of GNU make to not export all variables. +# Otherwise a system limit (for SysV at least) may be exceeded. +.NOEXPORT: diff --git a/compat/strndup.c b/compat/strndup.c new file mode 100644 index 0000000..6bd231a --- /dev/null +++ b/compat/strndup.c @@ -0,0 +1,46 @@ +/* compatibility file for systems without strndup. 
+ *
+ * Copyright 2015 Rainer Gerhards and Adiscon
+ *
+ * This file is part of liblognorm.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * -or-
+ * see COPYING.ASL20 in the source distribution
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+#include "config.h"
+#ifndef HAVE_STRNDUP
+
+#include <string.h>
+#include <stdlib.h>
+char *
+strndup(const char *s, size_t n)
+{
+ const size_t len = strlen(s);
+ if(len <= n)
+ return strdup(s);
+ char *const new_s = malloc(len+1);
+ if(new_s == NULL)
+ return NULL;
+ memcpy(new_s, s, len);
+ new_s[len] = '\0';
+ return new_s;
+}
+#else /* #ifndef HAVE_STRNDUP */
+/* Solaris must have at least one symbol inside a file, so we provide
+ * it here ;-)
+ */
+void dummy_dummy_required_for_solaris_do_not_use(void)
+{
+}
+#endif /* #ifndef HAVE_STRNDUP */
diff --git a/compile b/compile
new file mode 100755
index 0000000..a85b723
--- /dev/null
+++ b/compile
@@ -0,0 +1,347 @@
+#! /bin/sh
+# Wrapper for compilers which do not understand '-c -o'.
+
+scriptversion=2012-10-14.11; # UTC
+
+# Copyright (C) 1999-2014 Free Software Foundation, Inc.
+# Written by Tom Tromey <tromey@cygnus.com>.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+# As a special exception to the GNU General Public License, if you
+# distribute this file as part of a program that contains a
+# configuration script generated by Autoconf, you may include it under
+# the same distribution terms that you use for the rest of that program.
+
+# This file is maintained in Automake, please report
+# bugs to <bug-automake@gnu.org> or send patches to
+# <automake-patches@gnu.org>.
+
+nl='
+'
+
+# We need space, tab and new line, in precisely that order. Quoting is
+# there to prevent tools from complaining about whitespace usage.
+IFS=" ""	$nl"
+
+file_conv=
+
+# func_file_conv build_file lazy
+# Convert a $build file to $host form and store it in $file
+# Currently only supports Windows hosts. If the determined conversion
+# type is listed in (the comma separated) LAZY, no conversion will
+# take place.
+func_file_conv () +{ + file=$1 + case $file in + / | /[!/]*) # absolute file, and not a UNC file + if test -z "$file_conv"; then + # lazily determine how to convert abs files + case `uname -s` in + MINGW*) + file_conv=mingw + ;; + CYGWIN*) + file_conv=cygwin + ;; + *) + file_conv=wine + ;; + esac + fi + case $file_conv/,$2, in + *,$file_conv,*) + ;; + mingw/*) + file=`cmd //C echo "$file " | sed -e 's/"\(.*\) " *$/\1/'` + ;; + cygwin/*) + file=`cygpath -m "$file" || echo "$file"` + ;; + wine/*) + file=`winepath -w "$file" || echo "$file"` + ;; + esac + ;; + esac +} + +# func_cl_dashL linkdir +# Make cl look for libraries in LINKDIR +func_cl_dashL () +{ + func_file_conv "$1" + if test -z "$lib_path"; then + lib_path=$file + else + lib_path="$lib_path;$file" + fi + linker_opts="$linker_opts -LIBPATH:$file" +} + +# func_cl_dashl library +# Do a library search-path lookup for cl +func_cl_dashl () +{ + lib=$1 + found=no + save_IFS=$IFS + IFS=';' + for dir in $lib_path $LIB + do + IFS=$save_IFS + if $shared && test -f "$dir/$lib.dll.lib"; then + found=yes + lib=$dir/$lib.dll.lib + break + fi + if test -f "$dir/$lib.lib"; then + found=yes + lib=$dir/$lib.lib + break + fi + if test -f "$dir/lib$lib.a"; then + found=yes + lib=$dir/lib$lib.a + break + fi + done + IFS=$save_IFS + + if test "$found" != yes; then + lib=$lib.lib + fi +} + +# func_cl_wrapper cl arg... +# Adjust compile command to suit cl +func_cl_wrapper () +{ + # Assume a capable shell + lib_path= + shared=: + linker_opts= + for arg + do + if test -n "$eat"; then + eat= + else + case $1 in + -o) + # configure might choose to run compile as 'compile cc -o foo foo.c'. + eat=1 + case $2 in + *.o | *.[oO][bB][jJ]) + func_file_conv "$2" + set x "$@" -Fo"$file" + shift + ;; + *) + func_file_conv "$2" + set x "$@" -Fe"$file" + shift + ;; + esac + ;; + -I) + eat=1 + func_file_conv "$2" mingw + set x "$@" -I"$file" + shift + ;; + -I*) + func_file_conv "${1#-I}" mingw + set x "$@" -I"$file" + shift + ;; + -l) + eat=1 + func_cl_dashl "$2" + set x "$@" "$lib" + shift + ;; + -l*) + func_cl_dashl "${1#-l}" + set x "$@" "$lib" + shift + ;; + -L) + eat=1 + func_cl_dashL "$2" + ;; + -L*) + func_cl_dashL "${1#-L}" + ;; + -static) + shared=false + ;; + -Wl,*) + arg=${1#-Wl,} + save_ifs="$IFS"; IFS=',' + for flag in $arg; do + IFS="$save_ifs" + linker_opts="$linker_opts $flag" + done + IFS="$save_ifs" + ;; + -Xlinker) + eat=1 + linker_opts="$linker_opts $2" + ;; + -*) + set x "$@" "$1" + shift + ;; + *.cc | *.CC | *.cxx | *.CXX | *.[cC]++) + func_file_conv "$1" + set x "$@" -Tp"$file" + shift + ;; + *.c | *.cpp | *.CPP | *.lib | *.LIB | *.Lib | *.OBJ | *.obj | *.[oO]) + func_file_conv "$1" mingw + set x "$@" "$file" + shift + ;; + *) + set x "$@" "$1" + shift + ;; + esac + fi + shift + done + if test -n "$linker_opts"; then + linker_opts="-link$linker_opts" + fi + exec "$@" $linker_opts + exit 1 +} + +eat= + +case $1 in + '') + echo "$0: No command. Try '$0 --help' for more information." 1>&2 + exit 1; + ;; + -h | --h*) + cat <<\EOF +Usage: compile [--help] [--version] PROGRAM [ARGS] + +Wrapper for compilers which do not understand '-c -o'. +Remove '-o dest.o' from ARGS, run PROGRAM with the remaining +arguments, and rename the output as expected. + +If you are trying to build a whole package this is not the +right script to run: please start by reading the file 'INSTALL'. + +Report bugs to . +EOF + exit $? + ;; + -v | --v*) + echo "compile $scriptversion" + exit $? + ;; + cl | *[/\\]cl | cl.exe | *[/\\]cl.exe ) + func_cl_wrapper "$@" # Doesn't return... 
+ ;; +esac + +ofile= +cfile= + +for arg +do + if test -n "$eat"; then + eat= + else + case $1 in + -o) + # configure might choose to run compile as 'compile cc -o foo foo.c'. + # So we strip '-o arg' only if arg is an object. + eat=1 + case $2 in + *.o | *.obj) + ofile=$2 + ;; + *) + set x "$@" -o "$2" + shift + ;; + esac + ;; + *.c) + cfile=$1 + set x "$@" "$1" + shift + ;; + *) + set x "$@" "$1" + shift + ;; + esac + fi + shift +done + +if test -z "$ofile" || test -z "$cfile"; then + # If no '-o' option was seen then we might have been invoked from a + # pattern rule where we don't need one. That is ok -- this is a + # normal compilation that the losing compiler can handle. If no + # '.c' file was seen then we are probably linking. That is also + # ok. + exec "$@" +fi + +# Name of file we expect compiler to create. +cofile=`echo "$cfile" | sed 's|^.*[\\/]||; s|^[a-zA-Z]:||; s/\.c$/.o/'` + +# Create the lock directory. +# Note: use '[/\\:.-]' here to ensure that we don't use the same name +# that we are using for the .o file. Also, base the name on the expected +# object file name, since that is what matters with a parallel build. +lockdir=`echo "$cofile" | sed -e 's|[/\\:.-]|_|g'`.d +while true; do + if mkdir "$lockdir" >/dev/null 2>&1; then + break + fi + sleep 1 +done +# FIXME: race condition here if user kills between mkdir and trap. +trap "rmdir '$lockdir'; exit 1" 1 2 15 + +# Run the compile. +"$@" +ret=$? + +if test -f "$cofile"; then + test "$cofile" = "$ofile" || mv "$cofile" "$ofile" +elif test -f "${cofile}bj"; then + test "${cofile}bj" = "$ofile" || mv "${cofile}bj" "$ofile" +fi + +rmdir "$lockdir" +exit $ret + +# Local Variables: +# mode: shell-script +# sh-indentation: 2 +# eval: (add-hook 'write-file-hooks 'time-stamp) +# time-stamp-start: "scriptversion=" +# time-stamp-format: "%:y-%02m-%02d.%02H" +# time-stamp-time-zone: "UTC" +# time-stamp-end: "; # UTC" +# End: diff --git a/config.guess b/config.guess new file mode 100755 index 0000000..1659250 --- /dev/null +++ b/config.guess @@ -0,0 +1,1441 @@ +#! /bin/sh +# Attempt to guess a canonical system name. +# Copyright 1992-2015 Free Software Foundation, Inc. + +timestamp='2015-08-20' + +# This file is free software; you can redistribute it and/or modify it +# under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program; if not, see . +# +# As a special exception to the GNU General Public License, if you +# distribute this file as part of a program that contains a +# configuration script generated by Autoconf, you may include it under +# the same distribution terms that you use for the rest of that +# program. This Exception is an additional permission under section 7 +# of the GNU General Public License, version 3 ("GPLv3"). +# +# Originally written by Per Bothner; maintained since 2000 by Ben Elliston. +# +# You can get the latest version of this script from: +# http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess;hb=HEAD +# +# Please send patches to . 
+ + +me=`echo "$0" | sed -e 's,.*/,,'` + +usage="\ +Usage: $0 [OPTION] + +Output the configuration name of the system \`$me' is run on. + +Operation modes: + -h, --help print this help, then exit + -t, --time-stamp print date of last modification, then exit + -v, --version print version number, then exit + +Report bugs and patches to ." + +version="\ +GNU config.guess ($timestamp) + +Originally written by Per Bothner. +Copyright 1992-2015 Free Software Foundation, Inc. + +This is free software; see the source for copying conditions. There is NO +warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." + +help=" +Try \`$me --help' for more information." + +# Parse command line +while test $# -gt 0 ; do + case $1 in + --time-stamp | --time* | -t ) + echo "$timestamp" ; exit ;; + --version | -v ) + echo "$version" ; exit ;; + --help | --h* | -h ) + echo "$usage"; exit ;; + -- ) # Stop option processing + shift; break ;; + - ) # Use stdin as input. + break ;; + -* ) + echo "$me: invalid option $1$help" >&2 + exit 1 ;; + * ) + break ;; + esac +done + +if test $# != 0; then + echo "$me: too many arguments$help" >&2 + exit 1 +fi + +trap 'exit 1' 1 2 15 + +# CC_FOR_BUILD -- compiler used by this script. Note that the use of a +# compiler to aid in system detection is discouraged as it requires +# temporary files to be created and, as you can see below, it is a +# headache to deal with in a portable fashion. + +# Historically, `CC_FOR_BUILD' used to be named `HOST_CC'. We still +# use `HOST_CC' if defined, but it is deprecated. + +# Portable tmp directory creation inspired by the Autoconf team. + +set_cc_for_build=' +trap "exitcode=\$?; (rm -f \$tmpfiles 2>/dev/null; rmdir \$tmp 2>/dev/null) && exit \$exitcode" 0 ; +trap "rm -f \$tmpfiles 2>/dev/null; rmdir \$tmp 2>/dev/null; exit 1" 1 2 13 15 ; +: ${TMPDIR=/tmp} ; + { tmp=`(umask 077 && mktemp -d "$TMPDIR/cgXXXXXX") 2>/dev/null` && test -n "$tmp" && test -d "$tmp" ; } || + { test -n "$RANDOM" && tmp=$TMPDIR/cg$$-$RANDOM && (umask 077 && mkdir $tmp) ; } || + { tmp=$TMPDIR/cg-$$ && (umask 077 && mkdir $tmp) && echo "Warning: creating insecure temp directory" >&2 ; } || + { echo "$me: cannot create a temporary directory in $TMPDIR" >&2 ; exit 1 ; } ; +dummy=$tmp/dummy ; +tmpfiles="$dummy.c $dummy.o $dummy.rel $dummy" ; +case $CC_FOR_BUILD,$HOST_CC,$CC in + ,,) echo "int x;" > $dummy.c ; + for c in cc gcc c89 c99 ; do + if ($c -c -o $dummy.o $dummy.c) >/dev/null 2>&1 ; then + CC_FOR_BUILD="$c"; break ; + fi ; + done ; + if test x"$CC_FOR_BUILD" = x ; then + CC_FOR_BUILD=no_compiler_found ; + fi + ;; + ,,*) CC_FOR_BUILD=$CC ;; + ,*,*) CC_FOR_BUILD=$HOST_CC ;; +esac ; set_cc_for_build= ;' + +# This is needed to find uname on a Pyramid OSx when run in the BSD universe. +# (ghazi@noc.rutgers.edu 1994-08-24) +if (test -f /.attbin/uname) >/dev/null 2>&1 ; then + PATH=$PATH:/.attbin ; export PATH +fi + +UNAME_MACHINE=`(uname -m) 2>/dev/null` || UNAME_MACHINE=unknown +UNAME_RELEASE=`(uname -r) 2>/dev/null` || UNAME_RELEASE=unknown +UNAME_SYSTEM=`(uname -s) 2>/dev/null` || UNAME_SYSTEM=unknown +UNAME_VERSION=`(uname -v) 2>/dev/null` || UNAME_VERSION=unknown + +case "${UNAME_SYSTEM}" in +Linux|GNU|GNU/*) + # If the system lacks a compiler, then just pick glibc. + # We could probably try harder. 
+ LIBC=gnu + + eval $set_cc_for_build + cat <<-EOF > $dummy.c + #include + #if defined(__UCLIBC__) + LIBC=uclibc + #elif defined(__dietlibc__) + LIBC=dietlibc + #else + LIBC=gnu + #endif + EOF + eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC' | sed 's, ,,g'` + ;; +esac + +# Note: order is significant - the case branches are not exclusive. + +case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in + *:NetBSD:*:*) + # NetBSD (nbsd) targets should (where applicable) match one or + # more of the tuples: *-*-netbsdelf*, *-*-netbsdaout*, + # *-*-netbsdecoff* and *-*-netbsd*. For targets that recently + # switched to ELF, *-*-netbsd* would select the old + # object file format. This provides both forward + # compatibility and a consistent mechanism for selecting the + # object file format. + # + # Note: NetBSD doesn't particularly care about the vendor + # portion of the name. We always set it to "unknown". + sysctl="sysctl -n hw.machine_arch" + UNAME_MACHINE_ARCH=`(uname -p 2>/dev/null || \ + /sbin/$sysctl 2>/dev/null || \ + /usr/sbin/$sysctl 2>/dev/null || \ + echo unknown)` + case "${UNAME_MACHINE_ARCH}" in + armeb) machine=armeb-unknown ;; + arm*) machine=arm-unknown ;; + sh3el) machine=shl-unknown ;; + sh3eb) machine=sh-unknown ;; + sh5el) machine=sh5le-unknown ;; + earmv*) + arch=`echo ${UNAME_MACHINE_ARCH} | sed -e 's,^e\(armv[0-9]\).*$,\1,'` + endian=`echo ${UNAME_MACHINE_ARCH} | sed -ne 's,^.*\(eb\)$,\1,p'` + machine=${arch}${endian}-unknown + ;; + *) machine=${UNAME_MACHINE_ARCH}-unknown ;; + esac + # The Operating System including object format, if it has switched + # to ELF recently, or will in the future. + case "${UNAME_MACHINE_ARCH}" in + arm*|earm*|i386|m68k|ns32k|sh3*|sparc|vax) + eval $set_cc_for_build + if echo __ELF__ | $CC_FOR_BUILD -E - 2>/dev/null \ + | grep -q __ELF__ + then + # Once all utilities can be ECOFF (netbsdecoff) or a.out (netbsdaout). + # Return netbsd for either. FIX? + os=netbsd + else + os=netbsdelf + fi + ;; + *) + os=netbsd + ;; + esac + # Determine ABI tags. + case "${UNAME_MACHINE_ARCH}" in + earm*) + expr='s/^earmv[0-9]/-eabi/;s/eb$//' + abi=`echo ${UNAME_MACHINE_ARCH} | sed -e "$expr"` + ;; + esac + # The OS release + # Debian GNU/NetBSD machines have a different userland, and + # thus, need a distinct triplet. However, they do not need + # kernel version information, so it can be replaced with a + # suitable tag, in the style of linux-gnu. + case "${UNAME_VERSION}" in + Debian*) + release='-gnu' + ;; + *) + release=`echo ${UNAME_RELEASE} | sed -e 's/[-_].*//' | cut -d. -f1,2` + ;; + esac + # Since CPU_TYPE-MANUFACTURER-KERNEL-OPERATING_SYSTEM: + # contains redundant information, the shorter form: + # CPU_TYPE-MANUFACTURER-OPERATING_SYSTEM is used. 
+ echo "${machine}-${os}${release}${abi}" + exit ;; + *:Bitrig:*:*) + UNAME_MACHINE_ARCH=`arch | sed 's/Bitrig.//'` + echo ${UNAME_MACHINE_ARCH}-unknown-bitrig${UNAME_RELEASE} + exit ;; + *:OpenBSD:*:*) + UNAME_MACHINE_ARCH=`arch | sed 's/OpenBSD.//'` + echo ${UNAME_MACHINE_ARCH}-unknown-openbsd${UNAME_RELEASE} + exit ;; + *:ekkoBSD:*:*) + echo ${UNAME_MACHINE}-unknown-ekkobsd${UNAME_RELEASE} + exit ;; + *:SolidBSD:*:*) + echo ${UNAME_MACHINE}-unknown-solidbsd${UNAME_RELEASE} + exit ;; + macppc:MirBSD:*:*) + echo powerpc-unknown-mirbsd${UNAME_RELEASE} + exit ;; + *:MirBSD:*:*) + echo ${UNAME_MACHINE}-unknown-mirbsd${UNAME_RELEASE} + exit ;; + *:Sortix:*:*) + echo ${UNAME_MACHINE}-unknown-sortix + exit ;; + alpha:OSF1:*:*) + case $UNAME_RELEASE in + *4.0) + UNAME_RELEASE=`/usr/sbin/sizer -v | awk '{print $3}'` + ;; + *5.*) + UNAME_RELEASE=`/usr/sbin/sizer -v | awk '{print $4}'` + ;; + esac + # According to Compaq, /usr/sbin/psrinfo has been available on + # OSF/1 and Tru64 systems produced since 1995. I hope that + # covers most systems running today. This code pipes the CPU + # types through head -n 1, so we only detect the type of CPU 0. + ALPHA_CPU_TYPE=`/usr/sbin/psrinfo -v | sed -n -e 's/^ The alpha \(.*\) processor.*$/\1/p' | head -n 1` + case "$ALPHA_CPU_TYPE" in + "EV4 (21064)") + UNAME_MACHINE="alpha" ;; + "EV4.5 (21064)") + UNAME_MACHINE="alpha" ;; + "LCA4 (21066/21068)") + UNAME_MACHINE="alpha" ;; + "EV5 (21164)") + UNAME_MACHINE="alphaev5" ;; + "EV5.6 (21164A)") + UNAME_MACHINE="alphaev56" ;; + "EV5.6 (21164PC)") + UNAME_MACHINE="alphapca56" ;; + "EV5.7 (21164PC)") + UNAME_MACHINE="alphapca57" ;; + "EV6 (21264)") + UNAME_MACHINE="alphaev6" ;; + "EV6.7 (21264A)") + UNAME_MACHINE="alphaev67" ;; + "EV6.8CB (21264C)") + UNAME_MACHINE="alphaev68" ;; + "EV6.8AL (21264B)") + UNAME_MACHINE="alphaev68" ;; + "EV6.8CX (21264D)") + UNAME_MACHINE="alphaev68" ;; + "EV6.9A (21264/EV69A)") + UNAME_MACHINE="alphaev69" ;; + "EV7 (21364)") + UNAME_MACHINE="alphaev7" ;; + "EV7.9 (21364A)") + UNAME_MACHINE="alphaev79" ;; + esac + # A Pn.n version is a patched version. + # A Vn.n version is a released version. + # A Tn.n version is a released field test version. + # A Xn.n version is an unreleased experimental baselevel. + # 1.2 uses "1.2" for uname -r. + echo ${UNAME_MACHINE}-dec-osf`echo ${UNAME_RELEASE} | sed -e 's/^[PVTX]//' | tr 'ABCDEFGHIJKLMNOPQRSTUVWXYZ' 'abcdefghijklmnopqrstuvwxyz'` + # Reset EXIT trap before exiting to avoid spurious non-zero exit code. + exitcode=$? + trap '' 0 + exit $exitcode ;; + Alpha\ *:Windows_NT*:*) + # How do we know it's Interix rather than the generic POSIX subsystem? + # Should we change UNAME_MACHINE based on the output of uname instead + # of the specific Alpha model? 
+ echo alpha-pc-interix + exit ;; + 21064:Windows_NT:50:3) + echo alpha-dec-winnt3.5 + exit ;; + Amiga*:UNIX_System_V:4.0:*) + echo m68k-unknown-sysv4 + exit ;; + *:[Aa]miga[Oo][Ss]:*:*) + echo ${UNAME_MACHINE}-unknown-amigaos + exit ;; + *:[Mm]orph[Oo][Ss]:*:*) + echo ${UNAME_MACHINE}-unknown-morphos + exit ;; + *:OS/390:*:*) + echo i370-ibm-openedition + exit ;; + *:z/VM:*:*) + echo s390-ibm-zvmoe + exit ;; + *:OS400:*:*) + echo powerpc-ibm-os400 + exit ;; + arm:RISC*:1.[012]*:*|arm:riscix:1.[012]*:*) + echo arm-acorn-riscix${UNAME_RELEASE} + exit ;; + arm*:riscos:*:*|arm*:RISCOS:*:*) + echo arm-unknown-riscos + exit ;; + SR2?01:HI-UX/MPP:*:* | SR8000:HI-UX/MPP:*:*) + echo hppa1.1-hitachi-hiuxmpp + exit ;; + Pyramid*:OSx*:*:* | MIS*:OSx*:*:* | MIS*:SMP_DC-OSx*:*:*) + # akee@wpdis03.wpafb.af.mil (Earle F. Ake) contributed MIS and NILE. + if test "`(/bin/universe) 2>/dev/null`" = att ; then + echo pyramid-pyramid-sysv3 + else + echo pyramid-pyramid-bsd + fi + exit ;; + NILE*:*:*:dcosx) + echo pyramid-pyramid-svr4 + exit ;; + DRS?6000:unix:4.0:6*) + echo sparc-icl-nx6 + exit ;; + DRS?6000:UNIX_SV:4.2*:7* | DRS?6000:isis:4.2*:7*) + case `/usr/bin/uname -p` in + sparc) echo sparc-icl-nx7; exit ;; + esac ;; + s390x:SunOS:*:*) + echo ${UNAME_MACHINE}-ibm-solaris2`echo ${UNAME_RELEASE}|sed -e 's/[^.]*//'` + exit ;; + sun4H:SunOS:5.*:*) + echo sparc-hal-solaris2`echo ${UNAME_RELEASE}|sed -e 's/[^.]*//'` + exit ;; + sun4*:SunOS:5.*:* | tadpole*:SunOS:5.*:*) + echo sparc-sun-solaris2`echo ${UNAME_RELEASE}|sed -e 's/[^.]*//'` + exit ;; + i86pc:AuroraUX:5.*:* | i86xen:AuroraUX:5.*:*) + echo i386-pc-auroraux${UNAME_RELEASE} + exit ;; + i86pc:SunOS:5.*:* | i86xen:SunOS:5.*:*) + eval $set_cc_for_build + SUN_ARCH="i386" + # If there is a compiler, see if it is configured for 64-bit objects. + # Note that the Sun cc does not turn __LP64__ into 1 like gcc does. + # This test works for both compilers. + if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then + if (echo '#ifdef __amd64'; echo IS_64BIT_ARCH; echo '#endif') | \ + (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ + grep IS_64BIT_ARCH >/dev/null + then + SUN_ARCH="x86_64" + fi + fi + echo ${SUN_ARCH}-pc-solaris2`echo ${UNAME_RELEASE}|sed -e 's/[^.]*//'` + exit ;; + sun4*:SunOS:6*:*) + # According to config.sub, this is the proper way to canonicalize + # SunOS6. Hard to guess exactly what SunOS6 will be like, but + # it's likely to be more like Solaris than SunOS4. + echo sparc-sun-solaris3`echo ${UNAME_RELEASE}|sed -e 's/[^.]*//'` + exit ;; + sun4*:SunOS:*:*) + case "`/usr/bin/arch -k`" in + Series*|S4*) + UNAME_RELEASE=`uname -v` + ;; + esac + # Japanese Language versions have a version number like `4.1.3-JL'. + echo sparc-sun-sunos`echo ${UNAME_RELEASE}|sed -e 's/-/_/'` + exit ;; + sun3*:SunOS:*:*) + echo m68k-sun-sunos${UNAME_RELEASE} + exit ;; + sun*:*:4.2BSD:*) + UNAME_RELEASE=`(sed 1q /etc/motd | awk '{print substr($5,1,3)}') 2>/dev/null` + test "x${UNAME_RELEASE}" = "x" && UNAME_RELEASE=3 + case "`/bin/arch`" in + sun3) + echo m68k-sun-sunos${UNAME_RELEASE} + ;; + sun4) + echo sparc-sun-sunos${UNAME_RELEASE} + ;; + esac + exit ;; + aushp:SunOS:*:*) + echo sparc-auspex-sunos${UNAME_RELEASE} + exit ;; + # The situation for MiNT is a little confusing. The machine name + # can be virtually everything (everything which is not + # "atarist" or "atariste" at least should have a processor + # > m68000). The system name ranges from "MiNT" over "FreeMiNT" + # to the lowercase version "mint" (or "freemint"). 
Finally + # the system name "TOS" denotes a system which is actually not + # MiNT. But MiNT is downward compatible to TOS, so this should + # be no problem. + atarist[e]:*MiNT:*:* | atarist[e]:*mint:*:* | atarist[e]:*TOS:*:*) + echo m68k-atari-mint${UNAME_RELEASE} + exit ;; + atari*:*MiNT:*:* | atari*:*mint:*:* | atarist[e]:*TOS:*:*) + echo m68k-atari-mint${UNAME_RELEASE} + exit ;; + *falcon*:*MiNT:*:* | *falcon*:*mint:*:* | *falcon*:*TOS:*:*) + echo m68k-atari-mint${UNAME_RELEASE} + exit ;; + milan*:*MiNT:*:* | milan*:*mint:*:* | *milan*:*TOS:*:*) + echo m68k-milan-mint${UNAME_RELEASE} + exit ;; + hades*:*MiNT:*:* | hades*:*mint:*:* | *hades*:*TOS:*:*) + echo m68k-hades-mint${UNAME_RELEASE} + exit ;; + *:*MiNT:*:* | *:*mint:*:* | *:*TOS:*:*) + echo m68k-unknown-mint${UNAME_RELEASE} + exit ;; + m68k:machten:*:*) + echo m68k-apple-machten${UNAME_RELEASE} + exit ;; + powerpc:machten:*:*) + echo powerpc-apple-machten${UNAME_RELEASE} + exit ;; + RISC*:Mach:*:*) + echo mips-dec-mach_bsd4.3 + exit ;; + RISC*:ULTRIX:*:*) + echo mips-dec-ultrix${UNAME_RELEASE} + exit ;; + VAX*:ULTRIX*:*:*) + echo vax-dec-ultrix${UNAME_RELEASE} + exit ;; + 2020:CLIX:*:* | 2430:CLIX:*:*) + echo clipper-intergraph-clix${UNAME_RELEASE} + exit ;; + mips:*:*:UMIPS | mips:*:*:RISCos) + eval $set_cc_for_build + sed 's/^ //' << EOF >$dummy.c +#ifdef __cplusplus +#include /* for printf() prototype */ + int main (int argc, char *argv[]) { +#else + int main (argc, argv) int argc; char *argv[]; { +#endif + #if defined (host_mips) && defined (MIPSEB) + #if defined (SYSTYPE_SYSV) + printf ("mips-mips-riscos%ssysv\n", argv[1]); exit (0); + #endif + #if defined (SYSTYPE_SVR4) + printf ("mips-mips-riscos%ssvr4\n", argv[1]); exit (0); + #endif + #if defined (SYSTYPE_BSD43) || defined(SYSTYPE_BSD) + printf ("mips-mips-riscos%sbsd\n", argv[1]); exit (0); + #endif + #endif + exit (-1); + } +EOF + $CC_FOR_BUILD -o $dummy $dummy.c && + dummyarg=`echo "${UNAME_RELEASE}" | sed -n 's/\([0-9]*\).*/\1/p'` && + SYSTEM_NAME=`$dummy $dummyarg` && + { echo "$SYSTEM_NAME"; exit; } + echo mips-mips-riscos${UNAME_RELEASE} + exit ;; + Motorola:PowerMAX_OS:*:*) + echo powerpc-motorola-powermax + exit ;; + Motorola:*:4.3:PL8-*) + echo powerpc-harris-powermax + exit ;; + Night_Hawk:*:*:PowerMAX_OS | Synergy:PowerMAX_OS:*:*) + echo powerpc-harris-powermax + exit ;; + Night_Hawk:Power_UNIX:*:*) + echo powerpc-harris-powerunix + exit ;; + m88k:CX/UX:7*:*) + echo m88k-harris-cxux7 + exit ;; + m88k:*:4*:R4*) + echo m88k-motorola-sysv4 + exit ;; + m88k:*:3*:R3*) + echo m88k-motorola-sysv3 + exit ;; + AViiON:dgux:*:*) + # DG/UX returns AViiON for all architectures + UNAME_PROCESSOR=`/usr/bin/uname -p` + if [ $UNAME_PROCESSOR = mc88100 ] || [ $UNAME_PROCESSOR = mc88110 ] + then + if [ ${TARGET_BINARY_INTERFACE}x = m88kdguxelfx ] || \ + [ ${TARGET_BINARY_INTERFACE}x = x ] + then + echo m88k-dg-dgux${UNAME_RELEASE} + else + echo m88k-dg-dguxbcs${UNAME_RELEASE} + fi + else + echo i586-dg-dgux${UNAME_RELEASE} + fi + exit ;; + M88*:DolphinOS:*:*) # DolphinOS (SVR3) + echo m88k-dolphin-sysv3 + exit ;; + M88*:*:R3*:*) + # Delta 88k system running SVR3 + echo m88k-motorola-sysv3 + exit ;; + XD88*:*:*:*) # Tektronix XD88 system running UTekV (SVR3) + echo m88k-tektronix-sysv3 + exit ;; + Tek43[0-9][0-9]:UTek:*:*) # Tektronix 4300 system running UTek (BSD) + echo m68k-tektronix-bsd + exit ;; + *:IRIX*:*:*) + echo mips-sgi-irix`echo ${UNAME_RELEASE}|sed -e 's/-/_/g'` + exit ;; + ????????:AIX?:[12].1:2) # AIX 2.2.1 or AIX 2.1.1 is RT/PC AIX. 
+ echo romp-ibm-aix # uname -m gives an 8 hex-code CPU id + exit ;; # Note that: echo "'`uname -s`'" gives 'AIX ' + i*86:AIX:*:*) + echo i386-ibm-aix + exit ;; + ia64:AIX:*:*) + if [ -x /usr/bin/oslevel ] ; then + IBM_REV=`/usr/bin/oslevel` + else + IBM_REV=${UNAME_VERSION}.${UNAME_RELEASE} + fi + echo ${UNAME_MACHINE}-ibm-aix${IBM_REV} + exit ;; + *:AIX:2:3) + if grep bos325 /usr/include/stdio.h >/dev/null 2>&1; then + eval $set_cc_for_build + sed 's/^ //' << EOF >$dummy.c + #include + + main() + { + if (!__power_pc()) + exit(1); + puts("powerpc-ibm-aix3.2.5"); + exit(0); + } +EOF + if $CC_FOR_BUILD -o $dummy $dummy.c && SYSTEM_NAME=`$dummy` + then + echo "$SYSTEM_NAME" + else + echo rs6000-ibm-aix3.2.5 + fi + elif grep bos324 /usr/include/stdio.h >/dev/null 2>&1; then + echo rs6000-ibm-aix3.2.4 + else + echo rs6000-ibm-aix3.2 + fi + exit ;; + *:AIX:*:[4567]) + IBM_CPU_ID=`/usr/sbin/lsdev -C -c processor -S available | sed 1q | awk '{ print $1 }'` + if /usr/sbin/lsattr -El ${IBM_CPU_ID} | grep ' POWER' >/dev/null 2>&1; then + IBM_ARCH=rs6000 + else + IBM_ARCH=powerpc + fi + if [ -x /usr/bin/lslpp ] ; then + IBM_REV=`/usr/bin/lslpp -Lqc bos.rte.libc | + awk -F: '{ print $3 }' | sed s/[0-9]*$/0/` + else + IBM_REV=${UNAME_VERSION}.${UNAME_RELEASE} + fi + echo ${IBM_ARCH}-ibm-aix${IBM_REV} + exit ;; + *:AIX:*:*) + echo rs6000-ibm-aix + exit ;; + ibmrt:4.4BSD:*|romp-ibm:BSD:*) + echo romp-ibm-bsd4.4 + exit ;; + ibmrt:*BSD:*|romp-ibm:BSD:*) # covers RT/PC BSD and + echo romp-ibm-bsd${UNAME_RELEASE} # 4.3 with uname added to + exit ;; # report: romp-ibm BSD 4.3 + *:BOSX:*:*) + echo rs6000-bull-bosx + exit ;; + DPX/2?00:B.O.S.:*:*) + echo m68k-bull-sysv3 + exit ;; + 9000/[34]??:4.3bsd:1.*:*) + echo m68k-hp-bsd + exit ;; + hp300:4.4BSD:*:* | 9000/[34]??:4.3bsd:2.*:*) + echo m68k-hp-bsd4.4 + exit ;; + 9000/[34678]??:HP-UX:*:*) + HPUX_REV=`echo ${UNAME_RELEASE}|sed -e 's/[^.]*.[0B]*//'` + case "${UNAME_MACHINE}" in + 9000/31? ) HP_ARCH=m68000 ;; + 9000/[34]?? ) HP_ARCH=m68k ;; + 9000/[678][0-9][0-9]) + if [ -x /usr/bin/getconf ]; then + sc_cpu_version=`/usr/bin/getconf SC_CPU_VERSION 2>/dev/null` + sc_kernel_bits=`/usr/bin/getconf SC_KERNEL_BITS 2>/dev/null` + case "${sc_cpu_version}" in + 523) HP_ARCH="hppa1.0" ;; # CPU_PA_RISC1_0 + 528) HP_ARCH="hppa1.1" ;; # CPU_PA_RISC1_1 + 532) # CPU_PA_RISC2_0 + case "${sc_kernel_bits}" in + 32) HP_ARCH="hppa2.0n" ;; + 64) HP_ARCH="hppa2.0w" ;; + '') HP_ARCH="hppa2.0" ;; # HP-UX 10.20 + esac ;; + esac + fi + if [ "${HP_ARCH}" = "" ]; then + eval $set_cc_for_build + sed 's/^ //' << EOF >$dummy.c + + #define _HPUX_SOURCE + #include + #include + + int main () + { + #if defined(_SC_KERNEL_BITS) + long bits = sysconf(_SC_KERNEL_BITS); + #endif + long cpu = sysconf (_SC_CPU_VERSION); + + switch (cpu) + { + case CPU_PA_RISC1_0: puts ("hppa1.0"); break; + case CPU_PA_RISC1_1: puts ("hppa1.1"); break; + case CPU_PA_RISC2_0: + #if defined(_SC_KERNEL_BITS) + switch (bits) + { + case 64: puts ("hppa2.0w"); break; + case 32: puts ("hppa2.0n"); break; + default: puts ("hppa2.0"); break; + } break; + #else /* !defined(_SC_KERNEL_BITS) */ + puts ("hppa2.0"); break; + #endif + default: puts ("hppa1.0"); break; + } + exit (0); + } +EOF + (CCOPTS= $CC_FOR_BUILD -o $dummy $dummy.c 2>/dev/null) && HP_ARCH=`$dummy` + test -z "$HP_ARCH" && HP_ARCH=hppa + fi ;; + esac + if [ ${HP_ARCH} = "hppa2.0w" ] + then + eval $set_cc_for_build + + # hppa2.0w-hp-hpux* has a 64-bit kernel and a compiler generating + # 32-bit code. 
hppa64-hp-hpux* has the same kernel and a compiler + # generating 64-bit code. GNU and HP use different nomenclature: + # + # $ CC_FOR_BUILD=cc ./config.guess + # => hppa2.0w-hp-hpux11.23 + # $ CC_FOR_BUILD="cc +DA2.0w" ./config.guess + # => hppa64-hp-hpux11.23 + + if echo __LP64__ | (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | + grep -q __LP64__ + then + HP_ARCH="hppa2.0w" + else + HP_ARCH="hppa64" + fi + fi + echo ${HP_ARCH}-hp-hpux${HPUX_REV} + exit ;; + ia64:HP-UX:*:*) + HPUX_REV=`echo ${UNAME_RELEASE}|sed -e 's/[^.]*.[0B]*//'` + echo ia64-hp-hpux${HPUX_REV} + exit ;; + 3050*:HI-UX:*:*) + eval $set_cc_for_build + sed 's/^ //' << EOF >$dummy.c + #include + int + main () + { + long cpu = sysconf (_SC_CPU_VERSION); + /* The order matters, because CPU_IS_HP_MC68K erroneously returns + true for CPU_PA_RISC1_0. CPU_IS_PA_RISC returns correct + results, however. */ + if (CPU_IS_PA_RISC (cpu)) + { + switch (cpu) + { + case CPU_PA_RISC1_0: puts ("hppa1.0-hitachi-hiuxwe2"); break; + case CPU_PA_RISC1_1: puts ("hppa1.1-hitachi-hiuxwe2"); break; + case CPU_PA_RISC2_0: puts ("hppa2.0-hitachi-hiuxwe2"); break; + default: puts ("hppa-hitachi-hiuxwe2"); break; + } + } + else if (CPU_IS_HP_MC68K (cpu)) + puts ("m68k-hitachi-hiuxwe2"); + else puts ("unknown-hitachi-hiuxwe2"); + exit (0); + } +EOF + $CC_FOR_BUILD -o $dummy $dummy.c && SYSTEM_NAME=`$dummy` && + { echo "$SYSTEM_NAME"; exit; } + echo unknown-hitachi-hiuxwe2 + exit ;; + 9000/7??:4.3bsd:*:* | 9000/8?[79]:4.3bsd:*:* ) + echo hppa1.1-hp-bsd + exit ;; + 9000/8??:4.3bsd:*:*) + echo hppa1.0-hp-bsd + exit ;; + *9??*:MPE/iX:*:* | *3000*:MPE/iX:*:*) + echo hppa1.0-hp-mpeix + exit ;; + hp7??:OSF1:*:* | hp8?[79]:OSF1:*:* ) + echo hppa1.1-hp-osf + exit ;; + hp8??:OSF1:*:*) + echo hppa1.0-hp-osf + exit ;; + i*86:OSF1:*:*) + if [ -x /usr/sbin/sysversion ] ; then + echo ${UNAME_MACHINE}-unknown-osf1mk + else + echo ${UNAME_MACHINE}-unknown-osf1 + fi + exit ;; + parisc*:Lites*:*:*) + echo hppa1.1-hp-lites + exit ;; + C1*:ConvexOS:*:* | convex:ConvexOS:C1*:*) + echo c1-convex-bsd + exit ;; + C2*:ConvexOS:*:* | convex:ConvexOS:C2*:*) + if getsysinfo -f scalar_acc + then echo c32-convex-bsd + else echo c2-convex-bsd + fi + exit ;; + C34*:ConvexOS:*:* | convex:ConvexOS:C34*:*) + echo c34-convex-bsd + exit ;; + C38*:ConvexOS:*:* | convex:ConvexOS:C38*:*) + echo c38-convex-bsd + exit ;; + C4*:ConvexOS:*:* | convex:ConvexOS:C4*:*) + echo c4-convex-bsd + exit ;; + CRAY*Y-MP:*:*:*) + echo ymp-cray-unicos${UNAME_RELEASE} | sed -e 's/\.[^.]*$/.X/' + exit ;; + CRAY*[A-Z]90:*:*:*) + echo ${UNAME_MACHINE}-cray-unicos${UNAME_RELEASE} \ + | sed -e 's/CRAY.*\([A-Z]90\)/\1/' \ + -e y/ABCDEFGHIJKLMNOPQRSTUVWXYZ/abcdefghijklmnopqrstuvwxyz/ \ + -e 's/\.[^.]*$/.X/' + exit ;; + CRAY*TS:*:*:*) + echo t90-cray-unicos${UNAME_RELEASE} | sed -e 's/\.[^.]*$/.X/' + exit ;; + CRAY*T3E:*:*:*) + echo alphaev5-cray-unicosmk${UNAME_RELEASE} | sed -e 's/\.[^.]*$/.X/' + exit ;; + CRAY*SV1:*:*:*) + echo sv1-cray-unicos${UNAME_RELEASE} | sed -e 's/\.[^.]*$/.X/' + exit ;; + *:UNICOS/mp:*:*) + echo craynv-cray-unicosmp${UNAME_RELEASE} | sed -e 's/\.[^.]*$/.X/' + exit ;; + F30[01]:UNIX_System_V:*:* | F700:UNIX_System_V:*:*) + FUJITSU_PROC=`uname -m | tr 'ABCDEFGHIJKLMNOPQRSTUVWXYZ' 'abcdefghijklmnopqrstuvwxyz'` + FUJITSU_SYS=`uname -p | tr 'ABCDEFGHIJKLMNOPQRSTUVWXYZ' 'abcdefghijklmnopqrstuvwxyz' | sed -e 's/\///'` + FUJITSU_REL=`echo ${UNAME_RELEASE} | sed -e 's/ /_/'` + echo "${FUJITSU_PROC}-fujitsu-${FUJITSU_SYS}${FUJITSU_REL}" + exit ;; + 5000:UNIX_System_V:4.*:*) + FUJITSU_SYS=`uname -p | tr 
'ABCDEFGHIJKLMNOPQRSTUVWXYZ' 'abcdefghijklmnopqrstuvwxyz' | sed -e 's/\///'` + FUJITSU_REL=`echo ${UNAME_RELEASE} | tr 'ABCDEFGHIJKLMNOPQRSTUVWXYZ' 'abcdefghijklmnopqrstuvwxyz' | sed -e 's/ /_/'` + echo "sparc-fujitsu-${FUJITSU_SYS}${FUJITSU_REL}" + exit ;; + i*86:BSD/386:*:* | i*86:BSD/OS:*:* | *:Ascend\ Embedded/OS:*:*) + echo ${UNAME_MACHINE}-pc-bsdi${UNAME_RELEASE} + exit ;; + sparc*:BSD/OS:*:*) + echo sparc-unknown-bsdi${UNAME_RELEASE} + exit ;; + *:BSD/OS:*:*) + echo ${UNAME_MACHINE}-unknown-bsdi${UNAME_RELEASE} + exit ;; + *:FreeBSD:*:*) + UNAME_PROCESSOR=`/usr/bin/uname -p` + case ${UNAME_PROCESSOR} in + amd64) + echo x86_64-unknown-freebsd`echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'` ;; + *) + echo ${UNAME_PROCESSOR}-unknown-freebsd`echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'` ;; + esac + exit ;; + i*:CYGWIN*:*) + echo ${UNAME_MACHINE}-pc-cygwin + exit ;; + *:MINGW64*:*) + echo ${UNAME_MACHINE}-pc-mingw64 + exit ;; + *:MINGW*:*) + echo ${UNAME_MACHINE}-pc-mingw32 + exit ;; + *:MSYS*:*) + echo ${UNAME_MACHINE}-pc-msys + exit ;; + i*:windows32*:*) + # uname -m includes "-pc" on this system. + echo ${UNAME_MACHINE}-mingw32 + exit ;; + i*:PW*:*) + echo ${UNAME_MACHINE}-pc-pw32 + exit ;; + *:Interix*:*) + case ${UNAME_MACHINE} in + x86) + echo i586-pc-interix${UNAME_RELEASE} + exit ;; + authenticamd | genuineintel | EM64T) + echo x86_64-unknown-interix${UNAME_RELEASE} + exit ;; + IA64) + echo ia64-unknown-interix${UNAME_RELEASE} + exit ;; + esac ;; + [345]86:Windows_95:* | [345]86:Windows_98:* | [345]86:Windows_NT:*) + echo i${UNAME_MACHINE}-pc-mks + exit ;; + 8664:Windows_NT:*) + echo x86_64-pc-mks + exit ;; + i*:Windows_NT*:* | Pentium*:Windows_NT*:*) + # How do we know it's Interix rather than the generic POSIX subsystem? + # It also conflicts with pre-2.0 versions of AT&T UWIN. Should we + # UNAME_MACHINE based on the output of uname instead of i386? + echo i586-pc-interix + exit ;; + i*:UWIN*:*) + echo ${UNAME_MACHINE}-pc-uwin + exit ;; + amd64:CYGWIN*:*:* | x86_64:CYGWIN*:*:*) + echo x86_64-unknown-cygwin + exit ;; + p*:CYGWIN*:*) + echo powerpcle-unknown-cygwin + exit ;; + prep*:SunOS:5.*:*) + echo powerpcle-unknown-solaris2`echo ${UNAME_RELEASE}|sed -e 's/[^.]*//'` + exit ;; + *:GNU:*:*) + # the GNU system + echo `echo ${UNAME_MACHINE}|sed -e 's,[-/].*$,,'`-unknown-${LIBC}`echo ${UNAME_RELEASE}|sed -e 's,/.*$,,'` + exit ;; + *:GNU/*:*:*) + # other systems with GNU libc and userland + echo ${UNAME_MACHINE}-unknown-`echo ${UNAME_SYSTEM} | sed 's,^[^/]*/,,' | tr '[A-Z]' '[a-z]'``echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`-${LIBC} + exit ;; + i*86:Minix:*:*) + echo ${UNAME_MACHINE}-pc-minix + exit ;; + aarch64:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + aarch64_be:Linux:*:*) + UNAME_MACHINE=aarch64_be + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + alpha:Linux:*:*) + case `sed -n '/^cpu model/s/^.*: \(.*\)/\1/p' < /proc/cpuinfo` in + EV5) UNAME_MACHINE=alphaev5 ;; + EV56) UNAME_MACHINE=alphaev56 ;; + PCA56) UNAME_MACHINE=alphapca56 ;; + PCA57) UNAME_MACHINE=alphapca56 ;; + EV6) UNAME_MACHINE=alphaev6 ;; + EV67) UNAME_MACHINE=alphaev67 ;; + EV68*) UNAME_MACHINE=alphaev68 ;; + esac + objdump --private-headers /bin/sh | grep -q ld.so.1 + if test "$?" 
= 0 ; then LIBC="gnulibc1" ; fi + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + arc:Linux:*:* | arceb:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + arm*:Linux:*:*) + eval $set_cc_for_build + if echo __ARM_EABI__ | $CC_FOR_BUILD -E - 2>/dev/null \ + | grep -q __ARM_EABI__ + then + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + else + if echo __ARM_PCS_VFP | $CC_FOR_BUILD -E - 2>/dev/null \ + | grep -q __ARM_PCS_VFP + then + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabi + else + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabihf + fi + fi + exit ;; + avr32*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + cris:Linux:*:*) + echo ${UNAME_MACHINE}-axis-linux-${LIBC} + exit ;; + crisv32:Linux:*:*) + echo ${UNAME_MACHINE}-axis-linux-${LIBC} + exit ;; + e2k:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + frv:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + hexagon:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + i*86:Linux:*:*) + echo ${UNAME_MACHINE}-pc-linux-${LIBC} + exit ;; + ia64:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + m32r*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + m68*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + mips:Linux:*:* | mips64:Linux:*:*) + eval $set_cc_for_build + sed 's/^ //' << EOF >$dummy.c + #undef CPU + #undef ${UNAME_MACHINE} + #undef ${UNAME_MACHINE}el + #if defined(__MIPSEL__) || defined(__MIPSEL) || defined(_MIPSEL) || defined(MIPSEL) + CPU=${UNAME_MACHINE}el + #else + #if defined(__MIPSEB__) || defined(__MIPSEB) || defined(_MIPSEB) || defined(MIPSEB) + CPU=${UNAME_MACHINE} + #else + CPU= + #endif + #endif +EOF + eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^CPU'` + test x"${CPU}" != x && { echo "${CPU}-unknown-linux-${LIBC}"; exit; } + ;; + openrisc*:Linux:*:*) + echo or1k-unknown-linux-${LIBC} + exit ;; + or32:Linux:*:* | or1k*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + padre:Linux:*:*) + echo sparc-unknown-linux-${LIBC} + exit ;; + parisc64:Linux:*:* | hppa64:Linux:*:*) + echo hppa64-unknown-linux-${LIBC} + exit ;; + parisc:Linux:*:* | hppa:Linux:*:*) + # Look for CPU level + case `grep '^cpu[^a-z]*:' /proc/cpuinfo 2>/dev/null | cut -d' ' -f2` in + PA7*) echo hppa1.1-unknown-linux-${LIBC} ;; + PA8*) echo hppa2.0-unknown-linux-${LIBC} ;; + *) echo hppa-unknown-linux-${LIBC} ;; + esac + exit ;; + ppc64:Linux:*:*) + echo powerpc64-unknown-linux-${LIBC} + exit ;; + ppc:Linux:*:*) + echo powerpc-unknown-linux-${LIBC} + exit ;; + ppc64le:Linux:*:*) + echo powerpc64le-unknown-linux-${LIBC} + exit ;; + ppcle:Linux:*:*) + echo powerpcle-unknown-linux-${LIBC} + exit ;; + s390:Linux:*:* | s390x:Linux:*:*) + echo ${UNAME_MACHINE}-ibm-linux-${LIBC} + exit ;; + sh64*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + sh*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + sparc:Linux:*:* | sparc64:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + tile*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + vax:Linux:*:*) + echo ${UNAME_MACHINE}-dec-linux-${LIBC} + exit ;; + x86_64:Linux:*:*) + echo ${UNAME_MACHINE}-pc-linux-${LIBC} + exit ;; + xtensa*:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + i*86:DYNIX/ptx:4*:*) + # ptx 4.0 does uname -s correctly, with DYNIX/ptx in there. + # earlier versions are messed up and put the nodename in both + # sysname and nodename. 
+ echo i386-sequent-sysv4
+ exit ;;
+ i*86:UNIX_SV:4.2MP:2.*)
+ # Unixware is an offshoot of SVR4, but it has its own version
+ # number series starting with 2...
+ # I am not positive that other SVR4 systems won't match this,
+ # I just have to hope. -- rms.
+ # Use sysv4.2uw... so that sysv4* matches it.
+ echo ${UNAME_MACHINE}-pc-sysv4.2uw${UNAME_VERSION}
+ exit ;;
+ i*86:OS/2:*:*)
+ # If we were able to find `uname', then EMX Unix compatibility
+ # is probably installed.
+ echo ${UNAME_MACHINE}-pc-os2-emx
+ exit ;;
+ i*86:XTS-300:*:STOP)
+ echo ${UNAME_MACHINE}-unknown-stop
+ exit ;;
+ i*86:atheos:*:*)
+ echo ${UNAME_MACHINE}-unknown-atheos
+ exit ;;
+ i*86:syllable:*:*)
+ echo ${UNAME_MACHINE}-pc-syllable
+ exit ;;
+ i*86:LynxOS:2.*:* | i*86:LynxOS:3.[01]*:* | i*86:LynxOS:4.[02]*:*)
+ echo i386-unknown-lynxos${UNAME_RELEASE}
+ exit ;;
+ i*86:*DOS:*:*)
+ echo ${UNAME_MACHINE}-pc-msdosdjgpp
+ exit ;;
+ i*86:*:4.*:* | i*86:SYSTEM_V:4.*:*)
+ UNAME_REL=`echo ${UNAME_RELEASE} | sed 's/\/MP$//'`
+ if grep Novell /usr/include/link.h >/dev/null 2>/dev/null; then
+ echo ${UNAME_MACHINE}-univel-sysv${UNAME_REL}
+ else
+ echo ${UNAME_MACHINE}-pc-sysv${UNAME_REL}
+ fi
+ exit ;;
+ i*86:*:5:[678]*)
+ # UnixWare 7.x, OpenUNIX and OpenServer 6.
+ case `/bin/uname -X | grep "^Machine"` in
+ *486*) UNAME_MACHINE=i486 ;;
+ *Pentium) UNAME_MACHINE=i586 ;;
+ *Pent*|*Celeron) UNAME_MACHINE=i686 ;;
+ esac
+ echo ${UNAME_MACHINE}-unknown-sysv${UNAME_RELEASE}${UNAME_SYSTEM}${UNAME_VERSION}
+ exit ;;
+ i*86:*:3.2:*)
+ if test -f /usr/options/cb.name; then
+ UNAME_REL=`sed -n 's/.*Version //p' </usr/options/cb.name`
+ echo ${UNAME_MACHINE}-pc-isc$UNAME_REL
+ elif /bin/uname -X 2>/dev/null >/dev/null ; then
+ UNAME_REL=`(/bin/uname -X|grep Release|sed -e 's/.*= //')`
+ (/bin/uname -X|grep i80486 >/dev/null) && UNAME_MACHINE=i486
+ (/bin/uname -X|grep '^Machine.*Pentium' >/dev/null) \
+ && UNAME_MACHINE=i586
+ (/bin/uname -X|grep '^Machine.*Pent *II' >/dev/null) \
+ && UNAME_MACHINE=i686
+ (/bin/uname -X|grep '^Machine.*Pentium Pro' >/dev/null) \
+ && UNAME_MACHINE=i686
+ echo ${UNAME_MACHINE}-pc-sco$UNAME_REL
+ else
+ echo ${UNAME_MACHINE}-pc-sysv32
+ fi
+ exit ;;
+ pc:*:*:*)
+ # Left here for compatibility:
+ # uname -m prints for DJGPP always 'pc', but it prints nothing about
+ # the processor, so we play safe by assuming i586.
+ # Note: whatever this is, it MUST be the same as what config.sub
+ # prints for the "djgpp" host, or else GDB configury will decide that
+ # this is a cross-build.
+ echo i586-pc-msdosdjgpp
+ exit ;;
+ Intel:Mach:3*:*)
+ echo i386-pc-mach3
+ exit ;;
+ paragon:*:*:*)
+ echo i860-intel-osf1
+ exit ;;
+ i860:*:4.*:*) # i860-SVR4
+ if grep Stardent /usr/include/sys/uadmin.h >/dev/null 2>&1 ; then
+ echo i860-stardent-sysv${UNAME_RELEASE} # Stardent Vistra i860-SVR4
+ else # Add other i860-SVR4 vendors below as they are discovered.
+ echo i860-unknown-sysv${UNAME_RELEASE} # Unknown i860-SVR4 + fi + exit ;; + mini*:CTIX:SYS*5:*) + # "miniframe" + echo m68010-convergent-sysv + exit ;; + mc68k:UNIX:SYSTEM5:3.51m) + echo m68k-convergent-sysv + exit ;; + M680?0:D-NIX:5.3:*) + echo m68k-diab-dnix + exit ;; + M68*:*:R3V[5678]*:*) + test -r /sysV68 && { echo 'm68k-motorola-sysv'; exit; } ;; + 3[345]??:*:4.0:3.0 | 3[34]??A:*:4.0:3.0 | 3[34]??,*:*:4.0:3.0 | 3[34]??/*:*:4.0:3.0 | 4400:*:4.0:3.0 | 4850:*:4.0:3.0 | SKA40:*:4.0:3.0 | SDS2:*:4.0:3.0 | SHG2:*:4.0:3.0 | S7501*:*:4.0:3.0) + OS_REL='' + test -r /etc/.relid \ + && OS_REL=.`sed -n 's/[^ ]* [^ ]* \([0-9][0-9]\).*/\1/p' < /etc/.relid` + /bin/uname -p 2>/dev/null | grep 86 >/dev/null \ + && { echo i486-ncr-sysv4.3${OS_REL}; exit; } + /bin/uname -p 2>/dev/null | /bin/grep entium >/dev/null \ + && { echo i586-ncr-sysv4.3${OS_REL}; exit; } ;; + 3[34]??:*:4.0:* | 3[34]??,*:*:4.0:*) + /bin/uname -p 2>/dev/null | grep 86 >/dev/null \ + && { echo i486-ncr-sysv4; exit; } ;; + NCR*:*:4.2:* | MPRAS*:*:4.2:*) + OS_REL='.3' + test -r /etc/.relid \ + && OS_REL=.`sed -n 's/[^ ]* [^ ]* \([0-9][0-9]\).*/\1/p' < /etc/.relid` + /bin/uname -p 2>/dev/null | grep 86 >/dev/null \ + && { echo i486-ncr-sysv4.3${OS_REL}; exit; } + /bin/uname -p 2>/dev/null | /bin/grep entium >/dev/null \ + && { echo i586-ncr-sysv4.3${OS_REL}; exit; } + /bin/uname -p 2>/dev/null | /bin/grep pteron >/dev/null \ + && { echo i586-ncr-sysv4.3${OS_REL}; exit; } ;; + m68*:LynxOS:2.*:* | m68*:LynxOS:3.0*:*) + echo m68k-unknown-lynxos${UNAME_RELEASE} + exit ;; + mc68030:UNIX_System_V:4.*:*) + echo m68k-atari-sysv4 + exit ;; + TSUNAMI:LynxOS:2.*:*) + echo sparc-unknown-lynxos${UNAME_RELEASE} + exit ;; + rs6000:LynxOS:2.*:*) + echo rs6000-unknown-lynxos${UNAME_RELEASE} + exit ;; + PowerPC:LynxOS:2.*:* | PowerPC:LynxOS:3.[01]*:* | PowerPC:LynxOS:4.[02]*:*) + echo powerpc-unknown-lynxos${UNAME_RELEASE} + exit ;; + SM[BE]S:UNIX_SV:*:*) + echo mips-dde-sysv${UNAME_RELEASE} + exit ;; + RM*:ReliantUNIX-*:*:*) + echo mips-sni-sysv4 + exit ;; + RM*:SINIX-*:*:*) + echo mips-sni-sysv4 + exit ;; + *:SINIX-*:*:*) + if uname -p 2>/dev/null >/dev/null ; then + UNAME_MACHINE=`(uname -p) 2>/dev/null` + echo ${UNAME_MACHINE}-sni-sysv4 + else + echo ns32k-sni-sysv + fi + exit ;; + PENTIUM:*:4.0*:*) # Unisys `ClearPath HMP IX 4000' SVR4/MP effort + # says + echo i586-unisys-sysv4 + exit ;; + *:UNIX_System_V:4*:FTX*) + # From Gerald Hewes . + # How about differentiating between stratus architectures? -djm + echo hppa1.1-stratus-sysv4 + exit ;; + *:*:*:FTX*) + # From seanf@swdc.stratus.com. + echo i860-stratus-sysv4 + exit ;; + i*86:VOS:*:*) + # From Paul.Green@stratus.com. + echo ${UNAME_MACHINE}-stratus-vos + exit ;; + *:VOS:*:*) + # From Paul.Green@stratus.com. + echo hppa1.1-stratus-vos + exit ;; + mc68*:A/UX:*:*) + echo m68k-apple-aux${UNAME_RELEASE} + exit ;; + news*:NEWS-OS:6*:*) + echo mips-sony-newsos6 + exit ;; + R[34]000:*System_V*:*:* | R4000:UNIX_SYSV:*:* | R*000:UNIX_SV:*:*) + if [ -d /usr/nec ]; then + echo mips-nec-sysv${UNAME_RELEASE} + else + echo mips-unknown-sysv${UNAME_RELEASE} + fi + exit ;; + BeBox:BeOS:*:*) # BeOS running on hardware made by Be, PPC only. + echo powerpc-be-beos + exit ;; + BeMac:BeOS:*:*) # BeOS running on Mac or Mac clone, PPC only. + echo powerpc-apple-beos + exit ;; + BePC:BeOS:*:*) # BeOS running on Intel PC compatible. + echo i586-pc-beos + exit ;; + BePC:Haiku:*:*) # Haiku running on Intel PC compatible. 
+ echo i586-pc-haiku + exit ;; + x86_64:Haiku:*:*) + echo x86_64-unknown-haiku + exit ;; + SX-4:SUPER-UX:*:*) + echo sx4-nec-superux${UNAME_RELEASE} + exit ;; + SX-5:SUPER-UX:*:*) + echo sx5-nec-superux${UNAME_RELEASE} + exit ;; + SX-6:SUPER-UX:*:*) + echo sx6-nec-superux${UNAME_RELEASE} + exit ;; + SX-7:SUPER-UX:*:*) + echo sx7-nec-superux${UNAME_RELEASE} + exit ;; + SX-8:SUPER-UX:*:*) + echo sx8-nec-superux${UNAME_RELEASE} + exit ;; + SX-8R:SUPER-UX:*:*) + echo sx8r-nec-superux${UNAME_RELEASE} + exit ;; + Power*:Rhapsody:*:*) + echo powerpc-apple-rhapsody${UNAME_RELEASE} + exit ;; + *:Rhapsody:*:*) + echo ${UNAME_MACHINE}-apple-rhapsody${UNAME_RELEASE} + exit ;; + *:Darwin:*:*) + UNAME_PROCESSOR=`uname -p` || UNAME_PROCESSOR=unknown + eval $set_cc_for_build + if test "$UNAME_PROCESSOR" = unknown ; then + UNAME_PROCESSOR=powerpc + fi + if test `echo "$UNAME_RELEASE" | sed -e 's/\..*//'` -le 10 ; then + if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then + if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \ + (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ + grep IS_64BIT_ARCH >/dev/null + then + case $UNAME_PROCESSOR in + i386) UNAME_PROCESSOR=x86_64 ;; + powerpc) UNAME_PROCESSOR=powerpc64 ;; + esac + fi + fi + elif test "$UNAME_PROCESSOR" = i386 ; then + # Avoid executing cc on OS X 10.9, as it ships with a stub + # that puts up a graphical alert prompting to install + # developer tools. Any system running Mac OS X 10.7 or + # later (Darwin 11 and later) is required to have a 64-bit + # processor. This is not true of the ARM version of Darwin + # that Apple uses in portable devices. + UNAME_PROCESSOR=x86_64 + fi + echo ${UNAME_PROCESSOR}-apple-darwin${UNAME_RELEASE} + exit ;; + *:procnto*:*:* | *:QNX:[0123456789]*:*) + UNAME_PROCESSOR=`uname -p` + if test "$UNAME_PROCESSOR" = "x86"; then + UNAME_PROCESSOR=i386 + UNAME_MACHINE=pc + fi + echo ${UNAME_PROCESSOR}-${UNAME_MACHINE}-nto-qnx${UNAME_RELEASE} + exit ;; + *:QNX:*:4*) + echo i386-pc-qnx + exit ;; + NEO-?:NONSTOP_KERNEL:*:*) + echo neo-tandem-nsk${UNAME_RELEASE} + exit ;; + NSE-*:NONSTOP_KERNEL:*:*) + echo nse-tandem-nsk${UNAME_RELEASE} + exit ;; + NSR-?:NONSTOP_KERNEL:*:*) + echo nsr-tandem-nsk${UNAME_RELEASE} + exit ;; + *:NonStop-UX:*:*) + echo mips-compaq-nonstopux + exit ;; + BS2000:POSIX*:*:*) + echo bs2000-siemens-sysv + exit ;; + DS/*:UNIX_System_V:*:*) + echo ${UNAME_MACHINE}-${UNAME_SYSTEM}-${UNAME_RELEASE} + exit ;; + *:Plan9:*:*) + # "uname -m" is not consistent, so use $cputype instead. 386 + # is converted to i386 for consistency with other x86 + # operating systems. 
+ if test "$cputype" = "386"; then
+ UNAME_MACHINE=i386
+ else
+ UNAME_MACHINE="$cputype"
+ fi
+ echo ${UNAME_MACHINE}-unknown-plan9
+ exit ;;
+ *:TOPS-10:*:*)
+ echo pdp10-unknown-tops10
+ exit ;;
+ *:TENEX:*:*)
+ echo pdp10-unknown-tenex
+ exit ;;
+ KS10:TOPS-20:*:* | KL10:TOPS-20:*:* | TYPE4:TOPS-20:*:*)
+ echo pdp10-dec-tops20
+ exit ;;
+ XKL-1:TOPS-20:*:* | TYPE5:TOPS-20:*:*)
+ echo pdp10-xkl-tops20
+ exit ;;
+ *:TOPS-20:*:*)
+ echo pdp10-unknown-tops20
+ exit ;;
+ *:ITS:*:*)
+ echo pdp10-unknown-its
+ exit ;;
+ SEI:*:*:SEIUX)
+ echo mips-sei-seiux${UNAME_RELEASE}
+ exit ;;
+ *:DragonFly:*:*)
+ echo ${UNAME_MACHINE}-unknown-dragonfly`echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`
+ exit ;;
+ *:*VMS:*:*)
+ UNAME_MACHINE=`(uname -p) 2>/dev/null`
+ case "${UNAME_MACHINE}" in
+ A*) echo alpha-dec-vms ; exit ;;
+ I*) echo ia64-dec-vms ; exit ;;
+ V*) echo vax-dec-vms ; exit ;;
+ esac ;;
+ *:XENIX:*:SysV)
+ echo i386-pc-xenix
+ exit ;;
+ i*86:skyos:*:*)
+ echo ${UNAME_MACHINE}-pc-skyos`echo ${UNAME_RELEASE}` | sed -e 's/ .*$//'
+ exit ;;
+ i*86:rdos:*:*)
+ echo ${UNAME_MACHINE}-pc-rdos
+ exit ;;
+ i*86:AROS:*:*)
+ echo ${UNAME_MACHINE}-pc-aros
+ exit ;;
+ x86_64:VMkernel:*:*)
+ echo ${UNAME_MACHINE}-unknown-esx
+ exit ;;
+esac
+
+cat >&2 <<EOF
+$0: unable to guess system type
+
+This script, last modified $timestamp, has failed to recognize
+the operating system you are using. It is advised that you
+download the most up to date version of the config scripts from
+
+  http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess;hb=HEAD
+and
+  http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.sub;hb=HEAD
+
+If the version you run ($0) is already up to date, please
+send the following data and any information you think might be
+pertinent to <config-patches@gnu.org> in order to provide the needed
+information to handle your system.
+
+config.guess timestamp = $timestamp
+
+uname -m = `(uname -m) 2>/dev/null || echo unknown`
+uname -r = `(uname -r) 2>/dev/null || echo unknown`
+uname -s = `(uname -s) 2>/dev/null || echo unknown`
+uname -v = `(uname -v) 2>/dev/null || echo unknown`
+
+/usr/bin/uname -p = `(/usr/bin/uname -p) 2>/dev/null`
+/bin/uname -X = `(/bin/uname -X) 2>/dev/null`
+
+hostinfo = `(hostinfo) 2>/dev/null`
+/bin/universe = `(/bin/universe) 2>/dev/null`
+/usr/bin/arch -k = `(/usr/bin/arch -k) 2>/dev/null`
+/bin/arch = `(/bin/arch) 2>/dev/null`
+/usr/bin/oslevel = `(/usr/bin/oslevel) 2>/dev/null`
+/usr/convex/getsysinfo = `(/usr/convex/getsysinfo) 2>/dev/null`
+
+UNAME_MACHINE = ${UNAME_MACHINE}
+UNAME_RELEASE = ${UNAME_RELEASE}
+UNAME_SYSTEM = ${UNAME_SYSTEM}
+UNAME_VERSION = ${UNAME_VERSION}
+EOF
+
+exit 1
+
+# Local variables:
+# eval: (add-hook 'write-file-hooks 'time-stamp)
+# time-stamp-start: "timestamp='"
+# time-stamp-format: "%:y-%02m-%02d"
+# time-stamp-end: "'"
+# End:
diff --git a/config.h.in b/config.h.in
new file mode 100644
index 0000000..a6cfd44
--- /dev/null
+++ b/config.h.in
@@ -0,0 +1,148 @@
+/* config.h.in. Generated from configure.ac by autoheader. */
+
+/* Defined if advanced statistics are enabled. */
+#undef ADVANCED_STATS
+
+/* Defined if debug mode is enabled. */
+#undef DEBUG
+
+/* Regular expressions support enabled. */
+#undef FEATURE_REGEXP
+
+/* Define to 1 if you have the declaration of `strerror_r', and to 0 if you
+ don't. */
+#undef HAVE_DECL_STRERROR_R
+
+/* Define to 1 if you have the <dlfcn.h> header file. */
+#undef HAVE_DLFCN_H
+
+/* Define to 1 if you have the <inttypes.h> header file. */
+#undef HAVE_INTTYPES_H
+
+/* Define to 1 if you have the <memory.h> header file. */
+#undef HAVE_MEMORY_H
+
+/* Define to 1 if you have the <stdint.h> header file. */
+#undef HAVE_STDINT_H
+
+/* Define to 1 if you have the <stdlib.h> header file. */
+#undef HAVE_STDLIB_H
+
+/* Define to 1 if you have the `strdup' function. */
+#undef HAVE_STRDUP
+
+/* Define to 1 if you have the `strerror_r' function. */
+#undef HAVE_STRERROR_R
+
+/* Define to 1 if you have the <strings.h> header file. */
+#undef HAVE_STRINGS_H
+
+/* Define to 1 if you have the <string.h> header file. */
+#undef HAVE_STRING_H
+
+/* Define to 1 if you have the `strndup' function. */
+#undef HAVE_STRNDUP
+
+/* Define to 1 if you have the `strtok_r' function. */
+#undef HAVE_STRTOK_R
+
+/* Define to 1 if you have the <sys/select.h> header file. */
+#undef HAVE_SYS_SELECT_H
+
+/* Define to 1 if you have the <sys/socket.h> header file. */
+#undef HAVE_SYS_SOCKET_H
+
+/* Define to 1 if you have the <sys/stat.h> header file. */
+#undef HAVE_SYS_STAT_H
+
+/* Define to 1 if you have the <sys/types.h> header file. */
+#undef HAVE_SYS_TYPES_H
+
+/* Define to 1 if you have the <unistd.h> header file. */
+#undef HAVE_UNISTD_H
+
+/* Define to the sub-directory where libtool stores uninstalled libraries. */
+#undef LT_OBJDIR
+
+/* Defined if debug mode is disabled. */
+#undef NDEBUG
+
+/* Name of package */
+#undef PACKAGE
+
+/* Define to the address where bug reports for this package should be sent. */
+#undef PACKAGE_BUGREPORT
+
+/* Define to the full name of this package. */
+#undef PACKAGE_NAME
+
+/* Define to the full name and version of this package. */
+#undef PACKAGE_STRING
+
+/* Define to the one symbol short name of this package. */
+#undef PACKAGE_TARNAME
+
+/* Define to the home page for this package. */
+#undef PACKAGE_URL
+
+/* Define to the version of this package. */
+#undef PACKAGE_VERSION
+
+/* Define as the return type of signal handlers (`int' or `void'). */
+#undef RETSIGTYPE
+
+/* Define to the type of arg 1 for `select'. */
+#undef SELECT_TYPE_ARG1
+
+/* Define to the type of args 2, 3 and 4 for `select'. */
+#undef SELECT_TYPE_ARG234
+
+/* Define to the type of arg 5 for `select'. */
+#undef SELECT_TYPE_ARG5
+
+/* Define to 1 if you have the ANSI C header files. */
+#undef STDC_HEADERS
+
+/* Define to 1 if strerror_r returns char *. */
+#undef STRERROR_R_CHAR_P
+
+/* Enable extensions on AIX 3, Interix. */
+#ifndef _ALL_SOURCE
+# undef _ALL_SOURCE
+#endif
+/* Enable GNU extensions on systems that have them. */
+#ifndef _GNU_SOURCE
+# undef _GNU_SOURCE
+#endif
+/* Enable threading extensions on Solaris. */
+#ifndef _POSIX_PTHREAD_SEMANTICS
+# undef _POSIX_PTHREAD_SEMANTICS
+#endif
+/* Enable extensions on HP NonStop. */
+#ifndef _TANDEM_SOURCE
+# undef _TANDEM_SOURCE
+#endif
+/* Enable general extensions on Solaris. */
+#ifndef __EXTENSIONS__
+# undef __EXTENSIONS__
+#endif
+
+
+/* Version number of package */
+#undef VERSION
+
+/* Define to 1 if on MINIX. */
+#undef _MINIX
+
+/* Define to 2 if the system does not provide POSIX.1 features except with
+ this defined. */
+#undef _POSIX_1_SOURCE
+
+/* Define to 1 if you need to in order for `stat' and other things to work. */
+#undef _POSIX_SOURCE
+
+/* Define to empty if `const' does not conform to ANSI C. */
+#undef const
+
+/* Define to `unsigned int' if <sys/types.h> does not define. */
+#undef size_t
diff --git a/config.sub b/config.sub
new file mode 100755
index 0000000..1acc966
--- /dev/null
+++ b/config.sub
@@ -0,0 +1,1813 @@
+#! /bin/sh
+# Configuration validation subroutine script.
+# Copyright 1992-2015 Free Software Foundation, Inc.
+
+timestamp='2015-08-20'
+
+# This file is free software; you can redistribute it and/or modify it
+# under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+# General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, see <http://www.gnu.org/licenses/>.
+# +# As a special exception to the GNU General Public License, if you +# distribute this file as part of a program that contains a +# configuration script generated by Autoconf, you may include it under +# the same distribution terms that you use for the rest of that +# program. This Exception is an additional permission under section 7 +# of the GNU General Public License, version 3 ("GPLv3"). + + +# Please send patches to . +# +# Configuration subroutine to validate and canonicalize a configuration type. +# Supply the specified configuration type as an argument. +# If it is invalid, we print an error message on stderr and exit with code 1. +# Otherwise, we print the canonical config type on stdout and succeed. + +# You can get the latest version of this script from: +# http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.sub;hb=HEAD + +# This file is supposed to be the same for all GNU packages +# and recognize all the CPU types, system types and aliases +# that are meaningful with *any* GNU software. +# Each package is responsible for reporting which valid configurations +# it does not support. The user should be able to distinguish +# a failure to support a valid configuration from a meaningless +# configuration. + +# The goal of this file is to map all the various variations of a given +# machine specification into a single specification in the form: +# CPU_TYPE-MANUFACTURER-OPERATING_SYSTEM +# or in some cases, the newer four-part form: +# CPU_TYPE-MANUFACTURER-KERNEL-OPERATING_SYSTEM +# It is wrong to echo any other type of specification. + +me=`echo "$0" | sed -e 's,.*/,,'` + +usage="\ +Usage: $0 [OPTION] CPU-MFR-OPSYS + $0 [OPTION] ALIAS + +Canonicalize a configuration name. + +Operation modes: + -h, --help print this help, then exit + -t, --time-stamp print date of last modification, then exit + -v, --version print version number, then exit + +Report bugs and patches to ." + +version="\ +GNU config.sub ($timestamp) + +Copyright 1992-2015 Free Software Foundation, Inc. + +This is free software; see the source for copying conditions. There is NO +warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." + +help=" +Try \`$me --help' for more information." + +# Parse command line +while test $# -gt 0 ; do + case $1 in + --time-stamp | --time* | -t ) + echo "$timestamp" ; exit ;; + --version | -v ) + echo "$version" ; exit ;; + --help | --h* | -h ) + echo "$usage"; exit ;; + -- ) # Stop option processing + shift; break ;; + - ) # Use stdin as input. + break ;; + -* ) + echo "$me: invalid option $1$help" + exit 1 ;; + + *local*) + # First pass through any local machine types. + echo $1 + exit ;; + + * ) + break ;; + esac +done + +case $# in + 0) echo "$me: missing argument$help" >&2 + exit 1;; + 1) ;; + *) echo "$me: too many arguments$help" >&2 + exit 1;; +esac + +# Separate what the user gave into CPU-COMPANY and OS or KERNEL-OS (if any). +# Here we must recognize all the valid KERNEL-OS combinations. 
+maybe_os=`echo $1 | sed 's/^\(.*\)-\([^-]*-[^-]*\)$/\2/'` +case $maybe_os in + nto-qnx* | linux-gnu* | linux-android* | linux-dietlibc | linux-newlib* | \ + linux-musl* | linux-uclibc* | uclinux-uclibc* | uclinux-gnu* | kfreebsd*-gnu* | \ + knetbsd*-gnu* | netbsd*-gnu* | netbsd*-eabi* | \ + kopensolaris*-gnu* | \ + storm-chaos* | os2-emx* | rtmk-nova*) + os=-$maybe_os + basic_machine=`echo $1 | sed 's/^\(.*\)-\([^-]*-[^-]*\)$/\1/'` + ;; + android-linux) + os=-linux-android + basic_machine=`echo $1 | sed 's/^\(.*\)-\([^-]*-[^-]*\)$/\1/'`-unknown + ;; + *) + basic_machine=`echo $1 | sed 's/-[^-]*$//'` + if [ $basic_machine != $1 ] + then os=`echo $1 | sed 's/.*-/-/'` + else os=; fi + ;; +esac + +### Let's recognize common machines as not being operating systems so +### that things like config.sub decstation-3100 work. We also +### recognize some manufacturers as not being operating systems, so we +### can provide default operating systems below. +case $os in + -sun*os*) + # Prevent following clause from handling this invalid input. + ;; + -dec* | -mips* | -sequent* | -encore* | -pc532* | -sgi* | -sony* | \ + -att* | -7300* | -3300* | -delta* | -motorola* | -sun[234]* | \ + -unicom* | -ibm* | -next | -hp | -isi* | -apollo | -altos* | \ + -convergent* | -ncr* | -news | -32* | -3600* | -3100* | -hitachi* |\ + -c[123]* | -convex* | -sun | -crds | -omron* | -dg | -ultra | -tti* | \ + -harris | -dolphin | -highlevel | -gould | -cbm | -ns | -masscomp | \ + -apple | -axis | -knuth | -cray | -microblaze*) + os= + basic_machine=$1 + ;; + -bluegene*) + os=-cnk + ;; + -sim | -cisco | -oki | -wec | -winbond) + os= + basic_machine=$1 + ;; + -scout) + ;; + -wrs) + os=-vxworks + basic_machine=$1 + ;; + -chorusos*) + os=-chorusos + basic_machine=$1 + ;; + -chorusrdb) + os=-chorusrdb + basic_machine=$1 + ;; + -hiux*) + os=-hiuxwe2 + ;; + -sco6) + os=-sco5v6 + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -sco5) + os=-sco3.2v5 + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -sco4) + os=-sco3.2v4 + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -sco3.2.[4-9]*) + os=`echo $os | sed -e 's/sco3.2./sco3.2v/'` + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -sco3.2v[4-9]*) + # Don't forget version if it is 3.2v4 or newer. + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -sco5v6*) + # Don't forget version if it is 3.2v4 or newer. + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -sco*) + os=-sco3.2v2 + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -udk*) + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -isc) + os=-isc2.2 + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -clix*) + basic_machine=clipper-intergraph + ;; + -isc*) + basic_machine=`echo $1 | sed -e 's/86-.*/86-pc/'` + ;; + -lynx*178) + os=-lynxos178 + ;; + -lynx*5) + os=-lynxos5 + ;; + -lynx*) + os=-lynxos + ;; + -ptx*) + basic_machine=`echo $1 | sed -e 's/86-.*/86-sequent/'` + ;; + -windowsnt*) + os=`echo $os | sed -e 's/windowsnt/winnt/'` + ;; + -psos*) + os=-psos + ;; + -mint | -mint[0-9]*) + basic_machine=m68k-atari + os=-mint + ;; +esac + +# Decode aliases for certain CPU-COMPANY combinations. +case $basic_machine in + # Recognize the basic CPU types without company name. + # Some are omitted here because they have special meanings below. 
+ 1750a | 580 \ + | a29k \ + | aarch64 | aarch64_be \ + | alpha | alphaev[4-8] | alphaev56 | alphaev6[78] | alphapca5[67] \ + | alpha64 | alpha64ev[4-8] | alpha64ev56 | alpha64ev6[78] | alpha64pca5[67] \ + | am33_2.0 \ + | arc | arceb \ + | arm | arm[bl]e | arme[lb] | armv[2-8] | armv[3-8][lb] | armv7[arm] \ + | avr | avr32 \ + | ba \ + | be32 | be64 \ + | bfin \ + | c4x | c8051 | clipper \ + | d10v | d30v | dlx | dsp16xx \ + | e2k | epiphany \ + | fido | fr30 | frv | ft32 \ + | h8300 | h8500 | hppa | hppa1.[01] | hppa2.0 | hppa2.0[nw] | hppa64 \ + | hexagon \ + | i370 | i860 | i960 | ia64 \ + | ip2k | iq2000 \ + | k1om \ + | le32 | le64 \ + | lm32 \ + | m32c | m32r | m32rle | m68000 | m68k | m88k \ + | maxq | mb | microblaze | microblazeel | mcore | mep | metag \ + | mips | mipsbe | mipseb | mipsel | mipsle \ + | mips16 \ + | mips64 | mips64el \ + | mips64octeon | mips64octeonel \ + | mips64orion | mips64orionel \ + | mips64r5900 | mips64r5900el \ + | mips64vr | mips64vrel \ + | mips64vr4100 | mips64vr4100el \ + | mips64vr4300 | mips64vr4300el \ + | mips64vr5000 | mips64vr5000el \ + | mips64vr5900 | mips64vr5900el \ + | mipsisa32 | mipsisa32el \ + | mipsisa32r2 | mipsisa32r2el \ + | mipsisa32r6 | mipsisa32r6el \ + | mipsisa64 | mipsisa64el \ + | mipsisa64r2 | mipsisa64r2el \ + | mipsisa64r6 | mipsisa64r6el \ + | mipsisa64sb1 | mipsisa64sb1el \ + | mipsisa64sr71k | mipsisa64sr71kel \ + | mipsr5900 | mipsr5900el \ + | mipstx39 | mipstx39el \ + | mn10200 | mn10300 \ + | moxie \ + | mt \ + | msp430 \ + | nds32 | nds32le | nds32be \ + | nios | nios2 | nios2eb | nios2el \ + | ns16k | ns32k \ + | open8 | or1k | or1knd | or32 \ + | pdp10 | pdp11 | pj | pjl \ + | powerpc | powerpc64 | powerpc64le | powerpcle \ + | pyramid \ + | riscv32 | riscv64 \ + | rl78 | rx \ + | score \ + | sh | sh[1234] | sh[24]a | sh[24]aeb | sh[23]e | sh[234]eb | sheb | shbe | shle | sh[1234]le | sh3ele \ + | sh64 | sh64le \ + | sparc | sparc64 | sparc64b | sparc64v | sparc86x | sparclet | sparclite \ + | sparcv8 | sparcv9 | sparcv9b | sparcv9v \ + | spu \ + | tahoe | tic4x | tic54x | tic55x | tic6x | tic80 | tron \ + | ubicom32 \ + | v850 | v850e | v850e1 | v850e2 | v850es | v850e2v3 \ + | visium \ + | we32k \ + | x86 | xc16x | xstormy16 | xtensa \ + | z8k | z80) + basic_machine=$basic_machine-unknown + ;; + c54x) + basic_machine=tic54x-unknown + ;; + c55x) + basic_machine=tic55x-unknown + ;; + c6x) + basic_machine=tic6x-unknown + ;; + leon|leon[3-9]) + basic_machine=sparc-$basic_machine + ;; + m6811 | m68hc11 | m6812 | m68hc12 | m68hcs12x | nvptx | picochip) + basic_machine=$basic_machine-unknown + os=-none + ;; + m88110 | m680[12346]0 | m683?2 | m68360 | m5200 | v70 | w65 | z8k) + ;; + ms1) + basic_machine=mt-unknown + ;; + + strongarm | thumb | xscale) + basic_machine=arm-unknown + ;; + xgate) + basic_machine=$basic_machine-unknown + os=-none + ;; + xscaleeb) + basic_machine=armeb-unknown + ;; + + xscaleel) + basic_machine=armel-unknown + ;; + + # We use `pc' rather than `unknown' + # because (1) that's what they normally are, and + # (2) the word "unknown" tends to confuse beginning users. + i*86 | x86_64) + basic_machine=$basic_machine-pc + ;; + # Object if more than one company name word. + *-*-*) + echo Invalid configuration \`$1\': machine \`$basic_machine\' not recognized 1>&2 + exit 1 + ;; + # Recognize the basic CPU types with company name. 
+ 580-* \ + | a29k-* \ + | aarch64-* | aarch64_be-* \ + | alpha-* | alphaev[4-8]-* | alphaev56-* | alphaev6[78]-* \ + | alpha64-* | alpha64ev[4-8]-* | alpha64ev56-* | alpha64ev6[78]-* \ + | alphapca5[67]-* | alpha64pca5[67]-* | arc-* | arceb-* \ + | arm-* | armbe-* | armle-* | armeb-* | armv*-* \ + | avr-* | avr32-* \ + | ba-* \ + | be32-* | be64-* \ + | bfin-* | bs2000-* \ + | c[123]* | c30-* | [cjt]90-* | c4x-* \ + | c8051-* | clipper-* | craynv-* | cydra-* \ + | d10v-* | d30v-* | dlx-* \ + | e2k-* | elxsi-* \ + | f30[01]-* | f700-* | fido-* | fr30-* | frv-* | fx80-* \ + | h8300-* | h8500-* \ + | hppa-* | hppa1.[01]-* | hppa2.0-* | hppa2.0[nw]-* | hppa64-* \ + | hexagon-* \ + | i*86-* | i860-* | i960-* | ia64-* \ + | ip2k-* | iq2000-* \ + | k1om-* \ + | le32-* | le64-* \ + | lm32-* \ + | m32c-* | m32r-* | m32rle-* \ + | m68000-* | m680[012346]0-* | m68360-* | m683?2-* | m68k-* \ + | m88110-* | m88k-* | maxq-* | mcore-* | metag-* \ + | microblaze-* | microblazeel-* \ + | mips-* | mipsbe-* | mipseb-* | mipsel-* | mipsle-* \ + | mips16-* \ + | mips64-* | mips64el-* \ + | mips64octeon-* | mips64octeonel-* \ + | mips64orion-* | mips64orionel-* \ + | mips64r5900-* | mips64r5900el-* \ + | mips64vr-* | mips64vrel-* \ + | mips64vr4100-* | mips64vr4100el-* \ + | mips64vr4300-* | mips64vr4300el-* \ + | mips64vr5000-* | mips64vr5000el-* \ + | mips64vr5900-* | mips64vr5900el-* \ + | mipsisa32-* | mipsisa32el-* \ + | mipsisa32r2-* | mipsisa32r2el-* \ + | mipsisa32r6-* | mipsisa32r6el-* \ + | mipsisa64-* | mipsisa64el-* \ + | mipsisa64r2-* | mipsisa64r2el-* \ + | mipsisa64r6-* | mipsisa64r6el-* \ + | mipsisa64sb1-* | mipsisa64sb1el-* \ + | mipsisa64sr71k-* | mipsisa64sr71kel-* \ + | mipsr5900-* | mipsr5900el-* \ + | mipstx39-* | mipstx39el-* \ + | mmix-* \ + | mt-* \ + | msp430-* \ + | nds32-* | nds32le-* | nds32be-* \ + | nios-* | nios2-* | nios2eb-* | nios2el-* \ + | none-* | np1-* | ns16k-* | ns32k-* \ + | open8-* \ + | or1k*-* \ + | orion-* \ + | pdp10-* | pdp11-* | pj-* | pjl-* | pn-* | power-* \ + | powerpc-* | powerpc64-* | powerpc64le-* | powerpcle-* \ + | pyramid-* \ + | riscv32-* | riscv64-* \ + | rl78-* | romp-* | rs6000-* | rx-* \ + | sh-* | sh[1234]-* | sh[24]a-* | sh[24]aeb-* | sh[23]e-* | sh[34]eb-* | sheb-* | shbe-* \ + | shle-* | sh[1234]le-* | sh3ele-* | sh64-* | sh64le-* \ + | sparc-* | sparc64-* | sparc64b-* | sparc64v-* | sparc86x-* | sparclet-* \ + | sparclite-* \ + | sparcv8-* | sparcv9-* | sparcv9b-* | sparcv9v-* | sv1-* | sx*-* \ + | tahoe-* \ + | tic30-* | tic4x-* | tic54x-* | tic55x-* | tic6x-* | tic80-* \ + | tile*-* \ + | tron-* \ + | ubicom32-* \ + | v850-* | v850e-* | v850e1-* | v850es-* | v850e2-* | v850e2v3-* \ + | vax-* \ + | visium-* \ + | we32k-* \ + | x86-* | x86_64-* | xc16x-* | xps100-* \ + | xstormy16-* | xtensa*-* \ + | ymp-* \ + | z8k-* | z80-*) + ;; + # Recognize the basic CPU types without company name, with glob match. + xtensa*) + basic_machine=$basic_machine-unknown + ;; + # Recognize the various machine names and aliases which stand + # for a CPU type and a company and sometimes even an OS. 
+ 386bsd) + basic_machine=i386-unknown + os=-bsd + ;; + 3b1 | 7300 | 7300-att | att-7300 | pc7300 | safari | unixpc) + basic_machine=m68000-att + ;; + 3b*) + basic_machine=we32k-att + ;; + a29khif) + basic_machine=a29k-amd + os=-udi + ;; + abacus) + basic_machine=abacus-unknown + ;; + adobe68k) + basic_machine=m68010-adobe + os=-scout + ;; + alliant | fx80) + basic_machine=fx80-alliant + ;; + altos | altos3068) + basic_machine=m68k-altos + ;; + am29k) + basic_machine=a29k-none + os=-bsd + ;; + amd64) + basic_machine=x86_64-pc + ;; + amd64-*) + basic_machine=x86_64-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + amdahl) + basic_machine=580-amdahl + os=-sysv + ;; + amiga | amiga-*) + basic_machine=m68k-unknown + ;; + amigaos | amigados) + basic_machine=m68k-unknown + os=-amigaos + ;; + amigaunix | amix) + basic_machine=m68k-unknown + os=-sysv4 + ;; + apollo68) + basic_machine=m68k-apollo + os=-sysv + ;; + apollo68bsd) + basic_machine=m68k-apollo + os=-bsd + ;; + aros) + basic_machine=i386-pc + os=-aros + ;; + asmjs) + basic_machine=asmjs-unknown + ;; + aux) + basic_machine=m68k-apple + os=-aux + ;; + balance) + basic_machine=ns32k-sequent + os=-dynix + ;; + blackfin) + basic_machine=bfin-unknown + os=-linux + ;; + blackfin-*) + basic_machine=bfin-`echo $basic_machine | sed 's/^[^-]*-//'` + os=-linux + ;; + bluegene*) + basic_machine=powerpc-ibm + os=-cnk + ;; + c54x-*) + basic_machine=tic54x-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + c55x-*) + basic_machine=tic55x-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + c6x-*) + basic_machine=tic6x-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + c90) + basic_machine=c90-cray + os=-unicos + ;; + cegcc) + basic_machine=arm-unknown + os=-cegcc + ;; + convex-c1) + basic_machine=c1-convex + os=-bsd + ;; + convex-c2) + basic_machine=c2-convex + os=-bsd + ;; + convex-c32) + basic_machine=c32-convex + os=-bsd + ;; + convex-c34) + basic_machine=c34-convex + os=-bsd + ;; + convex-c38) + basic_machine=c38-convex + os=-bsd + ;; + cray | j90) + basic_machine=j90-cray + os=-unicos + ;; + craynv) + basic_machine=craynv-cray + os=-unicosmp + ;; + cr16 | cr16-*) + basic_machine=cr16-unknown + os=-elf + ;; + crds | unos) + basic_machine=m68k-crds + ;; + crisv32 | crisv32-* | etraxfs*) + basic_machine=crisv32-axis + ;; + cris | cris-* | etrax*) + basic_machine=cris-axis + ;; + crx) + basic_machine=crx-unknown + os=-elf + ;; + da30 | da30-*) + basic_machine=m68k-da30 + ;; + decstation | decstation-3100 | pmax | pmax-* | pmin | dec3100 | decstatn) + basic_machine=mips-dec + ;; + decsystem10* | dec10*) + basic_machine=pdp10-dec + os=-tops10 + ;; + decsystem20* | dec20*) + basic_machine=pdp10-dec + os=-tops20 + ;; + delta | 3300 | motorola-3300 | motorola-delta \ + | 3300-motorola | delta-motorola) + basic_machine=m68k-motorola + ;; + delta88) + basic_machine=m88k-motorola + os=-sysv3 + ;; + dicos) + basic_machine=i686-pc + os=-dicos + ;; + djgpp) + basic_machine=i586-pc + os=-msdosdjgpp + ;; + dpx20 | dpx20-*) + basic_machine=rs6000-bull + os=-bosx + ;; + dpx2* | dpx2*-bull) + basic_machine=m68k-bull + os=-sysv3 + ;; + ebmon29k) + basic_machine=a29k-amd + os=-ebmon + ;; + elxsi) + basic_machine=elxsi-elxsi + os=-bsd + ;; + encore | umax | mmax) + basic_machine=ns32k-encore + ;; + es1800 | OSE68k | ose68k | ose | OSE) + basic_machine=m68k-ericsson + os=-ose + ;; + fx2800) + basic_machine=i860-alliant + ;; + genix) + basic_machine=ns32k-ns + ;; + gmicro) + basic_machine=tron-gmicro + os=-sysv + ;; + go32) + basic_machine=i386-pc + os=-go32 + ;; + h3050r* | hiux*) + 
basic_machine=hppa1.1-hitachi + os=-hiuxwe2 + ;; + h8300hms) + basic_machine=h8300-hitachi + os=-hms + ;; + h8300xray) + basic_machine=h8300-hitachi + os=-xray + ;; + h8500hms) + basic_machine=h8500-hitachi + os=-hms + ;; + harris) + basic_machine=m88k-harris + os=-sysv3 + ;; + hp300-*) + basic_machine=m68k-hp + ;; + hp300bsd) + basic_machine=m68k-hp + os=-bsd + ;; + hp300hpux) + basic_machine=m68k-hp + os=-hpux + ;; + hp3k9[0-9][0-9] | hp9[0-9][0-9]) + basic_machine=hppa1.0-hp + ;; + hp9k2[0-9][0-9] | hp9k31[0-9]) + basic_machine=m68000-hp + ;; + hp9k3[2-9][0-9]) + basic_machine=m68k-hp + ;; + hp9k6[0-9][0-9] | hp6[0-9][0-9]) + basic_machine=hppa1.0-hp + ;; + hp9k7[0-79][0-9] | hp7[0-79][0-9]) + basic_machine=hppa1.1-hp + ;; + hp9k78[0-9] | hp78[0-9]) + # FIXME: really hppa2.0-hp + basic_machine=hppa1.1-hp + ;; + hp9k8[67]1 | hp8[67]1 | hp9k80[24] | hp80[24] | hp9k8[78]9 | hp8[78]9 | hp9k893 | hp893) + # FIXME: really hppa2.0-hp + basic_machine=hppa1.1-hp + ;; + hp9k8[0-9][13679] | hp8[0-9][13679]) + basic_machine=hppa1.1-hp + ;; + hp9k8[0-9][0-9] | hp8[0-9][0-9]) + basic_machine=hppa1.0-hp + ;; + hppa-next) + os=-nextstep3 + ;; + hppaosf) + basic_machine=hppa1.1-hp + os=-osf + ;; + hppro) + basic_machine=hppa1.1-hp + os=-proelf + ;; + i370-ibm* | ibm*) + basic_machine=i370-ibm + ;; + i*86v32) + basic_machine=`echo $1 | sed -e 's/86.*/86-pc/'` + os=-sysv32 + ;; + i*86v4*) + basic_machine=`echo $1 | sed -e 's/86.*/86-pc/'` + os=-sysv4 + ;; + i*86v) + basic_machine=`echo $1 | sed -e 's/86.*/86-pc/'` + os=-sysv + ;; + i*86sol2) + basic_machine=`echo $1 | sed -e 's/86.*/86-pc/'` + os=-solaris2 + ;; + i386mach) + basic_machine=i386-mach + os=-mach + ;; + i386-vsta | vsta) + basic_machine=i386-unknown + os=-vsta + ;; + iris | iris4d) + basic_machine=mips-sgi + case $os in + -irix*) + ;; + *) + os=-irix4 + ;; + esac + ;; + isi68 | isi) + basic_machine=m68k-isi + os=-sysv + ;; + leon-*|leon[3-9]-*) + basic_machine=sparc-`echo $basic_machine | sed 's/-.*//'` + ;; + m68knommu) + basic_machine=m68k-unknown + os=-linux + ;; + m68knommu-*) + basic_machine=m68k-`echo $basic_machine | sed 's/^[^-]*-//'` + os=-linux + ;; + m88k-omron*) + basic_machine=m88k-omron + ;; + magnum | m3230) + basic_machine=mips-mips + os=-sysv + ;; + merlin) + basic_machine=ns32k-utek + os=-sysv + ;; + microblaze*) + basic_machine=microblaze-xilinx + ;; + mingw64) + basic_machine=x86_64-pc + os=-mingw64 + ;; + mingw32) + basic_machine=i686-pc + os=-mingw32 + ;; + mingw32ce) + basic_machine=arm-unknown + os=-mingw32ce + ;; + miniframe) + basic_machine=m68000-convergent + ;; + *mint | -mint[0-9]* | *MiNT | *MiNT[0-9]*) + basic_machine=m68k-atari + os=-mint + ;; + mips3*-*) + basic_machine=`echo $basic_machine | sed -e 's/mips3/mips64/'` + ;; + mips3*) + basic_machine=`echo $basic_machine | sed -e 's/mips3/mips64/'`-unknown + ;; + monitor) + basic_machine=m68k-rom68k + os=-coff + ;; + morphos) + basic_machine=powerpc-unknown + os=-morphos + ;; + moxiebox) + basic_machine=moxie-unknown + os=-moxiebox + ;; + msdos) + basic_machine=i386-pc + os=-msdos + ;; + ms1-*) + basic_machine=`echo $basic_machine | sed -e 's/ms1-/mt-/'` + ;; + msys) + basic_machine=i686-pc + os=-msys + ;; + mvs) + basic_machine=i370-ibm + os=-mvs + ;; + nacl) + basic_machine=le32-unknown + os=-nacl + ;; + ncr3000) + basic_machine=i486-ncr + os=-sysv4 + ;; + netbsd386) + basic_machine=i386-unknown + os=-netbsd + ;; + netwinder) + basic_machine=armv4l-rebel + os=-linux + ;; + news | news700 | news800 | news900) + basic_machine=m68k-sony + os=-newsos + ;; + 
news1000) + basic_machine=m68030-sony + os=-newsos + ;; + news-3600 | risc-news) + basic_machine=mips-sony + os=-newsos + ;; + necv70) + basic_machine=v70-nec + os=-sysv + ;; + next | m*-next ) + basic_machine=m68k-next + case $os in + -nextstep* ) + ;; + -ns2*) + os=-nextstep2 + ;; + *) + os=-nextstep3 + ;; + esac + ;; + nh3000) + basic_machine=m68k-harris + os=-cxux + ;; + nh[45]000) + basic_machine=m88k-harris + os=-cxux + ;; + nindy960) + basic_machine=i960-intel + os=-nindy + ;; + mon960) + basic_machine=i960-intel + os=-mon960 + ;; + nonstopux) + basic_machine=mips-compaq + os=-nonstopux + ;; + np1) + basic_machine=np1-gould + ;; + neo-tandem) + basic_machine=neo-tandem + ;; + nse-tandem) + basic_machine=nse-tandem + ;; + nsr-tandem) + basic_machine=nsr-tandem + ;; + op50n-* | op60c-*) + basic_machine=hppa1.1-oki + os=-proelf + ;; + openrisc | openrisc-*) + basic_machine=or32-unknown + ;; + os400) + basic_machine=powerpc-ibm + os=-os400 + ;; + OSE68000 | ose68000) + basic_machine=m68000-ericsson + os=-ose + ;; + os68k) + basic_machine=m68k-none + os=-os68k + ;; + pa-hitachi) + basic_machine=hppa1.1-hitachi + os=-hiuxwe2 + ;; + paragon) + basic_machine=i860-intel + os=-osf + ;; + parisc) + basic_machine=hppa-unknown + os=-linux + ;; + parisc-*) + basic_machine=hppa-`echo $basic_machine | sed 's/^[^-]*-//'` + os=-linux + ;; + pbd) + basic_machine=sparc-tti + ;; + pbb) + basic_machine=m68k-tti + ;; + pc532 | pc532-*) + basic_machine=ns32k-pc532 + ;; + pc98) + basic_machine=i386-pc + ;; + pc98-*) + basic_machine=i386-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + pentium | p5 | k5 | k6 | nexgen | viac3) + basic_machine=i586-pc + ;; + pentiumpro | p6 | 6x86 | athlon | athlon_*) + basic_machine=i686-pc + ;; + pentiumii | pentium2 | pentiumiii | pentium3) + basic_machine=i686-pc + ;; + pentium4) + basic_machine=i786-pc + ;; + pentium-* | p5-* | k5-* | k6-* | nexgen-* | viac3-*) + basic_machine=i586-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + pentiumpro-* | p6-* | 6x86-* | athlon-*) + basic_machine=i686-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + pentiumii-* | pentium2-* | pentiumiii-* | pentium3-*) + basic_machine=i686-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + pentium4-*) + basic_machine=i786-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + pn) + basic_machine=pn-gould + ;; + power) basic_machine=power-ibm + ;; + ppc | ppcbe) basic_machine=powerpc-unknown + ;; + ppc-* | ppcbe-*) + basic_machine=powerpc-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + ppcle | powerpclittle | ppc-le | powerpc-little) + basic_machine=powerpcle-unknown + ;; + ppcle-* | powerpclittle-*) + basic_machine=powerpcle-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + ppc64) basic_machine=powerpc64-unknown + ;; + ppc64-*) basic_machine=powerpc64-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + ppc64le | powerpc64little | ppc64-le | powerpc64-little) + basic_machine=powerpc64le-unknown + ;; + ppc64le-* | powerpc64little-*) + basic_machine=powerpc64le-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + ps2) + basic_machine=i386-ibm + ;; + pw32) + basic_machine=i586-unknown + os=-pw32 + ;; + rdos | rdos64) + basic_machine=x86_64-pc + os=-rdos + ;; + rdos32) + basic_machine=i386-pc + os=-rdos + ;; + rom68k) + basic_machine=m68k-rom68k + os=-coff + ;; + rm[46]00) + basic_machine=mips-siemens + ;; + rtpc | rtpc-*) + basic_machine=romp-ibm + ;; + s390 | s390-*) + basic_machine=s390-ibm + ;; + s390x | s390x-*) + basic_machine=s390x-ibm + ;; + sa29200) + basic_machine=a29k-amd + os=-udi + ;; + sb1) + 
basic_machine=mipsisa64sb1-unknown + ;; + sb1el) + basic_machine=mipsisa64sb1el-unknown + ;; + sde) + basic_machine=mipsisa32-sde + os=-elf + ;; + sei) + basic_machine=mips-sei + os=-seiux + ;; + sequent) + basic_machine=i386-sequent + ;; + sh) + basic_machine=sh-hitachi + os=-hms + ;; + sh5el) + basic_machine=sh5le-unknown + ;; + sh64) + basic_machine=sh64-unknown + ;; + sparclite-wrs | simso-wrs) + basic_machine=sparclite-wrs + os=-vxworks + ;; + sps7) + basic_machine=m68k-bull + os=-sysv2 + ;; + spur) + basic_machine=spur-unknown + ;; + st2000) + basic_machine=m68k-tandem + ;; + stratus) + basic_machine=i860-stratus + os=-sysv4 + ;; + strongarm-* | thumb-*) + basic_machine=arm-`echo $basic_machine | sed 's/^[^-]*-//'` + ;; + sun2) + basic_machine=m68000-sun + ;; + sun2os3) + basic_machine=m68000-sun + os=-sunos3 + ;; + sun2os4) + basic_machine=m68000-sun + os=-sunos4 + ;; + sun3os3) + basic_machine=m68k-sun + os=-sunos3 + ;; + sun3os4) + basic_machine=m68k-sun + os=-sunos4 + ;; + sun4os3) + basic_machine=sparc-sun + os=-sunos3 + ;; + sun4os4) + basic_machine=sparc-sun + os=-sunos4 + ;; + sun4sol2) + basic_machine=sparc-sun + os=-solaris2 + ;; + sun3 | sun3-*) + basic_machine=m68k-sun + ;; + sun4) + basic_machine=sparc-sun + ;; + sun386 | sun386i | roadrunner) + basic_machine=i386-sun + ;; + sv1) + basic_machine=sv1-cray + os=-unicos + ;; + symmetry) + basic_machine=i386-sequent + os=-dynix + ;; + t3e) + basic_machine=alphaev5-cray + os=-unicos + ;; + t90) + basic_machine=t90-cray + os=-unicos + ;; + tile*) + basic_machine=$basic_machine-unknown + os=-linux-gnu + ;; + tx39) + basic_machine=mipstx39-unknown + ;; + tx39el) + basic_machine=mipstx39el-unknown + ;; + toad1) + basic_machine=pdp10-xkl + os=-tops20 + ;; + tower | tower-32) + basic_machine=m68k-ncr + ;; + tpf) + basic_machine=s390x-ibm + os=-tpf + ;; + udi29k) + basic_machine=a29k-amd + os=-udi + ;; + ultra3) + basic_machine=a29k-nyu + os=-sym1 + ;; + v810 | necv810) + basic_machine=v810-nec + os=-none + ;; + vaxv) + basic_machine=vax-dec + os=-sysv + ;; + vms) + basic_machine=vax-dec + os=-vms + ;; + vpp*|vx|vx-*) + basic_machine=f301-fujitsu + ;; + vxworks960) + basic_machine=i960-wrs + os=-vxworks + ;; + vxworks68) + basic_machine=m68k-wrs + os=-vxworks + ;; + vxworks29k) + basic_machine=a29k-wrs + os=-vxworks + ;; + w65*) + basic_machine=w65-wdc + os=-none + ;; + w89k-*) + basic_machine=hppa1.1-winbond + os=-proelf + ;; + xbox) + basic_machine=i686-pc + os=-mingw32 + ;; + xps | xps100) + basic_machine=xps100-honeywell + ;; + xscale-* | xscalee[bl]-*) + basic_machine=`echo $basic_machine | sed 's/^xscale/arm/'` + ;; + ymp) + basic_machine=ymp-cray + os=-unicos + ;; + z8k-*-coff) + basic_machine=z8k-unknown + os=-sim + ;; + z80-*-coff) + basic_machine=z80-unknown + os=-sim + ;; + none) + basic_machine=none-none + os=-none + ;; + +# Here we handle the default manufacturer of certain CPU types. It is in +# some cases the only manufacturer, in others, it is the most popular. 
+ w89k) + basic_machine=hppa1.1-winbond + ;; + op50n) + basic_machine=hppa1.1-oki + ;; + op60c) + basic_machine=hppa1.1-oki + ;; + romp) + basic_machine=romp-ibm + ;; + mmix) + basic_machine=mmix-knuth + ;; + rs6000) + basic_machine=rs6000-ibm + ;; + vax) + basic_machine=vax-dec + ;; + pdp10) + # there are many clones, so DEC is not a safe bet + basic_machine=pdp10-unknown + ;; + pdp11) + basic_machine=pdp11-dec + ;; + we32k) + basic_machine=we32k-att + ;; + sh[1234] | sh[24]a | sh[24]aeb | sh[34]eb | sh[1234]le | sh[23]ele) + basic_machine=sh-unknown + ;; + sparc | sparcv8 | sparcv9 | sparcv9b | sparcv9v) + basic_machine=sparc-sun + ;; + cydra) + basic_machine=cydra-cydrome + ;; + orion) + basic_machine=orion-highlevel + ;; + orion105) + basic_machine=clipper-highlevel + ;; + mac | mpw | mac-mpw) + basic_machine=m68k-apple + ;; + pmac | pmac-mpw) + basic_machine=powerpc-apple + ;; + *-unknown) + # Make sure to match an already-canonicalized machine name. + ;; + *) + echo Invalid configuration \`$1\': machine \`$basic_machine\' not recognized 1>&2 + exit 1 + ;; +esac + +# Here we canonicalize certain aliases for manufacturers. +case $basic_machine in + *-digital*) + basic_machine=`echo $basic_machine | sed 's/digital.*/dec/'` + ;; + *-commodore*) + basic_machine=`echo $basic_machine | sed 's/commodore.*/cbm/'` + ;; + *) + ;; +esac + +# Decode manufacturer-specific aliases for certain operating systems. + +if [ x"$os" != x"" ] +then +case $os in + # First match some system type aliases + # that might get confused with valid system types. + # -solaris* is a basic system type, with this one exception. + -auroraux) + os=-auroraux + ;; + -solaris1 | -solaris1.*) + os=`echo $os | sed -e 's|solaris1|sunos4|'` + ;; + -solaris) + os=-solaris2 + ;; + -svr4*) + os=-sysv4 + ;; + -unixware*) + os=-sysv4.2uw + ;; + -gnu/linux*) + os=`echo $os | sed -e 's|gnu/linux|linux-gnu|'` + ;; + # First accept the basic system types. + # The portable systems comes first. + # Each alternative MUST END IN A *, to match a version number. + # -sysv* is not here because it comes later, after sysvr4. 
+ -gnu* | -bsd* | -mach* | -minix* | -genix* | -ultrix* | -irix* \ + | -*vms* | -sco* | -esix* | -isc* | -aix* | -cnk* | -sunos | -sunos[34]*\ + | -hpux* | -unos* | -osf* | -luna* | -dgux* | -auroraux* | -solaris* \ + | -sym* | -kopensolaris* | -plan9* \ + | -amigaos* | -amigados* | -msdos* | -newsos* | -unicos* | -aof* \ + | -aos* | -aros* | -cloudabi* | -sortix* \ + | -nindy* | -vxsim* | -vxworks* | -ebmon* | -hms* | -mvs* \ + | -clix* | -riscos* | -uniplus* | -iris* | -rtu* | -xenix* \ + | -hiux* | -386bsd* | -knetbsd* | -mirbsd* | -netbsd* \ + | -bitrig* | -openbsd* | -solidbsd* \ + | -ekkobsd* | -kfreebsd* | -freebsd* | -riscix* | -lynxos* \ + | -bosx* | -nextstep* | -cxux* | -aout* | -elf* | -oabi* \ + | -ptx* | -coff* | -ecoff* | -winnt* | -domain* | -vsta* \ + | -udi* | -eabi* | -lites* | -ieee* | -go32* | -aux* \ + | -chorusos* | -chorusrdb* | -cegcc* \ + | -cygwin* | -msys* | -pe* | -psos* | -moss* | -proelf* | -rtems* \ + | -mingw32* | -mingw64* | -linux-gnu* | -linux-android* \ + | -linux-newlib* | -linux-musl* | -linux-uclibc* \ + | -uxpv* | -beos* | -mpeix* | -udk* | -moxiebox* \ + | -interix* | -uwin* | -mks* | -rhapsody* | -darwin* | -opened* \ + | -openstep* | -oskit* | -conix* | -pw32* | -nonstopux* \ + | -storm-chaos* | -tops10* | -tenex* | -tops20* | -its* \ + | -os2* | -vos* | -palmos* | -uclinux* | -nucleus* \ + | -morphos* | -superux* | -rtmk* | -rtmk-nova* | -windiss* \ + | -powermax* | -dnix* | -nx6 | -nx7 | -sei* | -dragonfly* \ + | -skyos* | -haiku* | -rdos* | -toppers* | -drops* | -es* | -tirtos*) + # Remember, each alternative MUST END IN *, to match a version number. + ;; + -qnx*) + case $basic_machine in + x86-* | i*86-*) + ;; + *) + os=-nto$os + ;; + esac + ;; + -nto-qnx*) + ;; + -nto*) + os=`echo $os | sed -e 's|nto|nto-qnx|'` + ;; + -sim | -es1800* | -hms* | -xray | -os68k* | -none* | -v88r* \ + | -windows* | -osx | -abug | -netware* | -os9* | -beos* | -haiku* \ + | -macos* | -mpw* | -magic* | -mmixware* | -mon960* | -lnews*) + ;; + -mac*) + os=`echo $os | sed -e 's|mac|macos|'` + ;; + -linux-dietlibc) + os=-linux-dietlibc + ;; + -linux*) + os=`echo $os | sed -e 's|linux|linux-gnu|'` + ;; + -sunos5*) + os=`echo $os | sed -e 's|sunos5|solaris2|'` + ;; + -sunos6*) + os=`echo $os | sed -e 's|sunos6|solaris3|'` + ;; + -opened*) + os=-openedition + ;; + -os400*) + os=-os400 + ;; + -wince*) + os=-wince + ;; + -osfrose*) + os=-osfrose + ;; + -osf*) + os=-osf + ;; + -utek*) + os=-bsd + ;; + -dynix*) + os=-bsd + ;; + -acis*) + os=-aos + ;; + -atheos*) + os=-atheos + ;; + -syllable*) + os=-syllable + ;; + -386bsd) + os=-bsd + ;; + -ctix* | -uts*) + os=-sysv + ;; + -nova*) + os=-rtmk-nova + ;; + -ns2 ) + os=-nextstep2 + ;; + -nsk*) + os=-nsk + ;; + # Preserve the version number of sinix5. + -sinix5.*) + os=`echo $os | sed -e 's|sinix|sysv|'` + ;; + -sinix*) + os=-sysv4 + ;; + -tpf*) + os=-tpf + ;; + -triton*) + os=-sysv3 + ;; + -oss*) + os=-sysv3 + ;; + -svr4) + os=-sysv4 + ;; + -svr3) + os=-sysv3 + ;; + -sysvr4) + os=-sysv4 + ;; + # This must come after -sysvr4. + -sysv*) + ;; + -ose*) + os=-ose + ;; + -es1800*) + os=-ose + ;; + -xenix) + os=-xenix + ;; + -*mint | -mint[0-9]* | -*MiNT | -MiNT[0-9]*) + os=-mint + ;; + -aros*) + os=-aros + ;; + -zvmoe) + os=-zvmoe + ;; + -dicos*) + os=-dicos + ;; + -nacl*) + ;; + -none) + ;; + *) + # Get rid of the `-' at the beginning of $os. 
+ os=`echo $os | sed 's/[^-]*-//'` + echo Invalid configuration \`$1\': system \`$os\' not recognized 1>&2 + exit 1 + ;; +esac +else + +# Here we handle the default operating systems that come with various machines. +# The value should be what the vendor currently ships out the door with their +# machine or put another way, the most popular os provided with the machine. + +# Note that if you're going to try to match "-MANUFACTURER" here (say, +# "-sun"), then you have to tell the case statement up towards the top +# that MANUFACTURER isn't an operating system. Otherwise, code above +# will signal an error saying that MANUFACTURER isn't an operating +# system, and we'll never get to this point. + +case $basic_machine in + score-*) + os=-elf + ;; + spu-*) + os=-elf + ;; + *-acorn) + os=-riscix1.2 + ;; + arm*-rebel) + os=-linux + ;; + arm*-semi) + os=-aout + ;; + c4x-* | tic4x-*) + os=-coff + ;; + c8051-*) + os=-elf + ;; + hexagon-*) + os=-elf + ;; + tic54x-*) + os=-coff + ;; + tic55x-*) + os=-coff + ;; + tic6x-*) + os=-coff + ;; + # This must come before the *-dec entry. + pdp10-*) + os=-tops20 + ;; + pdp11-*) + os=-none + ;; + *-dec | vax-*) + os=-ultrix4.2 + ;; + m68*-apollo) + os=-domain + ;; + i386-sun) + os=-sunos4.0.2 + ;; + m68000-sun) + os=-sunos3 + ;; + m68*-cisco) + os=-aout + ;; + mep-*) + os=-elf + ;; + mips*-cisco) + os=-elf + ;; + mips*-*) + os=-elf + ;; + or32-*) + os=-coff + ;; + *-tti) # must be before sparc entry or we get the wrong os. + os=-sysv3 + ;; + sparc-* | *-sun) + os=-sunos4.1.1 + ;; + *-be) + os=-beos + ;; + *-haiku) + os=-haiku + ;; + *-ibm) + os=-aix + ;; + *-knuth) + os=-mmixware + ;; + *-wec) + os=-proelf + ;; + *-winbond) + os=-proelf + ;; + *-oki) + os=-proelf + ;; + *-hp) + os=-hpux + ;; + *-hitachi) + os=-hiux + ;; + i860-* | *-att | *-ncr | *-altos | *-motorola | *-convergent) + os=-sysv + ;; + *-cbm) + os=-amigaos + ;; + *-dg) + os=-dgux + ;; + *-dolphin) + os=-sysv3 + ;; + m68k-ccur) + os=-rtu + ;; + m88k-omron*) + os=-luna + ;; + *-next ) + os=-nextstep + ;; + *-sequent) + os=-ptx + ;; + *-crds) + os=-unos + ;; + *-ns) + os=-genix + ;; + i370-*) + os=-mvs + ;; + *-next) + os=-nextstep3 + ;; + *-gould) + os=-sysv + ;; + *-highlevel) + os=-bsd + ;; + *-encore) + os=-bsd + ;; + *-sgi) + os=-irix + ;; + *-siemens) + os=-sysv4 + ;; + *-masscomp) + os=-rtu + ;; + f30[01]-fujitsu | f700-fujitsu) + os=-uxpv + ;; + *-rom68k) + os=-coff + ;; + *-*bug) + os=-coff + ;; + *-apple) + os=-macos + ;; + *-atari*) + os=-mint + ;; + *) + os=-none + ;; +esac +fi + +# Here we handle the case where we know the os, and the CPU type, but not the +# manufacturer. We pick the logical manufacturer. 
+vendor=unknown +case $basic_machine in + *-unknown) + case $os in + -riscix*) + vendor=acorn + ;; + -sunos*) + vendor=sun + ;; + -cnk*|-aix*) + vendor=ibm + ;; + -beos*) + vendor=be + ;; + -hpux*) + vendor=hp + ;; + -mpeix*) + vendor=hp + ;; + -hiux*) + vendor=hitachi + ;; + -unos*) + vendor=crds + ;; + -dgux*) + vendor=dg + ;; + -luna*) + vendor=omron + ;; + -genix*) + vendor=ns + ;; + -mvs* | -opened*) + vendor=ibm + ;; + -os400*) + vendor=ibm + ;; + -ptx*) + vendor=sequent + ;; + -tpf*) + vendor=ibm + ;; + -vxsim* | -vxworks* | -windiss*) + vendor=wrs + ;; + -aux*) + vendor=apple + ;; + -hms*) + vendor=hitachi + ;; + -mpw* | -macos*) + vendor=apple + ;; + -*mint | -mint[0-9]* | -*MiNT | -MiNT[0-9]*) + vendor=atari + ;; + -vos*) + vendor=stratus + ;; + esac + basic_machine=`echo $basic_machine | sed "s/unknown/$vendor/"` + ;; +esac + +echo $basic_machine$os +exit + +# Local variables: +# eval: (add-hook 'write-file-hooks 'time-stamp) +# time-stamp-start: "timestamp='" +# time-stamp-format: "%:y-%02m-%02d" +# time-stamp-end: "'" +# End: diff --git a/configure b/configure new file mode 100755 index 0000000..9e24c4b --- /dev/null +++ b/configure @@ -0,0 +1,17495 @@ +#! /bin/sh +# Guess values for system-dependent variables and create Makefiles. +# Generated by GNU Autoconf 2.69 for liblognorm 2.0.5. +# +# Report bugs to . +# +# +# Copyright (C) 1992-1996, 1998-2012 Free Software Foundation, Inc. +# +# +# This configure script is free software; the Free Software Foundation +# gives unlimited permission to copy, distribute and modify it. +## -------------------- ## +## M4sh Initialization. ## +## -------------------- ## + +# Be more Bourne compatible +DUALCASE=1; export DUALCASE # for MKS sh +if test -n "${ZSH_VERSION+set}" && (emulate sh) >/dev/null 2>&1; then : + emulate sh + NULLCMD=: + # Pre-4.2 versions of Zsh do word splitting on ${1+"$@"}, which + # is contrary to our usage. Disable this feature. + alias -g '${1+"$@"}'='"$@"' + setopt NO_GLOB_SUBST +else + case `(set -o) 2>/dev/null` in #( + *posix*) : + set -o posix ;; #( + *) : + ;; +esac +fi + + +as_nl=' +' +export as_nl +# Printing a long string crashes Solaris 7 /usr/bin/printf. +as_echo='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\' +as_echo=$as_echo$as_echo$as_echo$as_echo$as_echo +as_echo=$as_echo$as_echo$as_echo$as_echo$as_echo$as_echo +# Prefer a ksh shell builtin over an external printf program on Solaris, +# but without wasting forks for bash or zsh. +if test -z "$BASH_VERSION$ZSH_VERSION" \ + && (test "X`print -r -- $as_echo`" = "X$as_echo") 2>/dev/null; then + as_echo='print -r --' + as_echo_n='print -rn --' +elif (test "X`printf %s $as_echo`" = "X$as_echo") 2>/dev/null; then + as_echo='printf %s\n' + as_echo_n='printf %s' +else + if test "X`(/usr/ucb/echo -n -n $as_echo) 2>/dev/null`" = "X-n $as_echo"; then + as_echo_body='eval /usr/ucb/echo -n "$1$as_nl"' + as_echo_n='/usr/ucb/echo -n' + else + as_echo_body='eval expr "X$1" : "X\\(.*\\)"' + as_echo_n_body='eval + arg=$1; + case $arg in #( + *"$as_nl"*) + expr "X$arg" : "X\\(.*\\)$as_nl"; + arg=`expr "X$arg" : ".*$as_nl\\(.*\\)"`;; + esac; + expr "X$arg" : "X\\(.*\\)" | tr -d "$as_nl" + ' + export as_echo_n_body + as_echo_n='sh -c $as_echo_n_body as_echo' + fi + export as_echo_body + as_echo='sh -c $as_echo_body as_echo' +fi + +# The user is always right. 
+if test "${PATH_SEPARATOR+set}" != set; then + PATH_SEPARATOR=: + (PATH='/bin;/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 && { + (PATH='/bin:/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 || + PATH_SEPARATOR=';' + } +fi + + +# IFS +# We need space, tab and new line, in precisely that order. Quoting is +# there to prevent editors from complaining about space-tab. +# (If _AS_PATH_WALK were called with IFS unset, it would disable word +# splitting by setting IFS to empty value.) +IFS=" "" $as_nl" + +# Find who we are. Look in the path if we contain no directory separator. +as_myself= +case $0 in #(( + *[\\/]* ) as_myself=$0 ;; + *) as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + test -r "$as_dir/$0" && as_myself=$as_dir/$0 && break + done +IFS=$as_save_IFS + + ;; +esac +# We did not find ourselves, most probably we were run as `sh COMMAND' +# in which case we are not to be found in the path. +if test "x$as_myself" = x; then + as_myself=$0 +fi +if test ! -f "$as_myself"; then + $as_echo "$as_myself: error: cannot find myself; rerun with an absolute file name" >&2 + exit 1 +fi + +# Unset variables that we do not need and which cause bugs (e.g. in +# pre-3.0 UWIN ksh). But do not cause bugs in bash 2.01; the "|| exit 1" +# suppresses any "Segmentation fault" message there. '((' could +# trigger a bug in pdksh 5.2.14. +for as_var in BASH_ENV ENV MAIL MAILPATH +do eval test x\${$as_var+set} = xset \ + && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || : +done +PS1='$ ' +PS2='> ' +PS4='+ ' + +# NLS nuisances. +LC_ALL=C +export LC_ALL +LANGUAGE=C +export LANGUAGE + +# CDPATH. +(unset CDPATH) >/dev/null 2>&1 && unset CDPATH + +# Use a proper internal environment variable to ensure we don't fall + # into an infinite loop, continuously re-executing ourselves. + if test x"${_as_can_reexec}" != xno && test "x$CONFIG_SHELL" != x; then + _as_can_reexec=no; export _as_can_reexec; + # We cannot yet assume a decent shell, so we have to provide a +# neutralization value for shells without unset; and this also +# works around shells that cannot unset nonexistent variables. +# Preserve -v and -x to the replacement shell. +BASH_ENV=/dev/null +ENV=/dev/null +(unset BASH_ENV) >/dev/null 2>&1 && unset BASH_ENV ENV +case $- in # (((( + *v*x* | *x*v* ) as_opts=-vx ;; + *v* ) as_opts=-v ;; + *x* ) as_opts=-x ;; + * ) as_opts= ;; +esac +exec $CONFIG_SHELL $as_opts "$as_myself" ${1+"$@"} +# Admittedly, this is quite paranoid, since all the known shells bail +# out after a failed `exec'. +$as_echo "$0: could not re-execute with $CONFIG_SHELL" >&2 +as_fn_exit 255 + fi + # We don't want this to propagate to other subprocesses. + { _as_can_reexec=; unset _as_can_reexec;} +if test "x$CONFIG_SHELL" = x; then + as_bourne_compatible="if test -n \"\${ZSH_VERSION+set}\" && (emulate sh) >/dev/null 2>&1; then : + emulate sh + NULLCMD=: + # Pre-4.2 versions of Zsh do word splitting on \${1+\"\$@\"}, which + # is contrary to our usage. Disable this feature. 
+ alias -g '\${1+\"\$@\"}'='\"\$@\"' + setopt NO_GLOB_SUBST +else + case \`(set -o) 2>/dev/null\` in #( + *posix*) : + set -o posix ;; #( + *) : + ;; +esac +fi +" + as_required="as_fn_return () { (exit \$1); } +as_fn_success () { as_fn_return 0; } +as_fn_failure () { as_fn_return 1; } +as_fn_ret_success () { return 0; } +as_fn_ret_failure () { return 1; } + +exitcode=0 +as_fn_success || { exitcode=1; echo as_fn_success failed.; } +as_fn_failure && { exitcode=1; echo as_fn_failure succeeded.; } +as_fn_ret_success || { exitcode=1; echo as_fn_ret_success failed.; } +as_fn_ret_failure && { exitcode=1; echo as_fn_ret_failure succeeded.; } +if ( set x; as_fn_ret_success y && test x = \"\$1\" ); then : + +else + exitcode=1; echo positional parameters were not saved. +fi +test x\$exitcode = x0 || exit 1 +test -x / || exit 1" + as_suggested=" as_lineno_1=";as_suggested=$as_suggested$LINENO;as_suggested=$as_suggested" as_lineno_1a=\$LINENO + as_lineno_2=";as_suggested=$as_suggested$LINENO;as_suggested=$as_suggested" as_lineno_2a=\$LINENO + eval 'test \"x\$as_lineno_1'\$as_run'\" != \"x\$as_lineno_2'\$as_run'\" && + test \"x\`expr \$as_lineno_1'\$as_run' + 1\`\" = \"x\$as_lineno_2'\$as_run'\"' || exit 1 +test \$(( 1 + 1 )) = 2 || exit 1 + + test -n \"\${ZSH_VERSION+set}\${BASH_VERSION+set}\" || ( + ECHO='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\' + ECHO=\$ECHO\$ECHO\$ECHO\$ECHO\$ECHO + ECHO=\$ECHO\$ECHO\$ECHO\$ECHO\$ECHO\$ECHO + PATH=/empty FPATH=/empty; export PATH FPATH + test \"X\`printf %s \$ECHO\`\" = \"X\$ECHO\" \\ + || test \"X\`print -r -- \$ECHO\`\" = \"X\$ECHO\" ) || exit 1" + if (eval "$as_required") 2>/dev/null; then : + as_have_required=yes +else + as_have_required=no +fi + if test x$as_have_required = xyes && (eval "$as_suggested") 2>/dev/null; then : + +else + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +as_found=false +for as_dir in /bin$PATH_SEPARATOR/usr/bin$PATH_SEPARATOR$PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + as_found=: + case $as_dir in #( + /*) + for as_base in sh bash ksh sh5; do + # Try only shells that exist, to save several forks. + as_shell=$as_dir/$as_base + if { test -f "$as_shell" || test -f "$as_shell.exe"; } && + { $as_echo "$as_bourne_compatible""$as_required" | as_run=a "$as_shell"; } 2>/dev/null; then : + CONFIG_SHELL=$as_shell as_have_required=yes + if { $as_echo "$as_bourne_compatible""$as_suggested" | as_run=a "$as_shell"; } 2>/dev/null; then : + break 2 +fi +fi + done;; + esac + as_found=false +done +$as_found || { if { test -f "$SHELL" || test -f "$SHELL.exe"; } && + { $as_echo "$as_bourne_compatible""$as_required" | as_run=a "$SHELL"; } 2>/dev/null; then : + CONFIG_SHELL=$SHELL as_have_required=yes +fi; } +IFS=$as_save_IFS + + + if test "x$CONFIG_SHELL" != x; then : + export CONFIG_SHELL + # We cannot yet assume a decent shell, so we have to provide a +# neutralization value for shells without unset; and this also +# works around shells that cannot unset nonexistent variables. +# Preserve -v and -x to the replacement shell. +BASH_ENV=/dev/null +ENV=/dev/null +(unset BASH_ENV) >/dev/null 2>&1 && unset BASH_ENV ENV +case $- in # (((( + *v*x* | *x*v* ) as_opts=-vx ;; + *v* ) as_opts=-v ;; + *x* ) as_opts=-x ;; + * ) as_opts= ;; +esac +exec $CONFIG_SHELL $as_opts "$as_myself" ${1+"$@"} +# Admittedly, this is quite paranoid, since all the known shells bail +# out after a failed `exec'. 
+$as_echo "$0: could not re-execute with $CONFIG_SHELL" >&2 +exit 255 +fi + + if test x$as_have_required = xno; then : + $as_echo "$0: This script requires a shell more modern than all" + $as_echo "$0: the shells that I found on your system." + if test x${ZSH_VERSION+set} = xset ; then + $as_echo "$0: In particular, zsh $ZSH_VERSION has bugs and should" + $as_echo "$0: be upgraded to zsh 4.3.4 or later." + else + $as_echo "$0: Please tell bug-autoconf@gnu.org and +$0: rgerhards@adiscon.com about your system, including any +$0: error possibly output before this message. Then install +$0: a modern shell, or manually run the script under such a +$0: shell if you do have one." + fi + exit 1 +fi +fi +fi +SHELL=${CONFIG_SHELL-/bin/sh} +export SHELL +# Unset more variables known to interfere with behavior of common tools. +CLICOLOR_FORCE= GREP_OPTIONS= +unset CLICOLOR_FORCE GREP_OPTIONS + +## --------------------- ## +## M4sh Shell Functions. ## +## --------------------- ## +# as_fn_unset VAR +# --------------- +# Portably unset VAR. +as_fn_unset () +{ + { eval $1=; unset $1;} +} +as_unset=as_fn_unset + +# as_fn_set_status STATUS +# ----------------------- +# Set $? to STATUS, without forking. +as_fn_set_status () +{ + return $1 +} # as_fn_set_status + +# as_fn_exit STATUS +# ----------------- +# Exit the shell with STATUS, even in a "trap 0" or "set -e" context. +as_fn_exit () +{ + set +e + as_fn_set_status $1 + exit $1 +} # as_fn_exit + +# as_fn_mkdir_p +# ------------- +# Create "$as_dir" as a directory, including parents if necessary. +as_fn_mkdir_p () +{ + + case $as_dir in #( + -*) as_dir=./$as_dir;; + esac + test -d "$as_dir" || eval $as_mkdir_p || { + as_dirs= + while :; do + case $as_dir in #( + *\'*) as_qdir=`$as_echo "$as_dir" | sed "s/'/'\\\\\\\\''/g"`;; #'( + *) as_qdir=$as_dir;; + esac + as_dirs="'$as_qdir' $as_dirs" + as_dir=`$as_dirname -- "$as_dir" || +$as_expr X"$as_dir" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \ + X"$as_dir" : 'X\(//\)[^/]' \| \ + X"$as_dir" : 'X\(//\)$' \| \ + X"$as_dir" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X"$as_dir" | + sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{ + s//\1/ + q + } + /^X\(\/\/\)[^/].*/{ + s//\1/ + q + } + /^X\(\/\/\)$/{ + s//\1/ + q + } + /^X\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + test -d "$as_dir" && break + done + test -z "$as_dirs" || eval "mkdir $as_dirs" + } || test -d "$as_dir" || as_fn_error $? "cannot create directory $as_dir" + + +} # as_fn_mkdir_p + +# as_fn_executable_p FILE +# ----------------------- +# Test if FILE is an executable regular file. +as_fn_executable_p () +{ + test -f "$1" && test -x "$1" +} # as_fn_executable_p +# as_fn_append VAR VALUE +# ---------------------- +# Append the text in VALUE to the end of the definition contained in VAR. Take +# advantage of any shell optimizations that allow amortized linear growth over +# repeated appends, instead of the typical quadratic growth present in naive +# implementations. +if (eval "as_var=1; as_var+=2; test x\$as_var = x12") 2>/dev/null; then : + eval 'as_fn_append () + { + eval $1+=\$2 + }' +else + as_fn_append () + { + eval $1=\$$1\$2 + } +fi # as_fn_append + +# as_fn_arith ARG... +# ------------------ +# Perform arithmetic evaluation on the ARGs, and store the result in the +# global $as_val. Take advantage of shells that can avoid forks. The arguments +# must be portable across $(()) and expr. +if (eval "test \$(( 1 + 1 )) = 2") 2>/dev/null; then : + eval 'as_fn_arith () + { + as_val=$(( $* )) + }' +else + as_fn_arith () + { + as_val=`expr "$@" || test $? 
-eq 1` + } +fi # as_fn_arith + + +# as_fn_error STATUS ERROR [LINENO LOG_FD] +# ---------------------------------------- +# Output "`basename $0`: error: ERROR" to stderr. If LINENO and LOG_FD are +# provided, also output the error to LOG_FD, referencing LINENO. Then exit the +# script with STATUS, using 1 if that was 0. +as_fn_error () +{ + as_status=$1; test $as_status -eq 0 && as_status=1 + if test "$4"; then + as_lineno=${as_lineno-"$3"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + $as_echo "$as_me:${as_lineno-$LINENO}: error: $2" >&$4 + fi + $as_echo "$as_me: error: $2" >&2 + as_fn_exit $as_status +} # as_fn_error + +if expr a : '\(a\)' >/dev/null 2>&1 && + test "X`expr 00001 : '.*\(...\)'`" = X001; then + as_expr=expr +else + as_expr=false +fi + +if (basename -- /) >/dev/null 2>&1 && test "X`basename -- / 2>&1`" = "X/"; then + as_basename=basename +else + as_basename=false +fi + +if (as_dir=`dirname -- /` && test "X$as_dir" = X/) >/dev/null 2>&1; then + as_dirname=dirname +else + as_dirname=false +fi + +as_me=`$as_basename -- "$0" || +$as_expr X/"$0" : '.*/\([^/][^/]*\)/*$' \| \ + X"$0" : 'X\(//\)$' \| \ + X"$0" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X/"$0" | + sed '/^.*\/\([^/][^/]*\)\/*$/{ + s//\1/ + q + } + /^X\/\(\/\/\)$/{ + s//\1/ + q + } + /^X\/\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + +# Avoid depending upon Character Ranges. +as_cr_letters='abcdefghijklmnopqrstuvwxyz' +as_cr_LETTERS='ABCDEFGHIJKLMNOPQRSTUVWXYZ' +as_cr_Letters=$as_cr_letters$as_cr_LETTERS +as_cr_digits='0123456789' +as_cr_alnum=$as_cr_Letters$as_cr_digits + + + as_lineno_1=$LINENO as_lineno_1a=$LINENO + as_lineno_2=$LINENO as_lineno_2a=$LINENO + eval 'test "x$as_lineno_1'$as_run'" != "x$as_lineno_2'$as_run'" && + test "x`expr $as_lineno_1'$as_run' + 1`" = "x$as_lineno_2'$as_run'"' || { + # Blame Lee E. McMahon (1931-1989) for sed's syntax. :-) + sed -n ' + p + /[$]LINENO/= + ' <$as_myself | + sed ' + s/[$]LINENO.*/&-/ + t lineno + b + :lineno + N + :loop + s/[$]LINENO\([^'$as_cr_alnum'_].*\n\)\(.*\)/\2\1\2/ + t loop + s/-\n.*// + ' >$as_me.lineno && + chmod +x "$as_me.lineno" || + { $as_echo "$as_me: error: cannot create $as_me.lineno; rerun with a POSIX shell" >&2; as_fn_exit 1; } + + # If we had to re-execute with $CONFIG_SHELL, we're ensured to have + # already done that, so ensure we don't try to do so again and fall + # in an infinite loop. This has already happened in practice. + _as_can_reexec=no; export _as_can_reexec + # Don't try to exec as it changes $[0], causing all sort of problems + # (the dirname of $[0] is not the place where we might find the + # original and so on. Autoconf is especially sensitive to this). + . "./$as_me.lineno" + # Exit status is that of the last command. + exit +} + +ECHO_C= ECHO_N= ECHO_T= +case `echo -n x` in #((((( +-n*) + case `echo 'xy\c'` in + *c*) ECHO_T=' ';; # ECHO_T is single tab character. + xy) ECHO_C='\c';; + *) echo `echo ksh88 bug on AIX 6.1` > /dev/null + ECHO_T=' ';; + esac;; +*) + ECHO_N='-n';; +esac + +rm -f conf$$ conf$$.exe conf$$.file +if test -d conf$$.dir; then + rm -f conf$$.dir/conf$$.file +else + rm -f conf$$.dir + mkdir conf$$.dir 2>/dev/null +fi +if (echo >conf$$.file) 2>/dev/null; then + if ln -s conf$$.file conf$$ 2>/dev/null; then + as_ln_s='ln -s' + # ... but there are two gotchas: + # 1) On MSYS, both `ln -s file dir' and `ln file dir' fail. + # 2) DJGPP < 2.04 has no symlinks; `ln -s' creates a wrapper executable. + # In both cases, we have to default to `cp -pR'. + ln -s conf$$.file conf$$.dir 2>/dev/null && test ! 
-f conf$$.exe || + as_ln_s='cp -pR' + elif ln conf$$.file conf$$ 2>/dev/null; then + as_ln_s=ln + else + as_ln_s='cp -pR' + fi +else + as_ln_s='cp -pR' +fi +rm -f conf$$ conf$$.exe conf$$.dir/conf$$.file conf$$.file +rmdir conf$$.dir 2>/dev/null + +if mkdir -p . 2>/dev/null; then + as_mkdir_p='mkdir -p "$as_dir"' +else + test -d ./-p && rmdir ./-p + as_mkdir_p=false +fi + +as_test_x='test -x' +as_executable_p=as_fn_executable_p + +# Sed expression to map a string onto a valid CPP name. +as_tr_cpp="eval sed 'y%*$as_cr_letters%P$as_cr_LETTERS%;s%[^_$as_cr_alnum]%_%g'" + +# Sed expression to map a string onto a valid variable name. +as_tr_sh="eval sed 'y%*+%pp%;s%[^_$as_cr_alnum]%_%g'" + +SHELL=${CONFIG_SHELL-/bin/sh} + + +test -n "$DJDIR" || exec 7<&0 &1 + +# Name of the host. +# hostname on some systems (SVR3.2, old GNU/Linux) returns a bogus exit status, +# so uname gets run too. +ac_hostname=`(hostname || uname -n) 2>/dev/null | sed 1q` + +# +# Initializations. +# +ac_default_prefix=/usr/local +ac_clean_files= +ac_config_libobj_dir=. +LIBOBJS= +cross_compiling=no +subdirs= +MFLAGS= +MAKEFLAGS= + +# Identity of this package. +PACKAGE_NAME='liblognorm' +PACKAGE_TARNAME='liblognorm' +PACKAGE_VERSION='2.0.5' +PACKAGE_STRING='liblognorm 2.0.5' +PACKAGE_BUGREPORT='rgerhards@adiscon.com' +PACKAGE_URL='' + +ac_unique_file="src/lognorm.c" +# Factoring default headers for most tests. +ac_includes_default="\ +#include +#ifdef HAVE_SYS_TYPES_H +# include +#endif +#ifdef HAVE_SYS_STAT_H +# include +#endif +#ifdef STDC_HEADERS +# include +# include +#else +# ifdef HAVE_STDLIB_H +# include +# endif +#endif +#ifdef HAVE_STRING_H +# if !defined STDC_HEADERS && defined HAVE_MEMORY_H +# include +# endif +# include +#endif +#ifdef HAVE_STRINGS_H +# include +#endif +#ifdef HAVE_INTTYPES_H +# include +#endif +#ifdef HAVE_STDINT_H +# include +#endif +#ifdef HAVE_UNISTD_H +# include +#endif" + +ac_subst_vars='am__EXEEXT_FALSE +am__EXEEXT_TRUE +LTLIBOBJS +LIBOBJS +ENABLE_TOOLS_FALSE +ENABLE_TOOLS_TRUE +VALGRIND +ENABLE_VALGRIND_FALSE +ENABLE_VALGRIND_TRUE +ENABLE_TESTBENCH_FALSE +ENABLE_TESTBENCH_TRUE +ENABLE_DOCS_FALSE +ENABLE_DOCS_TRUE +SPHINXBUILD +FEATURE_REGEXP +PCRE_LIBS +PCRE_CFLAGS +ENABLE_REGEXP_FALSE +ENABLE_REGEXP_TRUE +pkg_config_libs_private +JSON_C_LIBS +JSON_C_CFLAGS +LIBESTR_LIBS +LIBESTR_CFLAGS +PKG_CONFIG_LIBDIR +PKG_CONFIG_PATH +PKG_CONFIG +LIBLOGNORM_LIBS +LIBLOGNORM_CFLAGS +WARN_SCANNERFLAGS +WARN_LDFLAGS +WARN_CFLAGS +LT_SYS_LIBRARY_PATH +OTOOL64 +OTOOL +LIPO +NMEDIT +DSYMUTIL +MANIFEST_TOOL +RANLIB +ac_ct_AR +AR +DLLTOOL +OBJDUMP +LN_S +NM +ac_ct_DUMPBIN +DUMPBIN +LD +FGREP +SED +host_os +host_vendor +host_cpu +host +build_os +build_vendor +build_cpu +build +LIBTOOL +EGREP +GREP +CPP +am__fastdepCC_FALSE +am__fastdepCC_TRUE +CCDEPMODE +am__nodep +AMDEPBACKSLASH +AMDEP_FALSE +AMDEP_TRUE +am__quote +am__include +DEPDIR +OBJEXT +EXEEXT +ac_ct_CC +CPPFLAGS +LDFLAGS +CFLAGS +CC +AM_BACKSLASH +AM_DEFAULT_VERBOSITY +AM_DEFAULT_V +AM_V +am__untar +am__tar +AMTAR +am__leading_dot +SET_MAKE +AWK +mkdir_p +MKDIR_P +INSTALL_STRIP_PROGRAM +STRIP +install_sh +MAKEINFO +AUTOHEADER +AUTOMAKE +AUTOCONF +ACLOCAL +VERSION +PACKAGE +CYGPATH_W +am__isrc +INSTALL_DATA +INSTALL_SCRIPT +INSTALL_PROGRAM +target_alias +host_alias +build_alias +LIBS +ECHO_T +ECHO_N +ECHO_C +DEFS +mandir +localedir +libdir +psdir +pdfdir +dvidir +htmldir +infodir +docdir +oldincludedir +includedir +runstatedir +localstatedir +sharedstatedir +sysconfdir +datadir +datarootdir +libexecdir +sbindir +bindir +program_transform_name +prefix 
+exec_prefix +PACKAGE_URL +PACKAGE_BUGREPORT +PACKAGE_STRING +PACKAGE_VERSION +PACKAGE_TARNAME +PACKAGE_NAME +PATH_SEPARATOR +SHELL' +ac_subst_files='' +ac_user_opts=' +enable_option_checking +enable_silent_rules +enable_dependency_tracking +enable_shared +enable_static +with_pic +enable_fast_install +with_aix_soname +with_gnu_ld +with_sysroot +enable_libtool_lock +enable_compile_warnings +enable_Werror +enable_regexp +enable_debug +enable_advanced_stats +enable_docs +enable_testbench +enable_valgrind +enable_tools +' + ac_precious_vars='build_alias +host_alias +target_alias +CC +CFLAGS +LDFLAGS +LIBS +CPPFLAGS +CPP +LT_SYS_LIBRARY_PATH +PKG_CONFIG +PKG_CONFIG_PATH +PKG_CONFIG_LIBDIR +LIBESTR_CFLAGS +LIBESTR_LIBS +JSON_C_CFLAGS +JSON_C_LIBS +PCRE_CFLAGS +PCRE_LIBS' + + +# Initialize some variables set by options. +ac_init_help= +ac_init_version=false +ac_unrecognized_opts= +ac_unrecognized_sep= +# The variables have the same names as the options, with +# dashes changed to underlines. +cache_file=/dev/null +exec_prefix=NONE +no_create= +no_recursion= +prefix=NONE +program_prefix=NONE +program_suffix=NONE +program_transform_name=s,x,x, +silent= +site= +srcdir= +verbose= +x_includes=NONE +x_libraries=NONE + +# Installation directory options. +# These are left unexpanded so users can "make install exec_prefix=/foo" +# and all the variables that are supposed to be based on exec_prefix +# by default will actually change. +# Use braces instead of parens because sh, perl, etc. also accept them. +# (The list follows the same order as the GNU Coding Standards.) +bindir='${exec_prefix}/bin' +sbindir='${exec_prefix}/sbin' +libexecdir='${exec_prefix}/libexec' +datarootdir='${prefix}/share' +datadir='${datarootdir}' +sysconfdir='${prefix}/etc' +sharedstatedir='${prefix}/com' +localstatedir='${prefix}/var' +runstatedir='${localstatedir}/run' +includedir='${prefix}/include' +oldincludedir='/usr/include' +docdir='${datarootdir}/doc/${PACKAGE_TARNAME}' +infodir='${datarootdir}/info' +htmldir='${docdir}' +dvidir='${docdir}' +pdfdir='${docdir}' +psdir='${docdir}' +libdir='${exec_prefix}/lib' +localedir='${datarootdir}/locale' +mandir='${datarootdir}/man' + +ac_prev= +ac_dashdash= +for ac_option +do + # If the previous option needs an argument, assign it. + if test -n "$ac_prev"; then + eval $ac_prev=\$ac_option + ac_prev= + continue + fi + + case $ac_option in + *=?*) ac_optarg=`expr "X$ac_option" : '[^=]*=\(.*\)'` ;; + *=) ac_optarg= ;; + *) ac_optarg=yes ;; + esac + + # Accept the important Cygnus configure options, so we can diagnose typos. 
+ + case $ac_dashdash$ac_option in + --) + ac_dashdash=yes ;; + + -bindir | --bindir | --bindi | --bind | --bin | --bi) + ac_prev=bindir ;; + -bindir=* | --bindir=* | --bindi=* | --bind=* | --bin=* | --bi=*) + bindir=$ac_optarg ;; + + -build | --build | --buil | --bui | --bu) + ac_prev=build_alias ;; + -build=* | --build=* | --buil=* | --bui=* | --bu=*) + build_alias=$ac_optarg ;; + + -cache-file | --cache-file | --cache-fil | --cache-fi \ + | --cache-f | --cache- | --cache | --cach | --cac | --ca | --c) + ac_prev=cache_file ;; + -cache-file=* | --cache-file=* | --cache-fil=* | --cache-fi=* \ + | --cache-f=* | --cache-=* | --cache=* | --cach=* | --cac=* | --ca=* | --c=*) + cache_file=$ac_optarg ;; + + --config-cache | -C) + cache_file=config.cache ;; + + -datadir | --datadir | --datadi | --datad) + ac_prev=datadir ;; + -datadir=* | --datadir=* | --datadi=* | --datad=*) + datadir=$ac_optarg ;; + + -datarootdir | --datarootdir | --datarootdi | --datarootd | --dataroot \ + | --dataroo | --dataro | --datar) + ac_prev=datarootdir ;; + -datarootdir=* | --datarootdir=* | --datarootdi=* | --datarootd=* \ + | --dataroot=* | --dataroo=* | --dataro=* | --datar=*) + datarootdir=$ac_optarg ;; + + -disable-* | --disable-*) + ac_useropt=`expr "x$ac_option" : 'x-*disable-\(.*\)'` + # Reject names that are not valid shell variable names. + expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && + as_fn_error $? "invalid feature name: $ac_useropt" + ac_useropt_orig=$ac_useropt + ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` + case $ac_user_opts in + *" +"enable_$ac_useropt" +"*) ;; + *) ac_unrecognized_opts="$ac_unrecognized_opts$ac_unrecognized_sep--disable-$ac_useropt_orig" + ac_unrecognized_sep=', ';; + esac + eval enable_$ac_useropt=no ;; + + -docdir | --docdir | --docdi | --doc | --do) + ac_prev=docdir ;; + -docdir=* | --docdir=* | --docdi=* | --doc=* | --do=*) + docdir=$ac_optarg ;; + + -dvidir | --dvidir | --dvidi | --dvid | --dvi | --dv) + ac_prev=dvidir ;; + -dvidir=* | --dvidir=* | --dvidi=* | --dvid=* | --dvi=* | --dv=*) + dvidir=$ac_optarg ;; + + -enable-* | --enable-*) + ac_useropt=`expr "x$ac_option" : 'x-*enable-\([^=]*\)'` + # Reject names that are not valid shell variable names. + expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && + as_fn_error $? "invalid feature name: $ac_useropt" + ac_useropt_orig=$ac_useropt + ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` + case $ac_user_opts in + *" +"enable_$ac_useropt" +"*) ;; + *) ac_unrecognized_opts="$ac_unrecognized_opts$ac_unrecognized_sep--enable-$ac_useropt_orig" + ac_unrecognized_sep=', ';; + esac + eval enable_$ac_useropt=\$ac_optarg ;; + + -exec-prefix | --exec_prefix | --exec-prefix | --exec-prefi \ + | --exec-pref | --exec-pre | --exec-pr | --exec-p | --exec- \ + | --exec | --exe | --ex) + ac_prev=exec_prefix ;; + -exec-prefix=* | --exec_prefix=* | --exec-prefix=* | --exec-prefi=* \ + | --exec-pref=* | --exec-pre=* | --exec-pr=* | --exec-p=* | --exec-=* \ + | --exec=* | --exe=* | --ex=*) + exec_prefix=$ac_optarg ;; + + -gas | --gas | --ga | --g) + # Obsolete; use --with-gas. 
+ with_gas=yes ;; + + -help | --help | --hel | --he | -h) + ac_init_help=long ;; + -help=r* | --help=r* | --hel=r* | --he=r* | -hr*) + ac_init_help=recursive ;; + -help=s* | --help=s* | --hel=s* | --he=s* | -hs*) + ac_init_help=short ;; + + -host | --host | --hos | --ho) + ac_prev=host_alias ;; + -host=* | --host=* | --hos=* | --ho=*) + host_alias=$ac_optarg ;; + + -htmldir | --htmldir | --htmldi | --htmld | --html | --htm | --ht) + ac_prev=htmldir ;; + -htmldir=* | --htmldir=* | --htmldi=* | --htmld=* | --html=* | --htm=* \ + | --ht=*) + htmldir=$ac_optarg ;; + + -includedir | --includedir | --includedi | --included | --include \ + | --includ | --inclu | --incl | --inc) + ac_prev=includedir ;; + -includedir=* | --includedir=* | --includedi=* | --included=* | --include=* \ + | --includ=* | --inclu=* | --incl=* | --inc=*) + includedir=$ac_optarg ;; + + -infodir | --infodir | --infodi | --infod | --info | --inf) + ac_prev=infodir ;; + -infodir=* | --infodir=* | --infodi=* | --infod=* | --info=* | --inf=*) + infodir=$ac_optarg ;; + + -libdir | --libdir | --libdi | --libd) + ac_prev=libdir ;; + -libdir=* | --libdir=* | --libdi=* | --libd=*) + libdir=$ac_optarg ;; + + -libexecdir | --libexecdir | --libexecdi | --libexecd | --libexec \ + | --libexe | --libex | --libe) + ac_prev=libexecdir ;; + -libexecdir=* | --libexecdir=* | --libexecdi=* | --libexecd=* | --libexec=* \ + | --libexe=* | --libex=* | --libe=*) + libexecdir=$ac_optarg ;; + + -localedir | --localedir | --localedi | --localed | --locale) + ac_prev=localedir ;; + -localedir=* | --localedir=* | --localedi=* | --localed=* | --locale=*) + localedir=$ac_optarg ;; + + -localstatedir | --localstatedir | --localstatedi | --localstated \ + | --localstate | --localstat | --localsta | --localst | --locals) + ac_prev=localstatedir ;; + -localstatedir=* | --localstatedir=* | --localstatedi=* | --localstated=* \ + | --localstate=* | --localstat=* | --localsta=* | --localst=* | --locals=*) + localstatedir=$ac_optarg ;; + + -mandir | --mandir | --mandi | --mand | --man | --ma | --m) + ac_prev=mandir ;; + -mandir=* | --mandir=* | --mandi=* | --mand=* | --man=* | --ma=* | --m=*) + mandir=$ac_optarg ;; + + -nfp | --nfp | --nf) + # Obsolete; use --without-fp. 
+ with_fp=no ;; + + -no-create | --no-create | --no-creat | --no-crea | --no-cre \ + | --no-cr | --no-c | -n) + no_create=yes ;; + + -no-recursion | --no-recursion | --no-recursio | --no-recursi \ + | --no-recurs | --no-recur | --no-recu | --no-rec | --no-re | --no-r) + no_recursion=yes ;; + + -oldincludedir | --oldincludedir | --oldincludedi | --oldincluded \ + | --oldinclude | --oldinclud | --oldinclu | --oldincl | --oldinc \ + | --oldin | --oldi | --old | --ol | --o) + ac_prev=oldincludedir ;; + -oldincludedir=* | --oldincludedir=* | --oldincludedi=* | --oldincluded=* \ + | --oldinclude=* | --oldinclud=* | --oldinclu=* | --oldincl=* | --oldinc=* \ + | --oldin=* | --oldi=* | --old=* | --ol=* | --o=*) + oldincludedir=$ac_optarg ;; + + -prefix | --prefix | --prefi | --pref | --pre | --pr | --p) + ac_prev=prefix ;; + -prefix=* | --prefix=* | --prefi=* | --pref=* | --pre=* | --pr=* | --p=*) + prefix=$ac_optarg ;; + + -program-prefix | --program-prefix | --program-prefi | --program-pref \ + | --program-pre | --program-pr | --program-p) + ac_prev=program_prefix ;; + -program-prefix=* | --program-prefix=* | --program-prefi=* \ + | --program-pref=* | --program-pre=* | --program-pr=* | --program-p=*) + program_prefix=$ac_optarg ;; + + -program-suffix | --program-suffix | --program-suffi | --program-suff \ + | --program-suf | --program-su | --program-s) + ac_prev=program_suffix ;; + -program-suffix=* | --program-suffix=* | --program-suffi=* \ + | --program-suff=* | --program-suf=* | --program-su=* | --program-s=*) + program_suffix=$ac_optarg ;; + + -program-transform-name | --program-transform-name \ + | --program-transform-nam | --program-transform-na \ + | --program-transform-n | --program-transform- \ + | --program-transform | --program-transfor \ + | --program-transfo | --program-transf \ + | --program-trans | --program-tran \ + | --progr-tra | --program-tr | --program-t) + ac_prev=program_transform_name ;; + -program-transform-name=* | --program-transform-name=* \ + | --program-transform-nam=* | --program-transform-na=* \ + | --program-transform-n=* | --program-transform-=* \ + | --program-transform=* | --program-transfor=* \ + | --program-transfo=* | --program-transf=* \ + | --program-trans=* | --program-tran=* \ + | --progr-tra=* | --program-tr=* | --program-t=*) + program_transform_name=$ac_optarg ;; + + -pdfdir | --pdfdir | --pdfdi | --pdfd | --pdf | --pd) + ac_prev=pdfdir ;; + -pdfdir=* | --pdfdir=* | --pdfdi=* | --pdfd=* | --pdf=* | --pd=*) + pdfdir=$ac_optarg ;; + + -psdir | --psdir | --psdi | --psd | --ps) + ac_prev=psdir ;; + -psdir=* | --psdir=* | --psdi=* | --psd=* | --ps=*) + psdir=$ac_optarg ;; + + -q | -quiet | --quiet | --quie | --qui | --qu | --q \ + | -silent | --silent | --silen | --sile | --sil) + silent=yes ;; + + -runstatedir | --runstatedir | --runstatedi | --runstated \ + | --runstate | --runstat | --runsta | --runst | --runs \ + | --run | --ru | --r) + ac_prev=runstatedir ;; + -runstatedir=* | --runstatedir=* | --runstatedi=* | --runstated=* \ + | --runstate=* | --runstat=* | --runsta=* | --runst=* | --runs=* \ + | --run=* | --ru=* | --r=*) + runstatedir=$ac_optarg ;; + + -sbindir | --sbindir | --sbindi | --sbind | --sbin | --sbi | --sb) + ac_prev=sbindir ;; + -sbindir=* | --sbindir=* | --sbindi=* | --sbind=* | --sbin=* \ + | --sbi=* | --sb=*) + sbindir=$ac_optarg ;; + + -sharedstatedir | --sharedstatedir | --sharedstatedi \ + | --sharedstated | --sharedstate | --sharedstat | --sharedsta \ + | --sharedst | --shareds | --shared | --share | --shar \ + | --sha | --sh) + 
ac_prev=sharedstatedir ;; + -sharedstatedir=* | --sharedstatedir=* | --sharedstatedi=* \ + | --sharedstated=* | --sharedstate=* | --sharedstat=* | --sharedsta=* \ + | --sharedst=* | --shareds=* | --shared=* | --share=* | --shar=* \ + | --sha=* | --sh=*) + sharedstatedir=$ac_optarg ;; + + -site | --site | --sit) + ac_prev=site ;; + -site=* | --site=* | --sit=*) + site=$ac_optarg ;; + + -srcdir | --srcdir | --srcdi | --srcd | --src | --sr) + ac_prev=srcdir ;; + -srcdir=* | --srcdir=* | --srcdi=* | --srcd=* | --src=* | --sr=*) + srcdir=$ac_optarg ;; + + -sysconfdir | --sysconfdir | --sysconfdi | --sysconfd | --sysconf \ + | --syscon | --sysco | --sysc | --sys | --sy) + ac_prev=sysconfdir ;; + -sysconfdir=* | --sysconfdir=* | --sysconfdi=* | --sysconfd=* | --sysconf=* \ + | --syscon=* | --sysco=* | --sysc=* | --sys=* | --sy=*) + sysconfdir=$ac_optarg ;; + + -target | --target | --targe | --targ | --tar | --ta | --t) + ac_prev=target_alias ;; + -target=* | --target=* | --targe=* | --targ=* | --tar=* | --ta=* | --t=*) + target_alias=$ac_optarg ;; + + -v | -verbose | --verbose | --verbos | --verbo | --verb) + verbose=yes ;; + + -version | --version | --versio | --versi | --vers | -V) + ac_init_version=: ;; + + -with-* | --with-*) + ac_useropt=`expr "x$ac_option" : 'x-*with-\([^=]*\)'` + # Reject names that are not valid shell variable names. + expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && + as_fn_error $? "invalid package name: $ac_useropt" + ac_useropt_orig=$ac_useropt + ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` + case $ac_user_opts in + *" +"with_$ac_useropt" +"*) ;; + *) ac_unrecognized_opts="$ac_unrecognized_opts$ac_unrecognized_sep--with-$ac_useropt_orig" + ac_unrecognized_sep=', ';; + esac + eval with_$ac_useropt=\$ac_optarg ;; + + -without-* | --without-*) + ac_useropt=`expr "x$ac_option" : 'x-*without-\(.*\)'` + # Reject names that are not valid shell variable names. + expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && + as_fn_error $? "invalid package name: $ac_useropt" + ac_useropt_orig=$ac_useropt + ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` + case $ac_user_opts in + *" +"with_$ac_useropt" +"*) ;; + *) ac_unrecognized_opts="$ac_unrecognized_opts$ac_unrecognized_sep--without-$ac_useropt_orig" + ac_unrecognized_sep=', ';; + esac + eval with_$ac_useropt=no ;; + + --x) + # Obsolete; use --with-x. + with_x=yes ;; + + -x-includes | --x-includes | --x-include | --x-includ | --x-inclu \ + | --x-incl | --x-inc | --x-in | --x-i) + ac_prev=x_includes ;; + -x-includes=* | --x-includes=* | --x-include=* | --x-includ=* | --x-inclu=* \ + | --x-incl=* | --x-inc=* | --x-in=* | --x-i=*) + x_includes=$ac_optarg ;; + + -x-libraries | --x-libraries | --x-librarie | --x-librari \ + | --x-librar | --x-libra | --x-libr | --x-lib | --x-li | --x-l) + ac_prev=x_libraries ;; + -x-libraries=* | --x-libraries=* | --x-librarie=* | --x-librari=* \ + | --x-librar=* | --x-libra=* | --x-libr=* | --x-lib=* | --x-li=* | --x-l=*) + x_libraries=$ac_optarg ;; + + -*) as_fn_error $? "unrecognized option: \`$ac_option' +Try \`$0 --help' for more information" + ;; + + *=*) + ac_envvar=`expr "x$ac_option" : 'x\([^=]*\)='` + # Reject names that are not valid shell variable names. + case $ac_envvar in #( + '' | [0-9]* | *[!_$as_cr_alnum]* ) + as_fn_error $? "invalid variable name: \`$ac_envvar'" ;; + esac + eval $ac_envvar=\$ac_optarg + export $ac_envvar ;; + + *) + # FIXME: should be removed in autoconf 3.0. 
+ $as_echo "$as_me: WARNING: you should use --build, --host, --target" >&2 + expr "x$ac_option" : ".*[^-._$as_cr_alnum]" >/dev/null && + $as_echo "$as_me: WARNING: invalid host type: $ac_option" >&2 + : "${build_alias=$ac_option} ${host_alias=$ac_option} ${target_alias=$ac_option}" + ;; + + esac +done + +if test -n "$ac_prev"; then + ac_option=--`echo $ac_prev | sed 's/_/-/g'` + as_fn_error $? "missing argument to $ac_option" +fi + +if test -n "$ac_unrecognized_opts"; then + case $enable_option_checking in + no) ;; + fatal) as_fn_error $? "unrecognized options: $ac_unrecognized_opts" ;; + *) $as_echo "$as_me: WARNING: unrecognized options: $ac_unrecognized_opts" >&2 ;; + esac +fi + +# Check all directory arguments for consistency. +for ac_var in exec_prefix prefix bindir sbindir libexecdir datarootdir \ + datadir sysconfdir sharedstatedir localstatedir includedir \ + oldincludedir docdir infodir htmldir dvidir pdfdir psdir \ + libdir localedir mandir runstatedir +do + eval ac_val=\$$ac_var + # Remove trailing slashes. + case $ac_val in + */ ) + ac_val=`expr "X$ac_val" : 'X\(.*[^/]\)' \| "X$ac_val" : 'X\(.*\)'` + eval $ac_var=\$ac_val;; + esac + # Be sure to have absolute directory names. + case $ac_val in + [\\/$]* | ?:[\\/]* ) continue;; + NONE | '' ) case $ac_var in *prefix ) continue;; esac;; + esac + as_fn_error $? "expected an absolute directory name for --$ac_var: $ac_val" +done + +# There might be people who depend on the old broken behavior: `$host' +# used to hold the argument of --host etc. +# FIXME: To remove some day. +build=$build_alias +host=$host_alias +target=$target_alias + +# FIXME: To remove some day. +if test "x$host_alias" != x; then + if test "x$build_alias" = x; then + cross_compiling=maybe + elif test "x$build_alias" != "x$host_alias"; then + cross_compiling=yes + fi +fi + +ac_tool_prefix= +test -n "$host_alias" && ac_tool_prefix=$host_alias- + +test "$silent" = yes && exec 6>/dev/null + + +ac_pwd=`pwd` && test -n "$ac_pwd" && +ac_ls_di=`ls -di .` && +ac_pwd_ls_di=`cd "$ac_pwd" && ls -di .` || + as_fn_error $? "working directory cannot be determined" +test "X$ac_ls_di" = "X$ac_pwd_ls_di" || + as_fn_error $? "pwd does not report name of working directory" + + +# Find the source files, if location was not specified. +if test -z "$srcdir"; then + ac_srcdir_defaulted=yes + # Try the directory containing this script, then the parent directory. + ac_confdir=`$as_dirname -- "$as_myself" || +$as_expr X"$as_myself" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \ + X"$as_myself" : 'X\(//\)[^/]' \| \ + X"$as_myself" : 'X\(//\)$' \| \ + X"$as_myself" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X"$as_myself" | + sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{ + s//\1/ + q + } + /^X\(\/\/\)[^/].*/{ + s//\1/ + q + } + /^X\(\/\/\)$/{ + s//\1/ + q + } + /^X\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + srcdir=$ac_confdir + if test ! -r "$srcdir/$ac_unique_file"; then + srcdir=.. + fi +else + ac_srcdir_defaulted=no +fi +if test ! -r "$srcdir/$ac_unique_file"; then + test "$ac_srcdir_defaulted" = yes && srcdir="$ac_confdir or .." + as_fn_error $? "cannot find sources ($ac_unique_file) in $srcdir" +fi +ac_msg="sources are in $srcdir, but \`cd $srcdir' does not work" +ac_abs_confdir=`( + cd "$srcdir" && test -r "./$ac_unique_file" || as_fn_error $? "$ac_msg" + pwd)` +# When building in place, set srcdir=. +if test "$ac_abs_confdir" = "$ac_pwd"; then + srcdir=. +fi +# Remove unnecessary trailing slashes from srcdir. +# Double slashes in file names in object file debugging info +# mess up M-x gdb in Emacs. 
+case $srcdir in +*/) srcdir=`expr "X$srcdir" : 'X\(.*[^/]\)' \| "X$srcdir" : 'X\(.*\)'`;; +esac +for ac_var in $ac_precious_vars; do + eval ac_env_${ac_var}_set=\${${ac_var}+set} + eval ac_env_${ac_var}_value=\$${ac_var} + eval ac_cv_env_${ac_var}_set=\${${ac_var}+set} + eval ac_cv_env_${ac_var}_value=\$${ac_var} +done + +# +# Report the --help message. +# +if test "$ac_init_help" = "long"; then + # Omit some internal or obsolete options to make the list less imposing. + # This message is too long to be a string in the A/UX 3.1 sh. + cat <<_ACEOF +\`configure' configures liblognorm 2.0.5 to adapt to many kinds of systems. + +Usage: $0 [OPTION]... [VAR=VALUE]... + +To assign environment variables (e.g., CC, CFLAGS...), specify them as +VAR=VALUE. See below for descriptions of some of the useful variables. + +Defaults for the options are specified in brackets. + +Configuration: + -h, --help display this help and exit + --help=short display options specific to this package + --help=recursive display the short help of all the included packages + -V, --version display version information and exit + -q, --quiet, --silent do not print \`checking ...' messages + --cache-file=FILE cache test results in FILE [disabled] + -C, --config-cache alias for \`--cache-file=config.cache' + -n, --no-create do not create output files + --srcdir=DIR find the sources in DIR [configure dir or \`..'] + +Installation directories: + --prefix=PREFIX install architecture-independent files in PREFIX + [$ac_default_prefix] + --exec-prefix=EPREFIX install architecture-dependent files in EPREFIX + [PREFIX] + +By default, \`make install' will install all the files in +\`$ac_default_prefix/bin', \`$ac_default_prefix/lib' etc. You can specify +an installation prefix other than \`$ac_default_prefix' using \`--prefix', +for instance \`--prefix=\$HOME'. + +For better control, use the options below. 
+ +Fine tuning of the installation directories: + --bindir=DIR user executables [EPREFIX/bin] + --sbindir=DIR system admin executables [EPREFIX/sbin] + --libexecdir=DIR program executables [EPREFIX/libexec] + --sysconfdir=DIR read-only single-machine data [PREFIX/etc] + --sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com] + --localstatedir=DIR modifiable single-machine data [PREFIX/var] + --runstatedir=DIR modifiable per-process data [LOCALSTATEDIR/run] + --libdir=DIR object code libraries [EPREFIX/lib] + --includedir=DIR C header files [PREFIX/include] + --oldincludedir=DIR C header files for non-gcc [/usr/include] + --datarootdir=DIR read-only arch.-independent data root [PREFIX/share] + --datadir=DIR read-only architecture-independent data [DATAROOTDIR] + --infodir=DIR info documentation [DATAROOTDIR/info] + --localedir=DIR locale-dependent data [DATAROOTDIR/locale] + --mandir=DIR man documentation [DATAROOTDIR/man] + --docdir=DIR documentation root [DATAROOTDIR/doc/liblognorm] + --htmldir=DIR html documentation [DOCDIR] + --dvidir=DIR dvi documentation [DOCDIR] + --pdfdir=DIR pdf documentation [DOCDIR] + --psdir=DIR ps documentation [DOCDIR] +_ACEOF + + cat <<\_ACEOF + +Program names: + --program-prefix=PREFIX prepend PREFIX to installed program names + --program-suffix=SUFFIX append SUFFIX to installed program names + --program-transform-name=PROGRAM run sed PROGRAM on installed program names + +System types: + --build=BUILD configure for building on BUILD [guessed] + --host=HOST cross-compile to build programs to run on HOST [BUILD] +_ACEOF +fi + +if test -n "$ac_init_help"; then + case $ac_init_help in + short | recursive ) echo "Configuration of liblognorm 2.0.5:";; + esac + cat <<\_ACEOF + +Optional Features: + --disable-option-checking ignore unrecognized --enable/--with options + --disable-FEATURE do not include FEATURE (same as --enable-FEATURE=no) + --enable-FEATURE[=ARG] include FEATURE [ARG=yes] + --enable-silent-rules less verbose build output (undo: "make V=1") + --disable-silent-rules verbose build output (undo: "make V=0") + --enable-dependency-tracking + do not reject slow dependency extractors + --disable-dependency-tracking + speeds up one-time build + --enable-shared[=PKGS] build shared libraries [default=yes] + --enable-static[=PKGS] build static libraries [default=yes] + --enable-fast-install[=PKGS] + optimize for fast installation [default=yes] + --disable-libtool-lock avoid locking (might break parallel builds) + --enable-compile-warnings=[no/yes/error] + Enable compiler warnings and errors + --disable-Werror Unconditionally make all compiler warnings non-fatal + --enable-regexp Enable regular expressions support [default=no] + --enable-debug Enable debug mode [default=no] + --enable-advanced-stats Enable advanced statistics [default=no] + --enable-docs Enable building HTML docs (requires Sphinx) + [default=no] + --enable-testbench testbench enabled [default=yes] + --enable-valgrind valgrind enabled [default=no] + --enable-tools lognorm toolset enabled [default=yes] + +Optional Packages: + --with-PACKAGE[=ARG] use PACKAGE [ARG=yes] + --without-PACKAGE do not use PACKAGE (same as --with-PACKAGE=no) + --with-pic[=PKGS] try to use only PIC/non-PIC objects [default=use + both] + --with-aix-soname=aix|svr4|both + shared library versioning (aka "SONAME") variant to + provide on AIX, [default=aix]. 
+ --with-gnu-ld assume the C compiler uses GNU ld [default=no] + --with-sysroot[=DIR] Search for dependent libraries within DIR (or the + compiler's sysroot if not specified). + +Some influential environment variables: + CC C compiler command + CFLAGS C compiler flags + LDFLAGS linker flags, e.g. -L if you have libraries in a + nonstandard directory + LIBS libraries to pass to the linker, e.g. -l + CPPFLAGS (Objective) C/C++ preprocessor flags, e.g. -I if + you have headers in a nonstandard directory + CPP C preprocessor + LT_SYS_LIBRARY_PATH + User-defined run-time library search path. + PKG_CONFIG path to pkg-config utility + PKG_CONFIG_PATH + directories to add to pkg-config's search path + PKG_CONFIG_LIBDIR + path overriding pkg-config's built-in search path + LIBESTR_CFLAGS + C compiler flags for LIBESTR, overriding pkg-config + LIBESTR_LIBS + linker flags for LIBESTR, overriding pkg-config + JSON_C_CFLAGS + C compiler flags for JSON_C, overriding pkg-config + JSON_C_LIBS linker flags for JSON_C, overriding pkg-config + PCRE_CFLAGS C compiler flags for PCRE, overriding pkg-config + PCRE_LIBS linker flags for PCRE, overriding pkg-config + +Use these variables to override the choices made by `configure' or to help +it to find libraries and programs with nonstandard names/locations. + +Report bugs to . +_ACEOF +ac_status=$? +fi + +if test "$ac_init_help" = "recursive"; then + # If there are subdirs, report their specific --help. + for ac_dir in : $ac_subdirs_all; do test "x$ac_dir" = x: && continue + test -d "$ac_dir" || + { cd "$srcdir" && ac_pwd=`pwd` && srcdir=. && test -d "$ac_dir"; } || + continue + ac_builddir=. + +case "$ac_dir" in +.) ac_dir_suffix= ac_top_builddir_sub=. ac_top_build_prefix= ;; +*) + ac_dir_suffix=/`$as_echo "$ac_dir" | sed 's|^\.[\\/]||'` + # A ".." for each directory in $ac_dir_suffix. + ac_top_builddir_sub=`$as_echo "$ac_dir_suffix" | sed 's|/[^\\/]*|/..|g;s|/||'` + case $ac_top_builddir_sub in + "") ac_top_builddir_sub=. ac_top_build_prefix= ;; + *) ac_top_build_prefix=$ac_top_builddir_sub/ ;; + esac ;; +esac +ac_abs_top_builddir=$ac_pwd +ac_abs_builddir=$ac_pwd$ac_dir_suffix +# for backward compatibility: +ac_top_builddir=$ac_top_build_prefix + +case $srcdir in + .) # We are building in place. + ac_srcdir=. + ac_top_srcdir=$ac_top_builddir_sub + ac_abs_top_srcdir=$ac_pwd ;; + [\\/]* | ?:[\\/]* ) # Absolute name. + ac_srcdir=$srcdir$ac_dir_suffix; + ac_top_srcdir=$srcdir + ac_abs_top_srcdir=$srcdir ;; + *) # Relative name. + ac_srcdir=$ac_top_build_prefix$srcdir$ac_dir_suffix + ac_top_srcdir=$ac_top_build_prefix$srcdir + ac_abs_top_srcdir=$ac_pwd/$srcdir ;; +esac +ac_abs_srcdir=$ac_abs_top_srcdir$ac_dir_suffix + + cd "$ac_dir" || { ac_status=$?; continue; } + # Check for guested configure. + if test -f "$ac_srcdir/configure.gnu"; then + echo && + $SHELL "$ac_srcdir/configure.gnu" --help=recursive + elif test -f "$ac_srcdir/configure"; then + echo && + $SHELL "$ac_srcdir/configure" --help=recursive + else + $as_echo "$as_me: WARNING: no configuration information is in $ac_dir" >&2 + fi || ac_status=$? + cd "$ac_pwd" || { ac_status=$?; break; } + done +fi + +test -n "$ac_init_help" && exit $ac_status +if $ac_init_version; then + cat <<\_ACEOF +liblognorm configure 2.0.5 +generated by GNU Autoconf 2.69 + +Copyright (C) 2012 Free Software Foundation, Inc. +This configure script is free software; the Free Software Foundation +gives unlimited permission to copy, distribute and modify it. 
+_ACEOF + exit +fi + +## ------------------------ ## +## Autoconf initialization. ## +## ------------------------ ## + +# ac_fn_c_try_compile LINENO +# -------------------------- +# Try to compile conftest.$ac_ext, and return whether this succeeded. +ac_fn_c_try_compile () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + rm -f conftest.$ac_objext + if { { ac_try="$ac_compile" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_compile") 2>conftest.err + ac_status=$? + if test -s conftest.err; then + grep -v '^ *+' conftest.err >conftest.er1 + cat conftest.er1 >&5 + mv -f conftest.er1 conftest.err + fi + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } && { + test -z "$ac_c_werror_flag" || + test ! -s conftest.err + } && test -s conftest.$ac_objext; then : + ac_retval=0 +else + $as_echo "$as_me: failed program was:" >&5 +sed 's/^/| /' conftest.$ac_ext >&5 + + ac_retval=1 +fi + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + as_fn_set_status $ac_retval + +} # ac_fn_c_try_compile + +# ac_fn_c_try_cpp LINENO +# ---------------------- +# Try to preprocess conftest.$ac_ext, and return whether this succeeded. +ac_fn_c_try_cpp () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + if { { ac_try="$ac_cpp conftest.$ac_ext" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_cpp conftest.$ac_ext") 2>conftest.err + ac_status=$? + if test -s conftest.err; then + grep -v '^ *+' conftest.err >conftest.er1 + cat conftest.er1 >&5 + mv -f conftest.er1 conftest.err + fi + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } > conftest.i && { + test -z "$ac_c_preproc_warn_flag$ac_c_werror_flag" || + test ! -s conftest.err + }; then : + ac_retval=0 +else + $as_echo "$as_me: failed program was:" >&5 +sed 's/^/| /' conftest.$ac_ext >&5 + + ac_retval=1 +fi + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + as_fn_set_status $ac_retval + +} # ac_fn_c_try_cpp + +# ac_fn_c_check_header_mongrel LINENO HEADER VAR INCLUDES +# ------------------------------------------------------- +# Tests whether HEADER exists, giving a warning if it cannot be compiled using +# the include files in INCLUDES and setting the cache variable VAR +# accordingly. +ac_fn_c_check_header_mongrel () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + if eval \${$3+:} false; then : + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 +$as_echo_n "checking for $2... " >&6; } +if eval \${$3+:} false; then : + $as_echo_n "(cached) " >&6 +fi +eval ac_res=\$$3 + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +else + # Is the header compilable? +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking $2 usability" >&5 +$as_echo_n "checking $2 usability... " >&6; } +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/ +$4 +#include <$2> +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_header_compiler=yes +else + ac_header_compiler=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_header_compiler" >&5 +$as_echo "$ac_header_compiler" >&6; } + +# Is the header present? +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking $2 presence" >&5 +$as_echo_n "checking $2 presence... " >&6; } +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include <$2> +_ACEOF +if ac_fn_c_try_cpp "$LINENO"; then : + ac_header_preproc=yes +else + ac_header_preproc=no +fi +rm -f conftest.err conftest.i conftest.$ac_ext +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_header_preproc" >&5 +$as_echo "$ac_header_preproc" >&6; } + +# So? What about this header? +case $ac_header_compiler:$ac_header_preproc:$ac_c_preproc_warn_flag in #(( + yes:no: ) + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: accepted by the compiler, rejected by the preprocessor!" >&5 +$as_echo "$as_me: WARNING: $2: accepted by the compiler, rejected by the preprocessor!" >&2;} + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: proceeding with the compiler's result" >&5 +$as_echo "$as_me: WARNING: $2: proceeding with the compiler's result" >&2;} + ;; + no:yes:* ) + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: present but cannot be compiled" >&5 +$as_echo "$as_me: WARNING: $2: present but cannot be compiled" >&2;} + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: check for missing prerequisite headers?" >&5 +$as_echo "$as_me: WARNING: $2: check for missing prerequisite headers?" >&2;} + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: see the Autoconf documentation" >&5 +$as_echo "$as_me: WARNING: $2: see the Autoconf documentation" >&2;} + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: section \"Present But Cannot Be Compiled\"" >&5 +$as_echo "$as_me: WARNING: $2: section \"Present But Cannot Be Compiled\"" >&2;} + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: proceeding with the compiler's result" >&5 +$as_echo "$as_me: WARNING: $2: proceeding with the compiler's result" >&2;} +( $as_echo "## ------------------------------------ ## +## Report this to rgerhards@adiscon.com ## +## ------------------------------------ ##" + ) | sed "s/^/$as_me: WARNING: /" >&2 + ;; +esac + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 +$as_echo_n "checking for $2... " >&6; } +if eval \${$3+:} false; then : + $as_echo_n "(cached) " >&6 +else + eval "$3=\$ac_header_compiler" +fi +eval ac_res=\$$3 + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +fi + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + +} # ac_fn_c_check_header_mongrel + +# ac_fn_c_try_run LINENO +# ---------------------- +# Try to link conftest.$ac_ext, and return whether this succeeded. Assumes +# that executables *can* be run. +ac_fn_c_try_run () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + if { { ac_try="$ac_link" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_link") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5 + test $ac_status = 0; } && { ac_try='./conftest$ac_exeext' + { { case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_try") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; }; then : + ac_retval=0 +else + $as_echo "$as_me: program exited with status $ac_status" >&5 + $as_echo "$as_me: failed program was:" >&5 +sed 's/^/| /' conftest.$ac_ext >&5 + + ac_retval=$ac_status +fi + rm -rf conftest.dSYM conftest_ipa8_conftest.oo + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + as_fn_set_status $ac_retval + +} # ac_fn_c_try_run + +# ac_fn_c_check_header_compile LINENO HEADER VAR INCLUDES +# ------------------------------------------------------- +# Tests whether HEADER exists and can be compiled using the include files in +# INCLUDES, setting the cache variable VAR accordingly. +ac_fn_c_check_header_compile () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 +$as_echo_n "checking for $2... " >&6; } +if eval \${$3+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +$4 +#include <$2> +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + eval "$3=yes" +else + eval "$3=no" +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +fi +eval ac_res=\$$3 + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + +} # ac_fn_c_check_header_compile + +# ac_fn_c_try_link LINENO +# ----------------------- +# Try to link conftest.$ac_ext, and return whether this succeeded. +ac_fn_c_try_link () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + rm -f conftest.$ac_objext conftest$ac_exeext + if { { ac_try="$ac_link" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_link") 2>conftest.err + ac_status=$? + if test -s conftest.err; then + grep -v '^ *+' conftest.err >conftest.er1 + cat conftest.er1 >&5 + mv -f conftest.er1 conftest.err + fi + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } && { + test -z "$ac_c_werror_flag" || + test ! -s conftest.err + } && test -s conftest$ac_exeext && { + test "$cross_compiling" = yes || + test -x conftest$ac_exeext + }; then : + ac_retval=0 +else + $as_echo "$as_me: failed program was:" >&5 +sed 's/^/| /' conftest.$ac_ext >&5 + + ac_retval=1 +fi + # Delete the IPA/IPO (Inter Procedural Analysis/Optimization) information + # created by the PGI compiler (conftest_ipa8_conftest.oo), as it would + # interfere with the next link command; also delete a directory that is + # left behind by Apple's compiler. We do this before executing the actions. 
+ rm -rf conftest.dSYM conftest_ipa8_conftest.oo + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + as_fn_set_status $ac_retval + +} # ac_fn_c_try_link + +# ac_fn_c_check_func LINENO FUNC VAR +# ---------------------------------- +# Tests whether FUNC exists, setting the cache variable VAR accordingly +ac_fn_c_check_func () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 +$as_echo_n "checking for $2... " >&6; } +if eval \${$3+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +/* Define $2 to an innocuous variant, in case declares $2. + For example, HP-UX 11i declares gettimeofday. */ +#define $2 innocuous_$2 + +/* System header to define __stub macros and hopefully few prototypes, + which can conflict with char $2 (); below. + Prefer to if __STDC__ is defined, since + exists even on freestanding compilers. */ + +#ifdef __STDC__ +# include +#else +# include +#endif + +#undef $2 + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +#ifdef __cplusplus +extern "C" +#endif +char $2 (); +/* The GNU C library defines this for functions which it implements + to always fail with ENOSYS. Some functions are actually named + something starting with __ and the normal name is an alias. */ +#if defined __stub_$2 || defined __stub___$2 +choke me +#endif + +int +main () +{ +return $2 (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + eval "$3=yes" +else + eval "$3=no" +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +fi +eval ac_res=\$$3 + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + +} # ac_fn_c_check_func + +# ac_fn_c_check_type LINENO TYPE VAR INCLUDES +# ------------------------------------------- +# Tests whether TYPE exists after having included INCLUDES, setting cache +# variable VAR accordingly. +ac_fn_c_check_type () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 +$as_echo_n "checking for $2... " >&6; } +if eval \${$3+:} false; then : + $as_echo_n "(cached) " >&6 +else + eval "$3=no" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +$4 +int +main () +{ +if (sizeof ($2)) + return 0; + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +$4 +int +main () +{ +if (sizeof (($2))) + return 0; + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + +else + eval "$3=yes" +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +fi +eval ac_res=\$$3 + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + +} # ac_fn_c_check_type + +# ac_fn_c_check_decl LINENO SYMBOL VAR INCLUDES +# --------------------------------------------- +# Tests whether SYMBOL is declared in INCLUDES, setting cache variable VAR +# accordingly. 
+ac_fn_c_check_decl () +{ + as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + as_decl_name=`echo $2|sed 's/ *(.*//'` + as_decl_use=`echo $2|sed -e 's/(/((/' -e 's/)/) 0&/' -e 's/,/) 0& (/g'` + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $as_decl_name is declared" >&5 +$as_echo_n "checking whether $as_decl_name is declared... " >&6; } +if eval \${$3+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +$4 +int +main () +{ +#ifndef $as_decl_name +#ifdef __cplusplus + (void) $as_decl_use; +#else + (void) $as_decl_name; +#endif +#endif + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + eval "$3=yes" +else + eval "$3=no" +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +fi +eval ac_res=\$$3 + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno + +} # ac_fn_c_check_decl +cat >config.log <<_ACEOF +This file contains any messages produced by compilers while +running configure, to aid debugging if configure makes a mistake. + +It was created by liblognorm $as_me 2.0.5, which was +generated by GNU Autoconf 2.69. Invocation command line was + + $ $0 $@ + +_ACEOF +exec 5>>config.log +{ +cat <<_ASUNAME +## --------- ## +## Platform. ## +## --------- ## + +hostname = `(hostname || uname -n) 2>/dev/null | sed 1q` +uname -m = `(uname -m) 2>/dev/null || echo unknown` +uname -r = `(uname -r) 2>/dev/null || echo unknown` +uname -s = `(uname -s) 2>/dev/null || echo unknown` +uname -v = `(uname -v) 2>/dev/null || echo unknown` + +/usr/bin/uname -p = `(/usr/bin/uname -p) 2>/dev/null || echo unknown` +/bin/uname -X = `(/bin/uname -X) 2>/dev/null || echo unknown` + +/bin/arch = `(/bin/arch) 2>/dev/null || echo unknown` +/usr/bin/arch -k = `(/usr/bin/arch -k) 2>/dev/null || echo unknown` +/usr/convex/getsysinfo = `(/usr/convex/getsysinfo) 2>/dev/null || echo unknown` +/usr/bin/hostinfo = `(/usr/bin/hostinfo) 2>/dev/null || echo unknown` +/bin/machine = `(/bin/machine) 2>/dev/null || echo unknown` +/usr/bin/oslevel = `(/usr/bin/oslevel) 2>/dev/null || echo unknown` +/bin/universe = `(/bin/universe) 2>/dev/null || echo unknown` + +_ASUNAME + +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + $as_echo "PATH: $as_dir" + done +IFS=$as_save_IFS + +} >&5 + +cat >&5 <<_ACEOF + + +## ----------- ## +## Core tests. ## +## ----------- ## + +_ACEOF + + +# Keep a trace of the command line. +# Strip out --no-create and --no-recursion so they do not pile up. +# Strip out --silent because we don't want to record it for future runs. +# Also quote any args containing shell meta-characters. +# Make two passes to allow for proper duplicate-argument suppression. +ac_configure_args= +ac_configure_args0= +ac_configure_args1= +ac_must_keep_next=false +for ac_pass in 1 2 +do + for ac_arg + do + case $ac_arg in + -no-create | --no-c* | -n | -no-recursion | --no-r*) continue ;; + -q | -quiet | --quiet | --quie | --qui | --qu | --q \ + | -silent | --silent | --silen | --sile | --sil) + continue ;; + *\'*) + ac_arg=`$as_echo "$ac_arg" | sed "s/'/'\\\\\\\\''/g"` ;; + esac + case $ac_pass in + 1) as_fn_append ac_configure_args0 " '$ac_arg'" ;; + 2) + as_fn_append ac_configure_args1 " '$ac_arg'" + if test $ac_must_keep_next = true; then + ac_must_keep_next=false # Got value, back to normal. 
+ else + case $ac_arg in + *=* | --config-cache | -C | -disable-* | --disable-* \ + | -enable-* | --enable-* | -gas | --g* | -nfp | --nf* \ + | -q | -quiet | --q* | -silent | --sil* | -v | -verb* \ + | -with-* | --with-* | -without-* | --without-* | --x) + case "$ac_configure_args0 " in + "$ac_configure_args1"*" '$ac_arg' "* ) continue ;; + esac + ;; + -* ) ac_must_keep_next=true ;; + esac + fi + as_fn_append ac_configure_args " '$ac_arg'" + ;; + esac + done +done +{ ac_configure_args0=; unset ac_configure_args0;} +{ ac_configure_args1=; unset ac_configure_args1;} + +# When interrupted or exit'd, cleanup temporary files, and complete +# config.log. We remove comments because anyway the quotes in there +# would cause problems or look ugly. +# WARNING: Use '\'' to represent an apostrophe within the trap. +# WARNING: Do not start the trap code with a newline, due to a FreeBSD 4.0 bug. +trap 'exit_status=$? + # Save into config.log some information that might help in debugging. + { + echo + + $as_echo "## ---------------- ## +## Cache variables. ## +## ---------------- ##" + echo + # The following way of writing the cache mishandles newlines in values, +( + for ac_var in `(set) 2>&1 | sed -n '\''s/^\([a-zA-Z_][a-zA-Z0-9_]*\)=.*/\1/p'\''`; do + eval ac_val=\$$ac_var + case $ac_val in #( + *${as_nl}*) + case $ac_var in #( + *_cv_*) { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: cache variable $ac_var contains a newline" >&5 +$as_echo "$as_me: WARNING: cache variable $ac_var contains a newline" >&2;} ;; + esac + case $ac_var in #( + _ | IFS | as_nl) ;; #( + BASH_ARGV | BASH_SOURCE) eval $ac_var= ;; #( + *) { eval $ac_var=; unset $ac_var;} ;; + esac ;; + esac + done + (set) 2>&1 | + case $as_nl`(ac_space='\'' '\''; set) 2>&1` in #( + *${as_nl}ac_space=\ *) + sed -n \ + "s/'\''/'\''\\\\'\'''\''/g; + s/^\\([_$as_cr_alnum]*_cv_[_$as_cr_alnum]*\\)=\\(.*\\)/\\1='\''\\2'\''/p" + ;; #( + *) + sed -n "/^[_$as_cr_alnum]*_cv_[_$as_cr_alnum]*=/p" + ;; + esac | + sort +) + echo + + $as_echo "## ----------------- ## +## Output variables. ## +## ----------------- ##" + echo + for ac_var in $ac_subst_vars + do + eval ac_val=\$$ac_var + case $ac_val in + *\'\''*) ac_val=`$as_echo "$ac_val" | sed "s/'\''/'\''\\\\\\\\'\'''\''/g"`;; + esac + $as_echo "$ac_var='\''$ac_val'\''" + done | sort + echo + + if test -n "$ac_subst_files"; then + $as_echo "## ------------------- ## +## File substitutions. ## +## ------------------- ##" + echo + for ac_var in $ac_subst_files + do + eval ac_val=\$$ac_var + case $ac_val in + *\'\''*) ac_val=`$as_echo "$ac_val" | sed "s/'\''/'\''\\\\\\\\'\'''\''/g"`;; + esac + $as_echo "$ac_var='\''$ac_val'\''" + done | sort + echo + fi + + if test -s confdefs.h; then + $as_echo "## ----------- ## +## confdefs.h. ## +## ----------- ##" + echo + cat confdefs.h + echo + fi + test "$ac_signal" != 0 && + $as_echo "$as_me: caught signal $ac_signal" + $as_echo "$as_me: exit $exit_status" + } >&5 + rm -f core *.core core.conftest.* && + rm -f -r conftest* confdefs* conf$$* $ac_clean_files && + exit $exit_status +' 0 +for ac_signal in 1 2 13 15; do + trap 'ac_signal='$ac_signal'; as_fn_exit 1' $ac_signal +done +ac_signal=0 + +# confdefs.h avoids OS command line length limits that DEFS can exceed. +rm -f -r conftest* confdefs.h + +$as_echo "/* confdefs.h */" > confdefs.h + +# Predefined preprocessor variables. 
+ +cat >>confdefs.h <<_ACEOF +#define PACKAGE_NAME "$PACKAGE_NAME" +_ACEOF + +cat >>confdefs.h <<_ACEOF +#define PACKAGE_TARNAME "$PACKAGE_TARNAME" +_ACEOF + +cat >>confdefs.h <<_ACEOF +#define PACKAGE_VERSION "$PACKAGE_VERSION" +_ACEOF + +cat >>confdefs.h <<_ACEOF +#define PACKAGE_STRING "$PACKAGE_STRING" +_ACEOF + +cat >>confdefs.h <<_ACEOF +#define PACKAGE_BUGREPORT "$PACKAGE_BUGREPORT" +_ACEOF + +cat >>confdefs.h <<_ACEOF +#define PACKAGE_URL "$PACKAGE_URL" +_ACEOF + + +# Let the site file select an alternate cache file if it wants to. +# Prefer an explicitly selected file to automatically selected ones. +ac_site_file1=NONE +ac_site_file2=NONE +if test -n "$CONFIG_SITE"; then + # We do not want a PATH search for config.site. + case $CONFIG_SITE in #(( + -*) ac_site_file1=./$CONFIG_SITE;; + */*) ac_site_file1=$CONFIG_SITE;; + *) ac_site_file1=./$CONFIG_SITE;; + esac +elif test "x$prefix" != xNONE; then + ac_site_file1=$prefix/share/config.site + ac_site_file2=$prefix/etc/config.site +else + ac_site_file1=$ac_default_prefix/share/config.site + ac_site_file2=$ac_default_prefix/etc/config.site +fi +for ac_site_file in "$ac_site_file1" "$ac_site_file2" +do + test "x$ac_site_file" = xNONE && continue + if test /dev/null != "$ac_site_file" && test -r "$ac_site_file"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: loading site script $ac_site_file" >&5 +$as_echo "$as_me: loading site script $ac_site_file" >&6;} + sed 's/^/| /' "$ac_site_file" >&5 + . "$ac_site_file" \ + || { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "failed to load site script $ac_site_file +See \`config.log' for more details" "$LINENO" 5; } + fi +done + +if test -r "$cache_file"; then + # Some versions of bash will fail to source /dev/null (special files + # actually), so we avoid doing that. DJGPP emulates it as a regular file. + if test /dev/null != "$cache_file" && test -f "$cache_file"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: loading cache $cache_file" >&5 +$as_echo "$as_me: loading cache $cache_file" >&6;} + case $cache_file in + [\\/]* | ?:[\\/]* ) . "$cache_file";; + *) . "./$cache_file";; + esac + fi +else + { $as_echo "$as_me:${as_lineno-$LINENO}: creating cache $cache_file" >&5 +$as_echo "$as_me: creating cache $cache_file" >&6;} + >$cache_file +fi + +# Check that the precious variables saved in the cache have kept the same +# value. +ac_cache_corrupted=false +for ac_var in $ac_precious_vars; do + eval ac_old_set=\$ac_cv_env_${ac_var}_set + eval ac_new_set=\$ac_env_${ac_var}_set + eval ac_old_val=\$ac_cv_env_${ac_var}_value + eval ac_new_val=\$ac_env_${ac_var}_value + case $ac_old_set,$ac_new_set in + set,) + { $as_echo "$as_me:${as_lineno-$LINENO}: error: \`$ac_var' was set to \`$ac_old_val' in the previous run" >&5 +$as_echo "$as_me: error: \`$ac_var' was set to \`$ac_old_val' in the previous run" >&2;} + ac_cache_corrupted=: ;; + ,set) + { $as_echo "$as_me:${as_lineno-$LINENO}: error: \`$ac_var' was not set in the previous run" >&5 +$as_echo "$as_me: error: \`$ac_var' was not set in the previous run" >&2;} + ac_cache_corrupted=: ;; + ,);; + *) + if test "x$ac_old_val" != "x$ac_new_val"; then + # differences in whitespace do not lead to failure. 
+ ac_old_val_w=`echo x $ac_old_val` + ac_new_val_w=`echo x $ac_new_val` + if test "$ac_old_val_w" != "$ac_new_val_w"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: error: \`$ac_var' has changed since the previous run:" >&5 +$as_echo "$as_me: error: \`$ac_var' has changed since the previous run:" >&2;} + ac_cache_corrupted=: + else + { $as_echo "$as_me:${as_lineno-$LINENO}: warning: ignoring whitespace changes in \`$ac_var' since the previous run:" >&5 +$as_echo "$as_me: warning: ignoring whitespace changes in \`$ac_var' since the previous run:" >&2;} + eval $ac_var=\$ac_old_val + fi + { $as_echo "$as_me:${as_lineno-$LINENO}: former value: \`$ac_old_val'" >&5 +$as_echo "$as_me: former value: \`$ac_old_val'" >&2;} + { $as_echo "$as_me:${as_lineno-$LINENO}: current value: \`$ac_new_val'" >&5 +$as_echo "$as_me: current value: \`$ac_new_val'" >&2;} + fi;; + esac + # Pass precious variables to config.status. + if test "$ac_new_set" = set; then + case $ac_new_val in + *\'*) ac_arg=$ac_var=`$as_echo "$ac_new_val" | sed "s/'/'\\\\\\\\''/g"` ;; + *) ac_arg=$ac_var=$ac_new_val ;; + esac + case " $ac_configure_args " in + *" '$ac_arg' "*) ;; # Avoid dups. Use of quotes ensures accuracy. + *) as_fn_append ac_configure_args " '$ac_arg'" ;; + esac + fi +done +if $ac_cache_corrupted; then + { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} + { $as_echo "$as_me:${as_lineno-$LINENO}: error: changes in the environment can compromise the build" >&5 +$as_echo "$as_me: error: changes in the environment can compromise the build" >&2;} + as_fn_error $? "run \`make distclean' and/or \`rm $cache_file' and start over" "$LINENO" 5 +fi +## -------------------- ## +## Main body of script. ## +## -------------------- ## + +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + +am__api_version='1.15' + +ac_aux_dir= +for ac_dir in "$srcdir" "$srcdir/.." "$srcdir/../.."; do + if test -f "$ac_dir/install-sh"; then + ac_aux_dir=$ac_dir + ac_install_sh="$ac_aux_dir/install-sh -c" + break + elif test -f "$ac_dir/install.sh"; then + ac_aux_dir=$ac_dir + ac_install_sh="$ac_aux_dir/install.sh -c" + break + elif test -f "$ac_dir/shtool"; then + ac_aux_dir=$ac_dir + ac_install_sh="$ac_aux_dir/shtool install -c" + break + fi +done +if test -z "$ac_aux_dir"; then + as_fn_error $? "cannot find install-sh, install.sh, or shtool in \"$srcdir\" \"$srcdir/..\" \"$srcdir/../..\"" "$LINENO" 5 +fi + +# These three variables are undocumented and unsupported, +# and are intended to be withdrawn in a future Autoconf release. +# They can cause serious problems if a builder's source tree is in a directory +# whose full name contains unusual characters. +ac_config_guess="$SHELL $ac_aux_dir/config.guess" # Please don't use this var. +ac_config_sub="$SHELL $ac_aux_dir/config.sub" # Please don't use this var. +ac_configure="$SHELL $ac_aux_dir/configure" # Please don't use this var. + + +# Find a good install program. We prefer a C program (faster), +# so one script is as good as another. 
But avoid the broken or +# incompatible versions: +# SysV /etc/install, /usr/sbin/install +# SunOS /usr/etc/install +# IRIX /sbin/install +# AIX /bin/install +# AmigaOS /C/install, which installs bootblocks on floppy discs +# AIX 4 /usr/bin/installbsd, which doesn't work without a -g flag +# AFS /usr/afsws/bin/install, which mishandles nonexistent args +# SVR4 /usr/ucb/install, which tries to use the nonexistent group "staff" +# OS/2's system install, which has a completely different semantic +# ./install, which can be erroneously created by make from ./install.sh. +# Reject install programs that cannot install multiple files. +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for a BSD-compatible install" >&5 +$as_echo_n "checking for a BSD-compatible install... " >&6; } +if test -z "$INSTALL"; then +if ${ac_cv_path_install+:} false; then : + $as_echo_n "(cached) " >&6 +else + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + # Account for people who put trailing slashes in PATH elements. +case $as_dir/ in #(( + ./ | .// | /[cC]/* | \ + /etc/* | /usr/sbin/* | /usr/etc/* | /sbin/* | /usr/afsws/bin/* | \ + ?:[\\/]os2[\\/]install[\\/]* | ?:[\\/]OS2[\\/]INSTALL[\\/]* | \ + /usr/ucb/* ) ;; + *) + # OSF1 and SCO ODT 3.0 have their own names for install. + # Don't use installbsd from OSF since it installs stuff as root + # by default. + for ac_prog in ginstall scoinst install; do + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_prog$ac_exec_ext"; then + if test $ac_prog = install && + grep dspmsg "$as_dir/$ac_prog$ac_exec_ext" >/dev/null 2>&1; then + # AIX install. It has an incompatible calling convention. + : + elif test $ac_prog = install && + grep pwplus "$as_dir/$ac_prog$ac_exec_ext" >/dev/null 2>&1; then + # program-specific install script used by HP pwplus--don't use. + : + else + rm -rf conftest.one conftest.two conftest.dir + echo one > conftest.one + echo two > conftest.two + mkdir conftest.dir + if "$as_dir/$ac_prog$ac_exec_ext" -c conftest.one conftest.two "`pwd`/conftest.dir" && + test -s conftest.one && test -s conftest.two && + test -s conftest.dir/conftest.one && + test -s conftest.dir/conftest.two + then + ac_cv_path_install="$as_dir/$ac_prog$ac_exec_ext -c" + break 3 + fi + fi + fi + done + done + ;; +esac + + done +IFS=$as_save_IFS + +rm -rf conftest.one conftest.two conftest.dir + +fi + if test "${ac_cv_path_install+set}" = set; then + INSTALL=$ac_cv_path_install + else + # As a last resort, use the slow shell script. Don't cache a + # value for INSTALL within a source directory, because that will + # break other packages using the cache if that directory is + # removed, or if the value is a relative name. + INSTALL=$ac_install_sh + fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $INSTALL" >&5 +$as_echo "$INSTALL" >&6; } + +# Use test -z because SunOS4 sh mishandles braces in ${var-val}. +# It thinks the first close brace ends the variable substitution. +test -z "$INSTALL_PROGRAM" && INSTALL_PROGRAM='${INSTALL}' + +test -z "$INSTALL_SCRIPT" && INSTALL_SCRIPT='${INSTALL}' + +test -z "$INSTALL_DATA" && INSTALL_DATA='${INSTALL} -m 644' + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether build environment is sane" >&5 +$as_echo_n "checking whether build environment is sane... " >&6; } +# Reject unsafe characters in $srcdir or the absolute working directory +# name. Accept space and tab only in the latter. 
+am_lf=' +' +case `pwd` in + *[\\\"\#\$\&\'\`$am_lf]*) + as_fn_error $? "unsafe absolute working directory name" "$LINENO" 5;; +esac +case $srcdir in + *[\\\"\#\$\&\'\`$am_lf\ \ ]*) + as_fn_error $? "unsafe srcdir value: '$srcdir'" "$LINENO" 5;; +esac + +# Do 'set' in a subshell so we don't clobber the current shell's +# arguments. Must try -L first in case configure is actually a +# symlink; some systems play weird games with the mod time of symlinks +# (eg FreeBSD returns the mod time of the symlink's containing +# directory). +if ( + am_has_slept=no + for am_try in 1 2; do + echo "timestamp, slept: $am_has_slept" > conftest.file + set X `ls -Lt "$srcdir/configure" conftest.file 2> /dev/null` + if test "$*" = "X"; then + # -L didn't work. + set X `ls -t "$srcdir/configure" conftest.file` + fi + if test "$*" != "X $srcdir/configure conftest.file" \ + && test "$*" != "X conftest.file $srcdir/configure"; then + + # If neither matched, then we have a broken ls. This can happen + # if, for instance, CONFIG_SHELL is bash and it inherits a + # broken ls alias from the environment. This has actually + # happened. Such a system could not be considered "sane". + as_fn_error $? "ls -t appears to fail. Make sure there is not a broken + alias in your environment" "$LINENO" 5 + fi + if test "$2" = conftest.file || test $am_try -eq 2; then + break + fi + # Just in case. + sleep 1 + am_has_slept=yes + done + test "$2" = conftest.file + ) +then + # Ok. + : +else + as_fn_error $? "newly created file is older than distributed files! +Check your system clock" "$LINENO" 5 +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } +# If we didn't sleep, we still need to ensure time stamps of config.status and +# generated files are strictly newer. +am_sleep_pid= +if grep 'slept: no' conftest.file >/dev/null 2>&1; then + ( sleep 1 ) & + am_sleep_pid=$! +fi + +rm -f conftest.file + +test "$program_prefix" != NONE && + program_transform_name="s&^&$program_prefix&;$program_transform_name" +# Use a double $ so make ignores it. +test "$program_suffix" != NONE && + program_transform_name="s&\$&$program_suffix&;$program_transform_name" +# Double any \ or $. +# By default was `s,x,x', remove it if useless. +ac_script='s/[\\$]/&&/g;s/;s,x,x,$//' +program_transform_name=`$as_echo "$program_transform_name" | sed "$ac_script"` + +# Expand $ac_aux_dir to an absolute path. +am_aux_dir=`cd "$ac_aux_dir" && pwd` + +if test x"${MISSING+set}" != xset; then + case $am_aux_dir in + *\ * | *\ *) + MISSING="\${SHELL} \"$am_aux_dir/missing\"" ;; + *) + MISSING="\${SHELL} $am_aux_dir/missing" ;; + esac +fi +# Use eval to expand $SHELL +if eval "$MISSING --is-lightweight"; then + am_missing_run="$MISSING " +else + am_missing_run= + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: 'missing' script is too old or missing" >&5 +$as_echo "$as_me: WARNING: 'missing' script is too old or missing" >&2;} +fi + +if test x"${install_sh+set}" != xset; then + case $am_aux_dir in + *\ * | *\ *) + install_sh="\${SHELL} '$am_aux_dir/install-sh'" ;; + *) + install_sh="\${SHELL} $am_aux_dir/install-sh" + esac +fi + +# Installed binaries are usually stripped using 'strip' when the user +# run "make install-strip". However 'strip' might not be the right +# tool to use in cross-compilation environments, therefore Automake +# will honor the 'STRIP' environment variable to overrule this program. 
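+# For illustration only: STRIP can be supplied on the configure command
+# line when cross compiling, e.g.
+#   ./configure --host=arm-linux-gnueabihf STRIP=arm-linux-gnueabihf-strip
+# (the host triplet above is just an example).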
+if test "$cross_compiling" != no; then + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}strip", so it can be a program name with args. +set dummy ${ac_tool_prefix}strip; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_STRIP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$STRIP"; then + ac_cv_prog_STRIP="$STRIP" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_STRIP="${ac_tool_prefix}strip" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +STRIP=$ac_cv_prog_STRIP +if test -n "$STRIP"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $STRIP" >&5 +$as_echo "$STRIP" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_STRIP"; then + ac_ct_STRIP=$STRIP + # Extract the first word of "strip", so it can be a program name with args. +set dummy strip; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_STRIP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_STRIP"; then + ac_cv_prog_ac_ct_STRIP="$ac_ct_STRIP" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_STRIP="strip" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_STRIP=$ac_cv_prog_ac_ct_STRIP +if test -n "$ac_ct_STRIP"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_STRIP" >&5 +$as_echo "$ac_ct_STRIP" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_STRIP" = x; then + STRIP=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + STRIP=$ac_ct_STRIP + fi +else + STRIP="$ac_cv_prog_STRIP" +fi + +fi +INSTALL_STRIP_PROGRAM="\$(install_sh) -c -s" + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for a thread-safe mkdir -p" >&5 +$as_echo_n "checking for a thread-safe mkdir -p... " >&6; } +if test -z "$MKDIR_P"; then + if ${ac_cv_path_mkdir+:} false; then : + $as_echo_n "(cached) " >&6 +else + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH$PATH_SEPARATOR/opt/sfw/bin +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_prog in mkdir gmkdir; do + for ac_exec_ext in '' $ac_executable_extensions; do + as_fn_executable_p "$as_dir/$ac_prog$ac_exec_ext" || continue + case `"$as_dir/$ac_prog$ac_exec_ext" --version 2>&1` in #( + 'mkdir (GNU coreutils) '* | \ + 'mkdir (coreutils) '* | \ + 'mkdir (fileutils) '4.1*) + ac_cv_path_mkdir=$as_dir/$ac_prog$ac_exec_ext + break 3;; + esac + done + done + done +IFS=$as_save_IFS + +fi + + test -d ./--version && rmdir ./--version + if test "${ac_cv_path_mkdir+set}" = set; then + MKDIR_P="$ac_cv_path_mkdir -p" + else + # As a last resort, use the slow shell script. Don't cache a + # value for MKDIR_P within a source directory, because that will + # break other packages using the cache if that directory is + # removed, or if the value is a relative name. + MKDIR_P="$ac_install_sh -d" + fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $MKDIR_P" >&5 +$as_echo "$MKDIR_P" >&6; } + +for ac_prog in gawk mawk nawk awk +do + # Extract the first word of "$ac_prog", so it can be a program name with args. +set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_AWK+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$AWK"; then + ac_cv_prog_AWK="$AWK" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_AWK="$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +AWK=$ac_cv_prog_AWK +if test -n "$AWK"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $AWK" >&5 +$as_echo "$AWK" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$AWK" && break +done + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether ${MAKE-make} sets \$(MAKE)" >&5 +$as_echo_n "checking whether ${MAKE-make} sets \$(MAKE)... " >&6; } +set x ${MAKE-make} +ac_make=`$as_echo "$2" | sed 's/+/p/g; s/[^a-zA-Z0-9_]/_/g'` +if eval \${ac_cv_prog_make_${ac_make}_set+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat >conftest.make <<\_ACEOF +SHELL = /bin/sh +all: + @echo '@@@%%%=$(MAKE)=@@@%%%' +_ACEOF +# GNU make sometimes prints "make[1]: Entering ...", which would confuse us. +case `${MAKE-make} -f conftest.make 2>/dev/null` in + *@@@%%%=?*=@@@%%%*) + eval ac_cv_prog_make_${ac_make}_set=yes;; + *) + eval ac_cv_prog_make_${ac_make}_set=no;; +esac +rm -f conftest.make +fi +if eval test \$ac_cv_prog_make_${ac_make}_set = yes; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } + SET_MAKE= +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + SET_MAKE="MAKE=${MAKE-make}" +fi + +rm -rf .tst 2>/dev/null +mkdir .tst 2>/dev/null +if test -d .tst; then + am__leading_dot=. +else + am__leading_dot=_ +fi +rmdir .tst 2>/dev/null + +# Check whether --enable-silent-rules was given. 
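+# (For example, when silent rules are enabled the generated Makefiles still
+# accept "make V=1" to force verbose output and "make V=0" to silence it,
+# via the AM_V / AM_DEFAULT_VERBOSITY variables set up below.)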
+if test "${enable_silent_rules+set}" = set; then : + enableval=$enable_silent_rules; +fi + +case $enable_silent_rules in # ((( + yes) AM_DEFAULT_VERBOSITY=0;; + no) AM_DEFAULT_VERBOSITY=1;; + *) AM_DEFAULT_VERBOSITY=1;; +esac +am_make=${MAKE-make} +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $am_make supports nested variables" >&5 +$as_echo_n "checking whether $am_make supports nested variables... " >&6; } +if ${am_cv_make_support_nested_variables+:} false; then : + $as_echo_n "(cached) " >&6 +else + if $as_echo 'TRUE=$(BAR$(V)) +BAR0=false +BAR1=true +V=1 +am__doit: + @$(TRUE) +.PHONY: am__doit' | $am_make -f - >/dev/null 2>&1; then + am_cv_make_support_nested_variables=yes +else + am_cv_make_support_nested_variables=no +fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $am_cv_make_support_nested_variables" >&5 +$as_echo "$am_cv_make_support_nested_variables" >&6; } +if test $am_cv_make_support_nested_variables = yes; then + AM_V='$(V)' + AM_DEFAULT_V='$(AM_DEFAULT_VERBOSITY)' +else + AM_V=$AM_DEFAULT_VERBOSITY + AM_DEFAULT_V=$AM_DEFAULT_VERBOSITY +fi +AM_BACKSLASH='\' + +if test "`cd $srcdir && pwd`" != "`pwd`"; then + # Use -I$(srcdir) only when $(srcdir) != ., so that make's output + # is not polluted with repeated "-I." + am__isrc=' -I$(srcdir)' + # test to see if srcdir already configured + if test -f $srcdir/config.status; then + as_fn_error $? "source directory already configured; run \"make distclean\" there first" "$LINENO" 5 + fi +fi + +# test whether we have cygpath +if test -z "$CYGPATH_W"; then + if (cygpath --version) >/dev/null 2>/dev/null; then + CYGPATH_W='cygpath -w' + else + CYGPATH_W=echo + fi +fi + + +# Define the identity of the package. + PACKAGE='liblognorm' + VERSION='2.0.5' + + +cat >>confdefs.h <<_ACEOF +#define PACKAGE "$PACKAGE" +_ACEOF + + +cat >>confdefs.h <<_ACEOF +#define VERSION "$VERSION" +_ACEOF + +# Some tools Automake needs. + +ACLOCAL=${ACLOCAL-"${am_missing_run}aclocal-${am__api_version}"} + + +AUTOCONF=${AUTOCONF-"${am_missing_run}autoconf"} + + +AUTOMAKE=${AUTOMAKE-"${am_missing_run}automake-${am__api_version}"} + + +AUTOHEADER=${AUTOHEADER-"${am_missing_run}autoheader"} + + +MAKEINFO=${MAKEINFO-"${am_missing_run}makeinfo"} + +# For better backward compatibility. To be removed once Automake 1.9.x +# dies out for good. For more background, see: +# +# +mkdir_p='$(MKDIR_P)' + +# We need awk for the "check" target (and possibly the TAP driver). The +# system "awk" is bad on some platforms. +# Always define AMTAR for backward compatibility. Yes, it's still used +# in the wild :-( We should find a proper way to deprecate it ... +AMTAR='$${TAR-tar}' + + +# We'll loop over all known methods to create a tar archive until one works. +_am_tools='gnutar pax cpio none' + +am__tar='$${TAR-tar} chof - "$$tardir"' am__untar='$${TAR-tar} xf -' + + + + + + +# POSIX will say in a future version that running "rm -f" with no argument +# is OK; and we want to be able to make that assumption in our Makefile +# recipes. So use an aggressive probe to check that the usage we want is +# actually supported "in the wild" to an acceptable degree. +# See automake bug#10828. +# To make any issue more visible, cause the running configure to be aborted +# by default if the 'rm' program in use doesn't match our expectations; the +# user can still override this though. +if rm -f && rm -fr && rm -rf; then : OK; else + cat >&2 <<'END' +Oops! 
+ +Your 'rm' program seems unable to run without file operands specified +on the command line, even when the '-f' option is present. This is contrary +to the behaviour of most rm programs out there, and not conforming with +the upcoming POSIX standard: + +Please tell bug-automake@gnu.org about your system, including the value +of your $PATH and any error possibly output before this message. This +can help us improve future automake versions. + +END + if test x"$ACCEPT_INFERIOR_RM_PROGRAM" = x"yes"; then + echo 'Configuration will proceed anyway, since you have set the' >&2 + echo 'ACCEPT_INFERIOR_RM_PROGRAM variable to "yes"' >&2 + echo >&2 + else + cat >&2 <<'END' +Aborting the configuration process, to ensure you take notice of the issue. + +You can download and install GNU coreutils to get an 'rm' implementation +that behaves properly: . + +If you want to complete the configuration process using your problematic +'rm' anyway, export the environment variable ACCEPT_INFERIOR_RM_PROGRAM +to "yes", and re-run configure. + +END + as_fn_error $? "Your 'rm' program is bad, sorry." "$LINENO" 5 + fi +fi + +# Check whether --enable-silent-rules was given. +if test "${enable_silent_rules+set}" = set; then : + enableval=$enable_silent_rules; +fi + +case $enable_silent_rules in # ((( + yes) AM_DEFAULT_VERBOSITY=0;; + no) AM_DEFAULT_VERBOSITY=1;; + *) AM_DEFAULT_VERBOSITY=0;; +esac +am_make=${MAKE-make} +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $am_make supports nested variables" >&5 +$as_echo_n "checking whether $am_make supports nested variables... " >&6; } +if ${am_cv_make_support_nested_variables+:} false; then : + $as_echo_n "(cached) " >&6 +else + if $as_echo 'TRUE=$(BAR$(V)) +BAR0=false +BAR1=true +V=1 +am__doit: + @$(TRUE) +.PHONY: am__doit' | $am_make -f - >/dev/null 2>&1; then + am_cv_make_support_nested_variables=yes +else + am_cv_make_support_nested_variables=no +fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $am_cv_make_support_nested_variables" >&5 +$as_echo "$am_cv_make_support_nested_variables" >&6; } +if test $am_cv_make_support_nested_variables = yes; then + AM_V='$(V)' + AM_DEFAULT_V='$(AM_DEFAULT_VERBOSITY)' +else + AM_V=$AM_DEFAULT_VERBOSITY + AM_DEFAULT_V=$AM_DEFAULT_VERBOSITY +fi +AM_BACKSLASH='\' + + +ac_config_headers="$ac_config_headers config.h" + + +DEPDIR="${am__leading_dot}deps" + +ac_config_commands="$ac_config_commands depfiles" + + +am_make=${MAKE-make} +cat > confinc << 'END' +am__doit: + @echo this is the am__doit target +.PHONY: am__doit +END +# If we don't find an include directive, just comment out the code. +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for style of include used by $am_make" >&5 +$as_echo_n "checking for style of include used by $am_make... " >&6; } +am__include="#" +am__quote= +_am_result=none +# First try GNU make style include. +echo "include confinc" > confmf +# Ignore all kinds of additional output from 'make'. +case `$am_make -s -f confmf 2> /dev/null` in #( +*the\ am__doit\ target*) + am__include=include + am__quote= + _am_result=GNU + ;; +esac +# Now try BSD make style include. +if test "$am__include" = "#"; then + echo '.include "confinc"' > confmf + case `$am_make -s -f confmf 2> /dev/null` in #( + *the\ am__doit\ target*) + am__include=.include + am__quote="\"" + _am_result=BSD + ;; + esac +fi + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $_am_result" >&5 +$as_echo "$_am_result" >&6; } +rm -f confinc confmf + +# Check whether --enable-dependency-tracking was given. 
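+# (For example, ./configure --disable-dependency-tracking skips the depcomp
+# machinery to speed up a one-time build, while --enable-dependency-tracking
+# keeps it even when only a slow dependency extractor is available.)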
+if test "${enable_dependency_tracking+set}" = set; then : + enableval=$enable_dependency_tracking; +fi + +if test "x$enable_dependency_tracking" != xno; then + am_depcomp="$ac_aux_dir/depcomp" + AMDEPBACKSLASH='\' + am__nodep='_no' +fi + if test "x$enable_dependency_tracking" != xno; then + AMDEP_TRUE= + AMDEP_FALSE='#' +else + AMDEP_TRUE='#' + AMDEP_FALSE= +fi + + +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu +if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}gcc", so it can be a program name with args. +set dummy ${ac_tool_prefix}gcc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_CC="${ac_tool_prefix}gcc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_CC"; then + ac_ct_CC=$CC + # Extract the first word of "gcc", so it can be a program name with args. +set dummy gcc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_CC"; then + ac_cv_prog_ac_ct_CC="$ac_ct_CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_CC="gcc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_CC=$ac_cv_prog_ac_ct_CC +if test -n "$ac_ct_CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_CC" >&5 +$as_echo "$ac_ct_CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_CC" = x; then + CC="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + CC=$ac_ct_CC + fi +else + CC="$ac_cv_prog_CC" +fi + +if test -z "$CC"; then + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}cc", so it can be a program name with args. +set dummy ${ac_tool_prefix}cc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... 
" >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_CC="${ac_tool_prefix}cc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + fi +fi +if test -z "$CC"; then + # Extract the first word of "cc", so it can be a program name with args. +set dummy cc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else + ac_prog_rejected=no +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + if test "$as_dir/$ac_word$ac_exec_ext" = "/usr/ucb/cc"; then + ac_prog_rejected=yes + continue + fi + ac_cv_prog_CC="cc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +if test $ac_prog_rejected = yes; then + # We found a bogon in the path, so make sure we never use it. + set dummy $ac_cv_prog_CC + shift + if test $# != 0; then + # We chose a different compiler from the bogus one. + # However, it has the same basename, so the bogon will be chosen + # first if we set CC to just the basename; use the full file name. + shift + ac_cv_prog_CC="$as_dir/$ac_word${1+' '}$@" + fi +fi +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$CC"; then + if test -n "$ac_tool_prefix"; then + for ac_prog in cl.exe + do + # Extract the first word of "$ac_tool_prefix$ac_prog", so it can be a program name with args. +set dummy $ac_tool_prefix$ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_CC="$ac_tool_prefix$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$CC" && break + done +fi +if test -z "$CC"; then + ac_ct_CC=$CC + for ac_prog in cl.exe +do + # Extract the first word of "$ac_prog", so it can be a program name with args. +set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_CC"; then + ac_cv_prog_ac_ct_CC="$ac_ct_CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_CC="$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_CC=$ac_cv_prog_ac_ct_CC +if test -n "$ac_ct_CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_CC" >&5 +$as_echo "$ac_ct_CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$ac_ct_CC" && break +done + + if test "x$ac_ct_CC" = x; then + CC="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + CC=$ac_ct_CC + fi +fi + +fi + + +test -z "$CC" && { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "no acceptable C compiler found in \$PATH +See \`config.log' for more details" "$LINENO" 5; } + +# Provide some information about the compiler. +$as_echo "$as_me:${as_lineno-$LINENO}: checking for C compiler version" >&5 +set X $ac_compile +ac_compiler=$2 +for ac_option in --version -v -V -qversion; do + { { ac_try="$ac_compiler $ac_option >&5" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_compiler $ac_option >&5") 2>conftest.err + ac_status=$? + if test -s conftest.err; then + sed '10a\ +... rest of stderr output deleted ... + 10q' conftest.err >conftest.er1 + cat conftest.er1 >&5 + fi + rm -f conftest.er1 conftest.err + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } +done + +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +ac_clean_files_save=$ac_clean_files +ac_clean_files="$ac_clean_files a.out a.out.dSYM a.exe b.out" +# Try to create an executable without -o first, disregard a.out. +# It will help us diagnose broken compilers, and finding out an intuition +# of exeext. 
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether the C compiler works" >&5 +$as_echo_n "checking whether the C compiler works... " >&6; } +ac_link_default=`$as_echo "$ac_link" | sed 's/ -o *conftest[^ ]*//'` + +# The possible output files: +ac_files="a.out conftest.exe conftest a.exe a_out.exe b.out conftest.*" + +ac_rmfiles= +for ac_file in $ac_files +do + case $ac_file in + *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM | *.o | *.obj ) ;; + * ) ac_rmfiles="$ac_rmfiles $ac_file";; + esac +done +rm -f $ac_rmfiles + +if { { ac_try="$ac_link_default" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_link_default") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then : + # Autoconf-2.13 could set the ac_cv_exeext variable to `no'. +# So ignore a value of `no', otherwise this would lead to `EXEEXT = no' +# in a Makefile. We should not override ac_cv_exeext if it was cached, +# so that the user can short-circuit this test for compilers unknown to +# Autoconf. +for ac_file in $ac_files '' +do + test -f "$ac_file" || continue + case $ac_file in + *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM | *.o | *.obj ) + ;; + [ab].out ) + # We found the default executable, but exeext='' is most + # certainly right. + break;; + *.* ) + if test "${ac_cv_exeext+set}" = set && test "$ac_cv_exeext" != no; + then :; else + ac_cv_exeext=`expr "$ac_file" : '[^.]*\(\..*\)'` + fi + # We set ac_cv_exeext here because the later test for it is not + # safe: cross compilers may not add the suffix if given an `-o' + # argument, so we may need to know it at that point already. + # Even if this section looks crufty: it has the advantage of + # actually working. + break;; + * ) + break;; + esac +done +test "$ac_cv_exeext" = no && ac_cv_exeext= + +else + ac_file='' +fi +if test -z "$ac_file"; then : + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +$as_echo "$as_me: failed program was:" >&5 +sed 's/^/| /' conftest.$ac_ext >&5 + +{ { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error 77 "C compiler cannot create executables +See \`config.log' for more details" "$LINENO" 5; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for C compiler default output file name" >&5 +$as_echo_n "checking for C compiler default output file name... " >&6; } +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_file" >&5 +$as_echo "$ac_file" >&6; } +ac_exeext=$ac_cv_exeext + +rm -f -r a.out a.out.dSYM a.exe conftest$ac_cv_exeext b.out +ac_clean_files=$ac_clean_files_save +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for suffix of executables" >&5 +$as_echo_n "checking for suffix of executables... " >&6; } +if { { ac_try="$ac_link" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_link") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5
+ test $ac_status = 0; }; then :
+ # If both `conftest.exe' and `conftest' are `present' (well, observable)
+# catch `conftest.exe'. For instance with Cygwin, `ls conftest' will
+# work properly (i.e., refer to `conftest.exe'), while it won't with
+# `rm'.
+for ac_file in conftest.exe conftest conftest.*; do
+ test -f "$ac_file" || continue
+ case $ac_file in
+ *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM | *.o | *.obj ) ;;
+ *.* ) ac_cv_exeext=`expr "$ac_file" : '[^.]*\(\..*\)'`
+ break;;
+ * ) break;;
+ esac
+done
+else
+ { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5
+$as_echo "$as_me: error: in \`$ac_pwd':" >&2;}
+as_fn_error $? "cannot compute suffix of executables: cannot compile and link
+See \`config.log' for more details" "$LINENO" 5; }
+fi
+rm -f conftest conftest$ac_cv_exeext
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_exeext" >&5
+$as_echo "$ac_cv_exeext" >&6; }
+
+rm -f conftest.$ac_ext
+EXEEXT=$ac_cv_exeext
+ac_exeext=$EXEEXT
+cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h. */
+#include <stdio.h>
+int
+main ()
+{
+FILE *f = fopen ("conftest.out", "w");
+ return ferror (f) || fclose (f) != 0;
+
+ ;
+ return 0;
+}
+_ACEOF
+ac_clean_files="$ac_clean_files conftest.out"
+# Check that the compiler produces executables we can run. If not, either
+# the compiler is broken, or we cross compile.
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether we are cross compiling" >&5
+$as_echo_n "checking whether we are cross compiling... " >&6; }
+if test "$cross_compiling" != yes; then
+ { { ac_try="$ac_link"
+case "(($ac_try" in
+ *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;;
+ *) ac_try_echo=$ac_try;;
+esac
+eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\""
+$as_echo "$ac_try_echo"; } >&5
+ (eval "$ac_link") 2>&5
+ ac_status=$?
+ $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5
+ test $ac_status = 0; }
+ if { ac_try='./conftest$ac_cv_exeext'
+ { { case "(($ac_try" in
+ *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;;
+ *) ac_try_echo=$ac_try;;
+esac
+eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\""
+$as_echo "$ac_try_echo"; } >&5
+ (eval "$ac_try") 2>&5
+ ac_status=$?
+ $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5
+ test $ac_status = 0; }; }; then
+ cross_compiling=no
+ else
+ if test "$cross_compiling" = maybe; then
+ cross_compiling=yes
+ else
+ { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5
+$as_echo "$as_me: error: in \`$ac_pwd':" >&2;}
+as_fn_error $? "cannot run C compiled programs.
+If you meant to cross compile, use \`--host'.
+See \`config.log' for more details" "$LINENO" 5; }
+ fi
+ fi
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $cross_compiling" >&5
+$as_echo "$cross_compiling" >&6; }
+
+rm -f conftest.$ac_ext conftest$ac_cv_exeext conftest.out
+ac_clean_files=$ac_clean_files_save
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for suffix of object files" >&5
+$as_echo_n "checking for suffix of object files... " >&6; }
+if ${ac_cv_objext+:} false; then :
+ $as_echo_n "(cached) " >&6
+else
+ cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h.
*/ + +int +main () +{ + + ; + return 0; +} +_ACEOF +rm -f conftest.o conftest.obj +if { { ac_try="$ac_compile" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_compile") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then : + for ac_file in conftest.o conftest.obj conftest.*; do + test -f "$ac_file" || continue; + case $ac_file in + *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM ) ;; + *) ac_cv_objext=`expr "$ac_file" : '.*\.\(.*\)'` + break;; + esac +done +else + $as_echo "$as_me: failed program was:" >&5 +sed 's/^/| /' conftest.$ac_ext >&5 + +{ { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "cannot compute suffix of object files: cannot compile +See \`config.log' for more details" "$LINENO" 5; } +fi +rm -f conftest.$ac_cv_objext conftest.$ac_ext +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_objext" >&5 +$as_echo "$ac_cv_objext" >&6; } +OBJEXT=$ac_cv_objext +ac_objext=$OBJEXT +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether we are using the GNU C compiler" >&5 +$as_echo_n "checking whether we are using the GNU C compiler... " >&6; } +if ${ac_cv_c_compiler_gnu+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ +#ifndef __GNUC__ + choke me +#endif + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_compiler_gnu=yes +else + ac_compiler_gnu=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +ac_cv_c_compiler_gnu=$ac_compiler_gnu + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_compiler_gnu" >&5 +$as_echo "$ac_cv_c_compiler_gnu" >&6; } +if test $ac_compiler_gnu = yes; then + GCC=yes +else + GCC= +fi +ac_test_CFLAGS=${CFLAGS+set} +ac_save_CFLAGS=$CFLAGS +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC accepts -g" >&5 +$as_echo_n "checking whether $CC accepts -g... " >&6; } +if ${ac_cv_prog_cc_g+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_save_c_werror_flag=$ac_c_werror_flag + ac_c_werror_flag=yes + ac_cv_prog_cc_g=no + CFLAGS="-g" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_prog_cc_g=yes +else + CFLAGS="" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + +else + ac_c_werror_flag=$ac_save_c_werror_flag + CFLAGS="-g" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/
+
+int
+main ()
+{
+
+ ;
+ return 0;
+}
+_ACEOF
+if ac_fn_c_try_compile "$LINENO"; then :
+ ac_cv_prog_cc_g=yes
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+ ac_c_werror_flag=$ac_save_c_werror_flag
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cc_g" >&5
+$as_echo "$ac_cv_prog_cc_g" >&6; }
+if test "$ac_test_CFLAGS" = set; then
+ CFLAGS=$ac_save_CFLAGS
+elif test $ac_cv_prog_cc_g = yes; then
+ if test "$GCC" = yes; then
+ CFLAGS="-g -O2"
+ else
+ CFLAGS="-g"
+ fi
+else
+ if test "$GCC" = yes; then
+ CFLAGS="-O2"
+ else
+ CFLAGS=
+ fi
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $CC option to accept ISO C89" >&5
+$as_echo_n "checking for $CC option to accept ISO C89... " >&6; }
+if ${ac_cv_prog_cc_c89+:} false; then :
+ $as_echo_n "(cached) " >&6
+else
+ ac_cv_prog_cc_c89=no
+ac_save_CC=$CC
+cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h. */
+#include <stdarg.h>
+#include <stdio.h>
+struct stat;
+/* Most of the following tests are stolen from RCS 5.7's src/conf.sh. */
+struct buf { int x; };
+FILE * (*rcsopen) (struct buf *, struct stat *, int);
+static char *e (p, i)
+ char **p;
+ int i;
+{
+ return p[i];
+}
+static char *f (char * (*g) (char **, int), char **p, ...)
+{
+ char *s;
+ va_list v;
+ va_start (v,p);
+ s = g (p, va_arg (v,int));
+ va_end (v);
+ return s;
+}
+
+/* OSF 4.0 Compaq cc is some sort of almost-ANSI by default. It has
+ function prototypes and stuff, but not '\xHH' hex character constants.
+ These don't provoke an error unfortunately, instead are silently treated
+ as 'x'. The following induces an error, until -std is added to get
+ proper ANSI mode. Curiously '\x00'!='x' always comes out true, for an
+ array size at least. It's necessary to write '\x00'==0 to get something
+ that's true only with -std. */
+int osf4_cc_array ['\x00' == 0 ? 1 : -1];
+
+/* IBM C 6 for AIX is almost-ANSI by default, but it replaces macro parameters
+ inside strings and character constants. */
+#define FOO(x) 'x'
+int xlc6_cc_array[FOO(a) == 'x' ?
1 : -1]; + +int test (int i, double x); +struct s1 {int (*f) (int a);}; +struct s2 {int (*f) (double a);}; +int pairnames (int, char **, FILE *(*)(struct buf *, struct stat *, int), int, int); +int argc; +char **argv; +int +main () +{ +return f (e, argv, 0) != argv[0] || f (e, argv, 1) != argv[1]; + ; + return 0; +} +_ACEOF +for ac_arg in '' -qlanglvl=extc89 -qlanglvl=ansi -std \ + -Ae "-Aa -D_HPUX_SOURCE" "-Xc -D__EXTENSIONS__" +do + CC="$ac_save_CC $ac_arg" + if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_prog_cc_c89=$ac_arg +fi +rm -f core conftest.err conftest.$ac_objext + test "x$ac_cv_prog_cc_c89" != "xno" && break +done +rm -f conftest.$ac_ext +CC=$ac_save_CC + +fi +# AC_CACHE_VAL +case "x$ac_cv_prog_cc_c89" in + x) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: none needed" >&5 +$as_echo "none needed" >&6; } ;; + xno) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: unsupported" >&5 +$as_echo "unsupported" >&6; } ;; + *) + CC="$CC $ac_cv_prog_cc_c89" + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cc_c89" >&5 +$as_echo "$ac_cv_prog_cc_c89" >&6; } ;; +esac +if test "x$ac_cv_prog_cc_c89" != xno; then : + +fi + +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC understands -c and -o together" >&5 +$as_echo_n "checking whether $CC understands -c and -o together... " >&6; } +if ${am_cv_prog_cc_c_o+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF + # Make sure it works both with $CC and with simple cc. + # Following AC_PROG_CC_C_O, we do the test twice because some + # compilers refuse to overwrite an existing .o file with -o, + # though they will create one. + am_cv_prog_cc_c_o=yes + for am_i in 1 2; do + if { echo "$as_me:$LINENO: $CC -c conftest.$ac_ext -o conftest2.$ac_objext" >&5 + ($CC -c conftest.$ac_ext -o conftest2.$ac_objext) >&5 2>&5 + ac_status=$? + echo "$as_me:$LINENO: \$? = $ac_status" >&5 + (exit $ac_status); } \ + && test -f conftest2.$ac_objext; then + : OK + else + am_cv_prog_cc_c_o=no + break + fi + done + rm -f core conftest* + unset am_i +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $am_cv_prog_cc_c_o" >&5 +$as_echo "$am_cv_prog_cc_c_o" >&6; } +if test "$am_cv_prog_cc_c_o" != yes; then + # Losing compiler, so override with the script. + # FIXME: It is wrong to rewrite CC. + # But if we don't then we get into trouble of one sort or another. + # A longer-term fix would be to have automake use am__CC in this case, + # and then we could set am__CC="\$(top_srcdir)/compile \$(CC)" + CC="$am_aux_dir/compile $CC" +fi +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + +depcc="$CC" am_compiler_list= + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking dependency style of $depcc" >&5 +$as_echo_n "checking dependency style of $depcc... 
" >&6; } +if ${am_cv_CC_dependencies_compiler_type+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -z "$AMDEP_TRUE" && test -f "$am_depcomp"; then + # We make a subdir and do the tests there. Otherwise we can end up + # making bogus files that we don't know about and never remove. For + # instance it was reported that on HP-UX the gcc test will end up + # making a dummy file named 'D' -- because '-MD' means "put the output + # in D". + rm -rf conftest.dir + mkdir conftest.dir + # Copy depcomp to subdir because otherwise we won't find it if we're + # using a relative directory. + cp "$am_depcomp" conftest.dir + cd conftest.dir + # We will build objects and dependencies in a subdirectory because + # it helps to detect inapplicable dependency modes. For instance + # both Tru64's cc and ICC support -MD to output dependencies as a + # side effect of compilation, but ICC will put the dependencies in + # the current directory while Tru64 will put them in the object + # directory. + mkdir sub + + am_cv_CC_dependencies_compiler_type=none + if test "$am_compiler_list" = ""; then + am_compiler_list=`sed -n 's/^#*\([a-zA-Z0-9]*\))$/\1/p' < ./depcomp` + fi + am__universal=false + case " $depcc " in #( + *\ -arch\ *\ -arch\ *) am__universal=true ;; + esac + + for depmode in $am_compiler_list; do + # Setup a source with many dependencies, because some compilers + # like to wrap large dependency lists on column 80 (with \), and + # we should not choose a depcomp mode which is confused by this. + # + # We need to recreate these files for each test, as the compiler may + # overwrite some of them when testing with obscure command lines. + # This happens at least with the AIX C compiler. + : > sub/conftest.c + for i in 1 2 3 4 5 6; do + echo '#include "conftst'$i'.h"' >> sub/conftest.c + # Using ": > sub/conftst$i.h" creates only sub/conftst1.h with + # Solaris 10 /bin/sh. + echo '/* dummy */' > sub/conftst$i.h + done + echo "${am__include} ${am__quote}sub/conftest.Po${am__quote}" > confmf + + # We check with '-c' and '-o' for the sake of the "dashmstdout" + # mode. It turns out that the SunPro C++ compiler does not properly + # handle '-M -o', and we need to detect this. Also, some Intel + # versions had trouble with output in subdirs. + am__obj=sub/conftest.${OBJEXT-o} + am__minus_obj="-o $am__obj" + case $depmode in + gcc) + # This depmode causes a compiler race in universal mode. + test "$am__universal" = false || continue + ;; + nosideeffect) + # After this tag, mechanisms are not by side-effect, so they'll + # only be used when explicitly requested. + if test "x$enable_dependency_tracking" = xyes; then + continue + else + break + fi + ;; + msvc7 | msvc7msys | msvisualcpp | msvcmsys) + # This compiler won't grok '-c -o', but also, the minuso test has + # not run yet. These depmodes are late enough in the game, and + # so weak that their functioning should not be impacted. + am__obj=conftest.${OBJEXT-o} + am__minus_obj= + ;; + none) break ;; + esac + if depmode=$depmode \ + source=sub/conftest.c object=$am__obj \ + depfile=sub/conftest.Po tmpdepfile=sub/conftest.TPo \ + $SHELL ./depcomp $depcc -c $am__minus_obj sub/conftest.c \ + >/dev/null 2>conftest.err && + grep sub/conftst1.h sub/conftest.Po > /dev/null 2>&1 && + grep sub/conftst6.h sub/conftest.Po > /dev/null 2>&1 && + grep $am__obj sub/conftest.Po > /dev/null 2>&1 && + ${MAKE-make} -s -f confmf > /dev/null 2>&1; then + # icc doesn't choke on unknown options, it will just issue warnings + # or remarks (even with -Werror). 
So we grep stderr for any message
+ # that says an option was ignored or not supported.
+ # When given -MP, icc 7.0 and 7.1 complain thusly:
+ # icc: Command line warning: ignoring option '-M'; no argument required
+ # The diagnosis changed in icc 8.0:
+ # icc: Command line remark: option '-MP' not supported
+ if (grep 'ignoring option' conftest.err ||
+ grep 'not supported' conftest.err) >/dev/null 2>&1; then :; else
+ am_cv_CC_dependencies_compiler_type=$depmode
+ break
+ fi
+ fi
+ done
+
+ cd ..
+ rm -rf conftest.dir
+else
+ am_cv_CC_dependencies_compiler_type=none
+fi
+
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $am_cv_CC_dependencies_compiler_type" >&5
+$as_echo "$am_cv_CC_dependencies_compiler_type" >&6; }
+CCDEPMODE=depmode=$am_cv_CC_dependencies_compiler_type
+
+ if
+ test "x$enable_dependency_tracking" != xno \
+ && test "$am_cv_CC_dependencies_compiler_type" = gcc3; then
+ am__fastdepCC_TRUE=
+ am__fastdepCC_FALSE='#'
+else
+ am__fastdepCC_TRUE='#'
+ am__fastdepCC_FALSE=
+fi
+
+
+
+ac_ext=c
+ac_cpp='$CPP $CPPFLAGS'
+ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5'
+ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'
+ac_compiler_gnu=$ac_cv_c_compiler_gnu
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking how to run the C preprocessor" >&5
+$as_echo_n "checking how to run the C preprocessor... " >&6; }
+# On Suns, sometimes $CPP names a directory.
+if test -n "$CPP" && test -d "$CPP"; then
+ CPP=
+fi
+if test -z "$CPP"; then
+ if ${ac_cv_prog_CPP+:} false; then :
+ $as_echo_n "(cached) " >&6
+else
+ # Double quotes because CPP needs to be expanded
+ for CPP in "$CC -E" "$CC -E -traditional-cpp" "/lib/cpp"
+ do
+ ac_preproc_ok=false
+for ac_c_preproc_warn_flag in '' yes
+do
+ # Use a header file that comes with gcc, so configuring glibc
+ # with a fresh cross-compiler works.
+ # Prefer <limits.h> to <assert.h> if __STDC__ is defined, since
+ # <limits.h> exists even on freestanding compilers.
+ # On the NeXT, cc -E runs the code through the compiler's parser,
+ # not just through cpp. "Syntax error" is here to catch this case.
+ cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h. */
+#ifdef __STDC__
+# include <limits.h>
+#else
+# include <assert.h>
+#endif
+ Syntax error
+_ACEOF
+if ac_fn_c_try_cpp "$LINENO"; then :
+
+else
+ # Broken: fails on valid input.
+continue
+fi
+rm -f conftest.err conftest.i conftest.$ac_ext
+
+ # OK, works on sane cases. Now check whether nonexistent headers
+ # can be detected and how.
+ cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h. */
+#include <ac_nonexistent.h>
+_ACEOF
+if ac_fn_c_try_cpp "$LINENO"; then :
+ # Broken: success on invalid input.
+continue
+else
+ # Passes both tests.
+ac_preproc_ok=:
+break
+fi
+rm -f conftest.err conftest.i conftest.$ac_ext
+
+done
+# Because of `break', _AC_PREPROC_IFELSE's cleaning code was skipped.
+rm -f conftest.i conftest.err conftest.$ac_ext
+if $ac_preproc_ok; then :
+ break
+fi
+
+ done
+ ac_cv_prog_CPP=$CPP
+
+fi
+ CPP=$ac_cv_prog_CPP
+else
+ ac_cv_prog_CPP=$CPP
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $CPP" >&5
+$as_echo "$CPP" >&6; }
+ac_preproc_ok=false
+for ac_c_preproc_warn_flag in '' yes
+do
+ # Use a header file that comes with gcc, so configuring glibc
+ # with a fresh cross-compiler works.
+ # Prefer <limits.h> to <assert.h> if __STDC__ is defined, since
+ # <limits.h> exists even on freestanding compilers.
+ # On the NeXT, cc -E runs the code through the compiler's parser,
+ # not just through cpp. "Syntax error" is here to catch this case.
+ cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#ifdef __STDC__ +# include +#else +# include +#endif + Syntax error +_ACEOF +if ac_fn_c_try_cpp "$LINENO"; then : + +else + # Broken: fails on valid input. +continue +fi +rm -f conftest.err conftest.i conftest.$ac_ext + + # OK, works on sane cases. Now check whether nonexistent headers + # can be detected and how. + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include +_ACEOF +if ac_fn_c_try_cpp "$LINENO"; then : + # Broken: success on invalid input. +continue +else + # Passes both tests. +ac_preproc_ok=: +break +fi +rm -f conftest.err conftest.i conftest.$ac_ext + +done +# Because of `break', _AC_PREPROC_IFELSE's cleaning code was skipped. +rm -f conftest.i conftest.err conftest.$ac_ext +if $ac_preproc_ok; then : + +else + { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "C preprocessor \"$CPP\" fails sanity check +See \`config.log' for more details" "$LINENO" 5; } +fi + +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for grep that handles long lines and -e" >&5 +$as_echo_n "checking for grep that handles long lines and -e... " >&6; } +if ${ac_cv_path_GREP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -z "$GREP"; then + ac_path_GREP_found=false + # Loop through the user's path and test for each of PROGNAME-LIST + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH$PATH_SEPARATOR/usr/xpg4/bin +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_prog in grep ggrep; do + for ac_exec_ext in '' $ac_executable_extensions; do + ac_path_GREP="$as_dir/$ac_prog$ac_exec_ext" + as_fn_executable_p "$ac_path_GREP" || continue +# Check for GNU ac_path_GREP and select it if it is found. + # Check for GNU $ac_path_GREP +case `"$ac_path_GREP" --version 2>&1` in +*GNU*) + ac_cv_path_GREP="$ac_path_GREP" ac_path_GREP_found=:;; +*) + ac_count=0 + $as_echo_n 0123456789 >"conftest.in" + while : + do + cat "conftest.in" "conftest.in" >"conftest.tmp" + mv "conftest.tmp" "conftest.in" + cp "conftest.in" "conftest.nl" + $as_echo 'GREP' >> "conftest.nl" + "$ac_path_GREP" -e 'GREP$' -e '-(cannot match)-' < "conftest.nl" >"conftest.out" 2>/dev/null || break + diff "conftest.out" "conftest.nl" >/dev/null 2>&1 || break + as_fn_arith $ac_count + 1 && ac_count=$as_val + if test $ac_count -gt ${ac_path_GREP_max-0}; then + # Best one so far, save it but keep looking for a better one + ac_cv_path_GREP="$ac_path_GREP" + ac_path_GREP_max=$ac_count + fi + # 10*(2^10) chars as input seems more than enough + test $ac_count -gt 10 && break + done + rm -f conftest.in conftest.tmp conftest.nl conftest.out;; +esac + + $ac_path_GREP_found && break 3 + done + done + done +IFS=$as_save_IFS + if test -z "$ac_cv_path_GREP"; then + as_fn_error $? "no acceptable grep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin" "$LINENO" 5 + fi +else + ac_cv_path_GREP=$GREP +fi + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_path_GREP" >&5 +$as_echo "$ac_cv_path_GREP" >&6; } + GREP="$ac_cv_path_GREP" + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for egrep" >&5 +$as_echo_n "checking for egrep... 
" >&6; } +if ${ac_cv_path_EGREP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if echo a | $GREP -E '(a|b)' >/dev/null 2>&1 + then ac_cv_path_EGREP="$GREP -E" + else + if test -z "$EGREP"; then + ac_path_EGREP_found=false + # Loop through the user's path and test for each of PROGNAME-LIST + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH$PATH_SEPARATOR/usr/xpg4/bin +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_prog in egrep; do + for ac_exec_ext in '' $ac_executable_extensions; do + ac_path_EGREP="$as_dir/$ac_prog$ac_exec_ext" + as_fn_executable_p "$ac_path_EGREP" || continue +# Check for GNU ac_path_EGREP and select it if it is found. + # Check for GNU $ac_path_EGREP +case `"$ac_path_EGREP" --version 2>&1` in +*GNU*) + ac_cv_path_EGREP="$ac_path_EGREP" ac_path_EGREP_found=:;; +*) + ac_count=0 + $as_echo_n 0123456789 >"conftest.in" + while : + do + cat "conftest.in" "conftest.in" >"conftest.tmp" + mv "conftest.tmp" "conftest.in" + cp "conftest.in" "conftest.nl" + $as_echo 'EGREP' >> "conftest.nl" + "$ac_path_EGREP" 'EGREP$' < "conftest.nl" >"conftest.out" 2>/dev/null || break + diff "conftest.out" "conftest.nl" >/dev/null 2>&1 || break + as_fn_arith $ac_count + 1 && ac_count=$as_val + if test $ac_count -gt ${ac_path_EGREP_max-0}; then + # Best one so far, save it but keep looking for a better one + ac_cv_path_EGREP="$ac_path_EGREP" + ac_path_EGREP_max=$ac_count + fi + # 10*(2^10) chars as input seems more than enough + test $ac_count -gt 10 && break + done + rm -f conftest.in conftest.tmp conftest.nl conftest.out;; +esac + + $ac_path_EGREP_found && break 3 + done + done + done +IFS=$as_save_IFS + if test -z "$ac_cv_path_EGREP"; then + as_fn_error $? "no acceptable egrep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin" "$LINENO" 5 + fi +else + ac_cv_path_EGREP=$EGREP +fi + + fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_path_EGREP" >&5 +$as_echo "$ac_cv_path_EGREP" >&6; } + EGREP="$ac_cv_path_EGREP" + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for ANSI C header files" >&5 +$as_echo_n "checking for ANSI C header files... " >&6; } +if ${ac_cv_header_stdc+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include +#include +#include +#include + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_header_stdc=yes +else + ac_cv_header_stdc=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + +if test $ac_cv_header_stdc = yes; then + # SunOS 4.x string.h does not declare mem*, contrary to ANSI. + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include + +_ACEOF +if (eval "$ac_cpp conftest.$ac_ext") 2>&5 | + $EGREP "memchr" >/dev/null 2>&1; then : + +else + ac_cv_header_stdc=no +fi +rm -f conftest* + +fi + +if test $ac_cv_header_stdc = yes; then + # ISC 2.0.2 stdlib.h does not declare free, contrary to ANSI. + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include + +_ACEOF +if (eval "$ac_cpp conftest.$ac_ext") 2>&5 | + $EGREP "free" >/dev/null 2>&1; then : + +else + ac_cv_header_stdc=no +fi +rm -f conftest* + +fi + +if test $ac_cv_header_stdc = yes; then + # /bin/cc in Irix-4.0.5 gets non-ANSI ctype macros unless using -ansi. + if test "$cross_compiling" = yes; then : + : +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/ +#include +#include +#if ((' ' & 0x0FF) == 0x020) +# define ISLOWER(c) ('a' <= (c) && (c) <= 'z') +# define TOUPPER(c) (ISLOWER(c) ? 'A' + ((c) - 'a') : (c)) +#else +# define ISLOWER(c) \ + (('a' <= (c) && (c) <= 'i') \ + || ('j' <= (c) && (c) <= 'r') \ + || ('s' <= (c) && (c) <= 'z')) +# define TOUPPER(c) (ISLOWER(c) ? ((c) | 0x40) : (c)) +#endif + +#define XOR(e, f) (((e) && !(f)) || (!(e) && (f))) +int +main () +{ + int i; + for (i = 0; i < 256; i++) + if (XOR (islower (i), ISLOWER (i)) + || toupper (i) != TOUPPER (i)) + return 2; + return 0; +} +_ACEOF +if ac_fn_c_try_run "$LINENO"; then : + +else + ac_cv_header_stdc=no +fi +rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ + conftest.$ac_objext conftest.beam conftest.$ac_ext +fi + +fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_header_stdc" >&5 +$as_echo "$ac_cv_header_stdc" >&6; } +if test $ac_cv_header_stdc = yes; then + +$as_echo "#define STDC_HEADERS 1" >>confdefs.h + +fi + +# On IRIX 5.3, sys/types and inttypes.h are conflicting. +for ac_header in sys/types.h sys/stat.h stdlib.h string.h memory.h strings.h \ + inttypes.h stdint.h unistd.h +do : + as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh` +ac_fn_c_check_header_compile "$LINENO" "$ac_header" "$as_ac_Header" "$ac_includes_default +" +if eval test \"x\$"$as_ac_Header"\" = x"yes"; then : + cat >>confdefs.h <<_ACEOF +#define `$as_echo "HAVE_$ac_header" | $as_tr_cpp` 1 +_ACEOF + +fi + +done + + + + ac_fn_c_check_header_mongrel "$LINENO" "minix/config.h" "ac_cv_header_minix_config_h" "$ac_includes_default" +if test "x$ac_cv_header_minix_config_h" = xyes; then : + MINIX=yes +else + MINIX= +fi + + + if test "$MINIX" = yes; then + +$as_echo "#define _POSIX_SOURCE 1" >>confdefs.h + + +$as_echo "#define _POSIX_1_SOURCE 2" >>confdefs.h + + +$as_echo "#define _MINIX 1" >>confdefs.h + + fi + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether it is safe to define __EXTENSIONS__" >&5 +$as_echo_n "checking whether it is safe to define __EXTENSIONS__... " >&6; } +if ${ac_cv_safe_to_define___extensions__+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +# define __EXTENSIONS__ 1 + $ac_includes_default +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_safe_to_define___extensions__=yes +else + ac_cv_safe_to_define___extensions__=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_safe_to_define___extensions__" >&5 +$as_echo "$ac_cv_safe_to_define___extensions__" >&6; } + test $ac_cv_safe_to_define___extensions__ = yes && + $as_echo "#define __EXTENSIONS__ 1" >>confdefs.h + + $as_echo "#define _ALL_SOURCE 1" >>confdefs.h + + $as_echo "#define _GNU_SOURCE 1" >>confdefs.h + + $as_echo "#define _POSIX_PTHREAD_SEMANTICS 1" >>confdefs.h + + $as_echo "#define _TANDEM_SOURCE 1" >>confdefs.h + + + +# Checks for programs. +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu +if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}gcc", so it can be a program name with args. +set dummy ${ac_tool_prefix}gcc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... 
" >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_CC="${ac_tool_prefix}gcc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_CC"; then + ac_ct_CC=$CC + # Extract the first word of "gcc", so it can be a program name with args. +set dummy gcc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_CC"; then + ac_cv_prog_ac_ct_CC="$ac_ct_CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_CC="gcc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_CC=$ac_cv_prog_ac_ct_CC +if test -n "$ac_ct_CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_CC" >&5 +$as_echo "$ac_ct_CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_CC" = x; then + CC="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + CC=$ac_ct_CC + fi +else + CC="$ac_cv_prog_CC" +fi + +if test -z "$CC"; then + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}cc", so it can be a program name with args. +set dummy ${ac_tool_prefix}cc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_CC="${ac_tool_prefix}cc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + fi +fi +if test -z "$CC"; then + # Extract the first word of "cc", so it can be a program name with args. 
+set dummy cc; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else + ac_prog_rejected=no +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + if test "$as_dir/$ac_word$ac_exec_ext" = "/usr/ucb/cc"; then + ac_prog_rejected=yes + continue + fi + ac_cv_prog_CC="cc" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +if test $ac_prog_rejected = yes; then + # We found a bogon in the path, so make sure we never use it. + set dummy $ac_cv_prog_CC + shift + if test $# != 0; then + # We chose a different compiler from the bogus one. + # However, it has the same basename, so the bogon will be chosen + # first if we set CC to just the basename; use the full file name. + shift + ac_cv_prog_CC="$as_dir/$ac_word${1+' '}$@" + fi +fi +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$CC"; then + if test -n "$ac_tool_prefix"; then + for ac_prog in cl.exe + do + # Extract the first word of "$ac_tool_prefix$ac_prog", so it can be a program name with args. +set dummy $ac_tool_prefix$ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$CC"; then + ac_cv_prog_CC="$CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_CC="$ac_tool_prefix$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +CC=$ac_cv_prog_CC +if test -n "$CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CC" >&5 +$as_echo "$CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$CC" && break + done +fi +if test -z "$CC"; then + ac_ct_CC=$CC + for ac_prog in cl.exe +do + # Extract the first word of "$ac_prog", so it can be a program name with args. +set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_CC"; then + ac_cv_prog_ac_ct_CC="$ac_ct_CC" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_CC="$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_CC=$ac_cv_prog_ac_ct_CC +if test -n "$ac_ct_CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_CC" >&5 +$as_echo "$ac_ct_CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$ac_ct_CC" && break +done + + if test "x$ac_ct_CC" = x; then + CC="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + CC=$ac_ct_CC + fi +fi + +fi + + +test -z "$CC" && { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "no acceptable C compiler found in \$PATH +See \`config.log' for more details" "$LINENO" 5; } + +# Provide some information about the compiler. +$as_echo "$as_me:${as_lineno-$LINENO}: checking for C compiler version" >&5 +set X $ac_compile +ac_compiler=$2 +for ac_option in --version -v -V -qversion; do + { { ac_try="$ac_compiler $ac_option >&5" +case "(($ac_try" in + *\"* | *\`* | *\\*) ac_try_echo=\$ac_try;; + *) ac_try_echo=$ac_try;; +esac +eval ac_try_echo="\"\$as_me:${as_lineno-$LINENO}: $ac_try_echo\"" +$as_echo "$ac_try_echo"; } >&5 + (eval "$ac_compiler $ac_option >&5") 2>conftest.err + ac_status=$? + if test -s conftest.err; then + sed '10a\ +... rest of stderr output deleted ... + 10q' conftest.err >conftest.er1 + cat conftest.er1 >&5 + fi + rm -f conftest.er1 conftest.err + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } +done + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether we are using the GNU C compiler" >&5 +$as_echo_n "checking whether we are using the GNU C compiler... " >&6; } +if ${ac_cv_c_compiler_gnu+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ +#ifndef __GNUC__ + choke me +#endif + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_compiler_gnu=yes +else + ac_compiler_gnu=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +ac_cv_c_compiler_gnu=$ac_compiler_gnu + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_compiler_gnu" >&5 +$as_echo "$ac_cv_c_compiler_gnu" >&6; } +if test $ac_compiler_gnu = yes; then + GCC=yes +else + GCC= +fi +ac_test_CFLAGS=${CFLAGS+set} +ac_save_CFLAGS=$CFLAGS +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC accepts -g" >&5 +$as_echo_n "checking whether $CC accepts -g... " >&6; } +if ${ac_cv_prog_cc_g+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_save_c_werror_flag=$ac_c_werror_flag + ac_c_werror_flag=yes + ac_cv_prog_cc_g=no + CFLAGS="-g" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_prog_cc_g=yes +else + CFLAGS="" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
 */
+
+int
+main ()
+{
+
+  ;
+  return 0;
+}
+_ACEOF
+if ac_fn_c_try_compile "$LINENO"; then :
+
+else
+  ac_c_werror_flag=$ac_save_c_werror_flag
+	 CFLAGS="-g"
+	 cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h.  */
+
+int
+main ()
+{
+
+  ;
+  return 0;
+}
+_ACEOF
+if ac_fn_c_try_compile "$LINENO"; then :
+  ac_cv_prog_cc_g=yes
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+   ac_c_werror_flag=$ac_save_c_werror_flag
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cc_g" >&5
+$as_echo "$ac_cv_prog_cc_g" >&6; }
+if test "$ac_test_CFLAGS" = set; then
+  CFLAGS=$ac_save_CFLAGS
+elif test $ac_cv_prog_cc_g = yes; then
+  if test "$GCC" = yes; then
+    CFLAGS="-g -O2"
+  else
+    CFLAGS="-g"
+  fi
+else
+  if test "$GCC" = yes; then
+    CFLAGS="-O2"
+  else
+    CFLAGS=
+  fi
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $CC option to accept ISO C89" >&5
+$as_echo_n "checking for $CC option to accept ISO C89... " >&6; }
+if ${ac_cv_prog_cc_c89+:} false; then :
+  $as_echo_n "(cached) " >&6
+else
+  ac_cv_prog_cc_c89=no
+ac_save_CC=$CC
+cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h.  */
+#include <stdarg.h>
+#include <stdio.h>
+struct stat;
+/* Most of the following tests are stolen from RCS 5.7's src/conf.sh.  */
+struct buf { int x; };
+FILE * (*rcsopen) (struct buf *, struct stat *, int);
+static char *e (p, i)
+     char **p;
+     int i;
+{
+  return p[i];
+}
+static char *f (char * (*g) (char **, int), char **p, ...)
+{
+  char *s;
+  va_list v;
+  va_start (v,p);
+  s = g (p, va_arg (v,int));
+  va_end (v);
+  return s;
+}
+
+/* OSF 4.0 Compaq cc is some sort of almost-ANSI by default.  It has
+   function prototypes and stuff, but not '\xHH' hex character constants.
+   These don't provoke an error unfortunately, instead are silently treated
+   as 'x'.  The following induces an error, until -std is added to get
+   proper ANSI mode.  Curiously '\x00'!='x' always comes out true, for an
+   array size at least.  It's necessary to write '\x00'==0 to get something
+   that's true only with -std.  */
+int osf4_cc_array ['\x00' == 0 ? 1 : -1];
+
+/* IBM C 6 for AIX is almost-ANSI by default, but it replaces macro parameters
+   inside strings and character constants.  */
+#define FOO(x) 'x'
+int xlc6_cc_array[FOO(a) == 'x' ? 
1 : -1]; + +int test (int i, double x); +struct s1 {int (*f) (int a);}; +struct s2 {int (*f) (double a);}; +int pairnames (int, char **, FILE *(*)(struct buf *, struct stat *, int), int, int); +int argc; +char **argv; +int +main () +{ +return f (e, argv, 0) != argv[0] || f (e, argv, 1) != argv[1]; + ; + return 0; +} +_ACEOF +for ac_arg in '' -qlanglvl=extc89 -qlanglvl=ansi -std \ + -Ae "-Aa -D_HPUX_SOURCE" "-Xc -D__EXTENSIONS__" +do + CC="$ac_save_CC $ac_arg" + if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_prog_cc_c89=$ac_arg +fi +rm -f core conftest.err conftest.$ac_objext + test "x$ac_cv_prog_cc_c89" != "xno" && break +done +rm -f conftest.$ac_ext +CC=$ac_save_CC + +fi +# AC_CACHE_VAL +case "x$ac_cv_prog_cc_c89" in + x) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: none needed" >&5 +$as_echo "none needed" >&6; } ;; + xno) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: unsupported" >&5 +$as_echo "unsupported" >&6; } ;; + *) + CC="$CC $ac_cv_prog_cc_c89" + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cc_c89" >&5 +$as_echo "$ac_cv_prog_cc_c89" >&6; } ;; +esac +if test "x$ac_cv_prog_cc_c89" != xno; then : + +fi + +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC understands -c and -o together" >&5 +$as_echo_n "checking whether $CC understands -c and -o together... " >&6; } +if ${am_cv_prog_cc_c_o+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF + # Make sure it works both with $CC and with simple cc. + # Following AC_PROG_CC_C_O, we do the test twice because some + # compilers refuse to overwrite an existing .o file with -o, + # though they will create one. + am_cv_prog_cc_c_o=yes + for am_i in 1 2; do + if { echo "$as_me:$LINENO: $CC -c conftest.$ac_ext -o conftest2.$ac_objext" >&5 + ($CC -c conftest.$ac_ext -o conftest2.$ac_objext) >&5 2>&5 + ac_status=$? + echo "$as_me:$LINENO: \$? = $ac_status" >&5 + (exit $ac_status); } \ + && test -f conftest2.$ac_objext; then + : OK + else + am_cv_prog_cc_c_o=no + break + fi + done + rm -f core conftest* + unset am_i +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $am_cv_prog_cc_c_o" >&5 +$as_echo "$am_cv_prog_cc_c_o" >&6; } +if test "$am_cv_prog_cc_c_o" != yes; then + # Losing compiler, so override with the script. + # FIXME: It is wrong to rewrite CC. + # But if we don't then we get into trouble of one sort or another. + # A longer-term fix would be to have automake use am__CC in this case, + # and then we could set am__CC="\$(top_srcdir)/compile \$(CC)" + CC="$am_aux_dir/compile $CC" +fi +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + +depcc="$CC" am_compiler_list= + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking dependency style of $depcc" >&5 +$as_echo_n "checking dependency style of $depcc... 
" >&6; } +if ${am_cv_CC_dependencies_compiler_type+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -z "$AMDEP_TRUE" && test -f "$am_depcomp"; then + # We make a subdir and do the tests there. Otherwise we can end up + # making bogus files that we don't know about and never remove. For + # instance it was reported that on HP-UX the gcc test will end up + # making a dummy file named 'D' -- because '-MD' means "put the output + # in D". + rm -rf conftest.dir + mkdir conftest.dir + # Copy depcomp to subdir because otherwise we won't find it if we're + # using a relative directory. + cp "$am_depcomp" conftest.dir + cd conftest.dir + # We will build objects and dependencies in a subdirectory because + # it helps to detect inapplicable dependency modes. For instance + # both Tru64's cc and ICC support -MD to output dependencies as a + # side effect of compilation, but ICC will put the dependencies in + # the current directory while Tru64 will put them in the object + # directory. + mkdir sub + + am_cv_CC_dependencies_compiler_type=none + if test "$am_compiler_list" = ""; then + am_compiler_list=`sed -n 's/^#*\([a-zA-Z0-9]*\))$/\1/p' < ./depcomp` + fi + am__universal=false + case " $depcc " in #( + *\ -arch\ *\ -arch\ *) am__universal=true ;; + esac + + for depmode in $am_compiler_list; do + # Setup a source with many dependencies, because some compilers + # like to wrap large dependency lists on column 80 (with \), and + # we should not choose a depcomp mode which is confused by this. + # + # We need to recreate these files for each test, as the compiler may + # overwrite some of them when testing with obscure command lines. + # This happens at least with the AIX C compiler. + : > sub/conftest.c + for i in 1 2 3 4 5 6; do + echo '#include "conftst'$i'.h"' >> sub/conftest.c + # Using ": > sub/conftst$i.h" creates only sub/conftst1.h with + # Solaris 10 /bin/sh. + echo '/* dummy */' > sub/conftst$i.h + done + echo "${am__include} ${am__quote}sub/conftest.Po${am__quote}" > confmf + + # We check with '-c' and '-o' for the sake of the "dashmstdout" + # mode. It turns out that the SunPro C++ compiler does not properly + # handle '-M -o', and we need to detect this. Also, some Intel + # versions had trouble with output in subdirs. + am__obj=sub/conftest.${OBJEXT-o} + am__minus_obj="-o $am__obj" + case $depmode in + gcc) + # This depmode causes a compiler race in universal mode. + test "$am__universal" = false || continue + ;; + nosideeffect) + # After this tag, mechanisms are not by side-effect, so they'll + # only be used when explicitly requested. + if test "x$enable_dependency_tracking" = xyes; then + continue + else + break + fi + ;; + msvc7 | msvc7msys | msvisualcpp | msvcmsys) + # This compiler won't grok '-c -o', but also, the minuso test has + # not run yet. These depmodes are late enough in the game, and + # so weak that their functioning should not be impacted. + am__obj=conftest.${OBJEXT-o} + am__minus_obj= + ;; + none) break ;; + esac + if depmode=$depmode \ + source=sub/conftest.c object=$am__obj \ + depfile=sub/conftest.Po tmpdepfile=sub/conftest.TPo \ + $SHELL ./depcomp $depcc -c $am__minus_obj sub/conftest.c \ + >/dev/null 2>conftest.err && + grep sub/conftst1.h sub/conftest.Po > /dev/null 2>&1 && + grep sub/conftst6.h sub/conftest.Po > /dev/null 2>&1 && + grep $am__obj sub/conftest.Po > /dev/null 2>&1 && + ${MAKE-make} -s -f confmf > /dev/null 2>&1; then + # icc doesn't choke on unknown options, it will just issue warnings + # or remarks (even with -Werror). 
So we grep stderr for any message + # that says an option was ignored or not supported. + # When given -MP, icc 7.0 and 7.1 complain thusly: + # icc: Command line warning: ignoring option '-M'; no argument required + # The diagnosis changed in icc 8.0: + # icc: Command line remark: option '-MP' not supported + if (grep 'ignoring option' conftest.err || + grep 'not supported' conftest.err) >/dev/null 2>&1; then :; else + am_cv_CC_dependencies_compiler_type=$depmode + break + fi + fi + done + + cd .. + rm -rf conftest.dir +else + am_cv_CC_dependencies_compiler_type=none +fi + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $am_cv_CC_dependencies_compiler_type" >&5 +$as_echo "$am_cv_CC_dependencies_compiler_type" >&6; } +CCDEPMODE=depmode=$am_cv_CC_dependencies_compiler_type + + if + test "x$enable_dependency_tracking" != xno \ + && test "$am_cv_CC_dependencies_compiler_type" = gcc3; then + am__fastdepCC_TRUE= + am__fastdepCC_FALSE='#' +else + am__fastdepCC_TRUE='#' + am__fastdepCC_FALSE= +fi + + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $CC option to accept ISO C99" >&5 +$as_echo_n "checking for $CC option to accept ISO C99... " >&6; } +if ${ac_cv_prog_cc_c99+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_cv_prog_cc_c99=no +ac_save_CC=$CC +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include +#include +#include +#include +#include + +// Check varargs macros. These examples are taken from C99 6.10.3.5. +#define debug(...) fprintf (stderr, __VA_ARGS__) +#define showlist(...) puts (#__VA_ARGS__) +#define report(test,...) ((test) ? puts (#test) : printf (__VA_ARGS__)) +static void +test_varargs_macros (void) +{ + int x = 1234; + int y = 5678; + debug ("Flag"); + debug ("X = %d\n", x); + showlist (The first, second, and third items.); + report (x>y, "x is %d but y is %d", x, y); +} + +// Check long long types. +#define BIG64 18446744073709551615ull +#define BIG32 4294967295ul +#define BIG_OK (BIG64 / BIG32 == 4294967297ull && BIG64 % BIG32 == 0) +#if !BIG_OK + your preprocessor is broken; +#endif +#if BIG_OK +#else + your preprocessor is broken; +#endif +static long long int bignum = -9223372036854775807LL; +static unsigned long long int ubignum = BIG64; + +struct incomplete_array +{ + int datasize; + double data[]; +}; + +struct named_init { + int number; + const wchar_t *name; + double average; +}; + +typedef const char *ccp; + +static inline int +test_restrict (ccp restrict text) +{ + // See if C++-style comments work. + // Iterate through items via the restricted pointer. + // Also check for declarations in for loops. + for (unsigned int i = 0; *(text+i) != '\0'; ++i) + continue; + return 0; +} + +// Check varargs and va_copy. +static void +test_varargs (const char *format, ...) +{ + va_list args; + va_start (args, format); + va_list args_copy; + va_copy (args_copy, args); + + const char *str; + int number; + float fnumber; + + while (*format) + { + switch (*format++) + { + case 's': // string + str = va_arg (args_copy, const char *); + break; + case 'd': // int + number = va_arg (args_copy, int); + break; + case 'f': // float + fnumber = va_arg (args_copy, double); + break; + default: + break; + } + } + va_end (args_copy); + va_end (args); +} + +int +main () +{ + + // Check bool. + _Bool success = false; + + // Check restrict. + if (test_restrict ("String literal") == 0) + success = true; + char *restrict newvar = "Another string"; + + // Check varargs. 
+ test_varargs ("s, d' f .", "string", 65, 34.234); + test_varargs_macros (); + + // Check flexible array members. + struct incomplete_array *ia = + malloc (sizeof (struct incomplete_array) + (sizeof (double) * 10)); + ia->datasize = 10; + for (int i = 0; i < ia->datasize; ++i) + ia->data[i] = i * 1.234; + + // Check named initializers. + struct named_init ni = { + .number = 34, + .name = L"Test wide string", + .average = 543.34343, + }; + + ni.number = 58; + + int dynamic_array[ni.number]; + dynamic_array[ni.number - 1] = 543; + + // work around unused variable warnings + return (!success || bignum == 0LL || ubignum == 0uLL || newvar[0] == 'x' + || dynamic_array[ni.number - 1] != 543); + + ; + return 0; +} +_ACEOF +for ac_arg in '' -std=gnu99 -std=c99 -c99 -AC99 -D_STDC_C99= -qlanglvl=extc99 +do + CC="$ac_save_CC $ac_arg" + if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_prog_cc_c99=$ac_arg +fi +rm -f core conftest.err conftest.$ac_objext + test "x$ac_cv_prog_cc_c99" != "xno" && break +done +rm -f conftest.$ac_ext +CC=$ac_save_CC + +fi +# AC_CACHE_VAL +case "x$ac_cv_prog_cc_c99" in + x) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: none needed" >&5 +$as_echo "none needed" >&6; } ;; + xno) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: unsupported" >&5 +$as_echo "unsupported" >&6; } ;; + *) + CC="$CC $ac_cv_prog_cc_c99" + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cc_c99" >&5 +$as_echo "$ac_cv_prog_cc_c99" >&6; } ;; +esac +if test "x$ac_cv_prog_cc_c99" != xno; then : + +fi + + +case `pwd` in + *\ * | *\ *) + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: Libtool does not cope well with whitespace in \`pwd\`" >&5 +$as_echo "$as_me: WARNING: Libtool does not cope well with whitespace in \`pwd\`" >&2;} ;; +esac + + + +macro_version='2.4.6' +macro_revision='2.4.6' + + + + + + + + + + + + + +ltmain=$ac_aux_dir/ltmain.sh + +# Make sure we can run config.sub. +$SHELL "$ac_aux_dir/config.sub" sun4 >/dev/null 2>&1 || + as_fn_error $? "cannot run $SHELL $ac_aux_dir/config.sub" "$LINENO" 5 + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking build system type" >&5 +$as_echo_n "checking build system type... " >&6; } +if ${ac_cv_build+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_build_alias=$build_alias +test "x$ac_build_alias" = x && + ac_build_alias=`$SHELL "$ac_aux_dir/config.guess"` +test "x$ac_build_alias" = x && + as_fn_error $? "cannot guess build type; you must specify one" "$LINENO" 5 +ac_cv_build=`$SHELL "$ac_aux_dir/config.sub" $ac_build_alias` || + as_fn_error $? "$SHELL $ac_aux_dir/config.sub $ac_build_alias failed" "$LINENO" 5 + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_build" >&5 +$as_echo "$ac_cv_build" >&6; } +case $ac_cv_build in +*-*-*) ;; +*) as_fn_error $? "invalid value of canonical build" "$LINENO" 5;; +esac +build=$ac_cv_build +ac_save_IFS=$IFS; IFS='-' +set x $ac_cv_build +shift +build_cpu=$1 +build_vendor=$2 +shift; shift +# Remember, the first character of IFS is used to create $*, +# except with old shells: +build_os=$* +IFS=$ac_save_IFS +case $build_os in *\ *) build_os=`echo "$build_os" | sed 's/ /-/g'`;; esac + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking host system type" >&5 +$as_echo_n "checking host system type... " >&6; } +if ${ac_cv_host+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test "x$host_alias" = x; then + ac_cv_host=$ac_cv_build +else + ac_cv_host=`$SHELL "$ac_aux_dir/config.sub" $host_alias` || + as_fn_error $? 
"$SHELL $ac_aux_dir/config.sub $host_alias failed" "$LINENO" 5 +fi + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_host" >&5 +$as_echo "$ac_cv_host" >&6; } +case $ac_cv_host in +*-*-*) ;; +*) as_fn_error $? "invalid value of canonical host" "$LINENO" 5;; +esac +host=$ac_cv_host +ac_save_IFS=$IFS; IFS='-' +set x $ac_cv_host +shift +host_cpu=$1 +host_vendor=$2 +shift; shift +# Remember, the first character of IFS is used to create $*, +# except with old shells: +host_os=$* +IFS=$ac_save_IFS +case $host_os in *\ *) host_os=`echo "$host_os" | sed 's/ /-/g'`;; esac + + +# Backslashify metacharacters that are still active within +# double-quoted strings. +sed_quote_subst='s/\(["`$\\]\)/\\\1/g' + +# Same as above, but do not quote variable references. +double_quote_subst='s/\(["`\\]\)/\\\1/g' + +# Sed substitution to delay expansion of an escaped shell variable in a +# double_quote_subst'ed string. +delay_variable_subst='s/\\\\\\\\\\\$/\\\\\\$/g' + +# Sed substitution to delay expansion of an escaped single quote. +delay_single_quote_subst='s/'\''/'\'\\\\\\\'\''/g' + +# Sed substitution to avoid accidental globbing in evaled expressions +no_glob_subst='s/\*/\\\*/g' + +ECHO='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\' +ECHO=$ECHO$ECHO$ECHO$ECHO$ECHO +ECHO=$ECHO$ECHO$ECHO$ECHO$ECHO$ECHO + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking how to print strings" >&5 +$as_echo_n "checking how to print strings... " >&6; } +# Test print first, because it will be a builtin if present. +if test "X`( print -r -- -n ) 2>/dev/null`" = X-n && \ + test "X`print -r -- $ECHO 2>/dev/null`" = "X$ECHO"; then + ECHO='print -r --' +elif test "X`printf %s $ECHO 2>/dev/null`" = "X$ECHO"; then + ECHO='printf %s\n' +else + # Use this function as a fallback that always works. + func_fallback_echo () + { + eval 'cat <<_LTECHO_EOF +$1 +_LTECHO_EOF' + } + ECHO='func_fallback_echo' +fi + +# func_echo_all arg... +# Invoke $ECHO with all args, space-separated. +func_echo_all () +{ + $ECHO "" +} + +case $ECHO in + printf*) { $as_echo "$as_me:${as_lineno-$LINENO}: result: printf" >&5 +$as_echo "printf" >&6; } ;; + print*) { $as_echo "$as_me:${as_lineno-$LINENO}: result: print -r" >&5 +$as_echo "print -r" >&6; } ;; + *) { $as_echo "$as_me:${as_lineno-$LINENO}: result: cat" >&5 +$as_echo "cat" >&6; } ;; +esac + + + + + + + + + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for a sed that does not truncate output" >&5 +$as_echo_n "checking for a sed that does not truncate output... " >&6; } +if ${ac_cv_path_SED+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_script=s/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb/ + for ac_i in 1 2 3 4 5 6 7; do + ac_script="$ac_script$as_nl$ac_script" + done + echo "$ac_script" 2>/dev/null | sed 99q >conftest.sed + { ac_script=; unset ac_script;} + if test -z "$SED"; then + ac_path_SED_found=false + # Loop through the user's path and test for each of PROGNAME-LIST + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_prog in sed gsed; do + for ac_exec_ext in '' $ac_executable_extensions; do + ac_path_SED="$as_dir/$ac_prog$ac_exec_ext" + as_fn_executable_p "$ac_path_SED" || continue +# Check for GNU ac_path_SED and select it if it is found. 
+ # Check for GNU $ac_path_SED +case `"$ac_path_SED" --version 2>&1` in +*GNU*) + ac_cv_path_SED="$ac_path_SED" ac_path_SED_found=:;; +*) + ac_count=0 + $as_echo_n 0123456789 >"conftest.in" + while : + do + cat "conftest.in" "conftest.in" >"conftest.tmp" + mv "conftest.tmp" "conftest.in" + cp "conftest.in" "conftest.nl" + $as_echo '' >> "conftest.nl" + "$ac_path_SED" -f conftest.sed < "conftest.nl" >"conftest.out" 2>/dev/null || break + diff "conftest.out" "conftest.nl" >/dev/null 2>&1 || break + as_fn_arith $ac_count + 1 && ac_count=$as_val + if test $ac_count -gt ${ac_path_SED_max-0}; then + # Best one so far, save it but keep looking for a better one + ac_cv_path_SED="$ac_path_SED" + ac_path_SED_max=$ac_count + fi + # 10*(2^10) chars as input seems more than enough + test $ac_count -gt 10 && break + done + rm -f conftest.in conftest.tmp conftest.nl conftest.out;; +esac + + $ac_path_SED_found && break 3 + done + done + done +IFS=$as_save_IFS + if test -z "$ac_cv_path_SED"; then + as_fn_error $? "no acceptable sed could be found in \$PATH" "$LINENO" 5 + fi +else + ac_cv_path_SED=$SED +fi + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_path_SED" >&5 +$as_echo "$ac_cv_path_SED" >&6; } + SED="$ac_cv_path_SED" + rm -f conftest.sed + +test -z "$SED" && SED=sed +Xsed="$SED -e 1s/^X//" + + + + + + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for fgrep" >&5 +$as_echo_n "checking for fgrep... " >&6; } +if ${ac_cv_path_FGREP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if echo 'ab*c' | $GREP -F 'ab*c' >/dev/null 2>&1 + then ac_cv_path_FGREP="$GREP -F" + else + if test -z "$FGREP"; then + ac_path_FGREP_found=false + # Loop through the user's path and test for each of PROGNAME-LIST + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH$PATH_SEPARATOR/usr/xpg4/bin +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_prog in fgrep; do + for ac_exec_ext in '' $ac_executable_extensions; do + ac_path_FGREP="$as_dir/$ac_prog$ac_exec_ext" + as_fn_executable_p "$ac_path_FGREP" || continue +# Check for GNU ac_path_FGREP and select it if it is found. + # Check for GNU $ac_path_FGREP +case `"$ac_path_FGREP" --version 2>&1` in +*GNU*) + ac_cv_path_FGREP="$ac_path_FGREP" ac_path_FGREP_found=:;; +*) + ac_count=0 + $as_echo_n 0123456789 >"conftest.in" + while : + do + cat "conftest.in" "conftest.in" >"conftest.tmp" + mv "conftest.tmp" "conftest.in" + cp "conftest.in" "conftest.nl" + $as_echo 'FGREP' >> "conftest.nl" + "$ac_path_FGREP" FGREP < "conftest.nl" >"conftest.out" 2>/dev/null || break + diff "conftest.out" "conftest.nl" >/dev/null 2>&1 || break + as_fn_arith $ac_count + 1 && ac_count=$as_val + if test $ac_count -gt ${ac_path_FGREP_max-0}; then + # Best one so far, save it but keep looking for a better one + ac_cv_path_FGREP="$ac_path_FGREP" + ac_path_FGREP_max=$ac_count + fi + # 10*(2^10) chars as input seems more than enough + test $ac_count -gt 10 && break + done + rm -f conftest.in conftest.tmp conftest.nl conftest.out;; +esac + + $ac_path_FGREP_found && break 3 + done + done + done +IFS=$as_save_IFS + if test -z "$ac_cv_path_FGREP"; then + as_fn_error $? "no acceptable fgrep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin" "$LINENO" 5 + fi +else + ac_cv_path_FGREP=$FGREP +fi + + fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_path_FGREP" >&5 +$as_echo "$ac_cv_path_FGREP" >&6; } + FGREP="$ac_cv_path_FGREP" + + +test -z "$GREP" && GREP=grep + + + + + + + + + + + + + + + + + + + +# Check whether --with-gnu-ld was given. 

+if test "${with_gnu_ld+set}" = set; then :
+  withval=$with_gnu_ld; test no = "$withval" || with_gnu_ld=yes
+else
+  with_gnu_ld=no
+fi
+
+ac_prog=ld
+if test yes = "$GCC"; then
+  # Check if gcc -print-prog-name=ld gives a path.
+  { $as_echo "$as_me:${as_lineno-$LINENO}: checking for ld used by $CC" >&5
+$as_echo_n "checking for ld used by $CC... " >&6; }
+  case $host in
+  *-*-mingw*)
+    # gcc leaves a trailing carriage return, which upsets mingw
+    ac_prog=`($CC -print-prog-name=ld) 2>&5 | tr -d '\015'` ;;
+  *)
+    ac_prog=`($CC -print-prog-name=ld) 2>&5` ;;
+  esac
+  case $ac_prog in
+    # Accept absolute paths.
+    [\\/]* | ?:[\\/]*)
+      re_direlt='/[^/][^/]*/\.\./'
+      # Canonicalize the pathname of ld
+      ac_prog=`$ECHO "$ac_prog"| $SED 's%\\\\%/%g'`
+      while $ECHO "$ac_prog" | $GREP "$re_direlt" > /dev/null 2>&1; do
+	ac_prog=`$ECHO $ac_prog| $SED "s%$re_direlt%/%"`
+      done
+      test -z "$LD" && LD=$ac_prog
+      ;;
+  "")
+    # If it fails, then pretend we aren't using GCC.
+    ac_prog=ld
+    ;;
+  *)
+    # If it is relative, then search for the first ld in PATH.
+    with_gnu_ld=unknown
+    ;;
+  esac
+elif test yes = "$with_gnu_ld"; then
+  { $as_echo "$as_me:${as_lineno-$LINENO}: checking for GNU ld" >&5
+$as_echo_n "checking for GNU ld... " >&6; }
+else
+  { $as_echo "$as_me:${as_lineno-$LINENO}: checking for non-GNU ld" >&5
+$as_echo_n "checking for non-GNU ld... " >&6; }
+fi
+if ${lt_cv_path_LD+:} false; then :
+  $as_echo_n "(cached) " >&6
+else
+  if test -z "$LD"; then
+  lt_save_ifs=$IFS; IFS=$PATH_SEPARATOR
+  for ac_dir in $PATH; do
+    IFS=$lt_save_ifs
+    test -z "$ac_dir" && ac_dir=.
+    if test -f "$ac_dir/$ac_prog" || test -f "$ac_dir/$ac_prog$ac_exeext"; then
+      lt_cv_path_LD=$ac_dir/$ac_prog
+      # Check to see if the program is GNU ld.  I'd rather use --version,
+      # but apparently some variants of GNU ld only accept -v.
+      # Break only if it was the GNU/non-GNU ld that we prefer.
+      case `"$lt_cv_path_LD" -v 2>&1 </dev/null` in
+      *GNU* | *'with BFD'*)
+	test no != "$with_gnu_ld" && break
+	;;
+      *)
+	test yes != "$with_gnu_ld" && break
+	;;
+      esac
+    fi
+  done
+  IFS=$lt_save_ifs
+else
+  lt_cv_path_LD=$LD # Let the user override the test with a path.
+fi
+fi
+
+LD=$lt_cv_path_LD
+if test -n "$LD"; then
+  { $as_echo "$as_me:${as_lineno-$LINENO}: result: $LD" >&5
+$as_echo "$LD" >&6; }
+else
+  { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5
+$as_echo "no" >&6; }
+fi
+test -z "$LD" && as_fn_error $? "no acceptable ld found in \$PATH" "$LINENO" 5
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking if the linker ($LD) is GNU ld" >&5
+$as_echo_n "checking if the linker ($LD) is GNU ld... " >&6; }
+if ${lt_cv_prog_gnu_ld+:} false; then :
+  $as_echo_n "(cached) " >&6
+else
+  # I'd rather use --version here, but apparently some GNU lds only accept -v.
+case `$LD -v 2>&1 </dev/null` in
+*GNU* | *'with BFD'*)
+  lt_cv_prog_gnu_ld=yes
+  ;;
+*)
+  lt_cv_prog_gnu_ld=no
+  ;;
+esac
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_gnu_ld" >&5
+$as_echo "$lt_cv_prog_gnu_ld" >&6; }
+with_gnu_ld=$lt_cv_prog_gnu_ld
+
+
+
+
+
+
+
+
+
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for BSD- or MS-compatible name lister (nm)" >&5
+$as_echo_n "checking for BSD- or MS-compatible name lister (nm)... " >&6; }
+if ${lt_cv_path_NM+:} false; then :
+  $as_echo_n "(cached) " >&6
+else
+  if test -n "$NM"; then
+  # Let the user override the test.
+  lt_cv_path_NM=$NM
+else
+  lt_nm_to_check=${ac_tool_prefix}nm
+  if test -n "$ac_tool_prefix" && test "$build" = "$host"; then
+    lt_nm_to_check="$lt_nm_to_check nm"
+  fi
+  for lt_tmp_nm in $lt_nm_to_check; do
+    lt_save_ifs=$IFS; IFS=$PATH_SEPARATOR
+    for ac_dir in $PATH /usr/ccs/bin/elf /usr/ccs/bin /usr/ucb /bin; do
+      IFS=$lt_save_ifs
+      test -z "$ac_dir" && ac_dir=.
+      tmp_nm=$ac_dir/$lt_tmp_nm
+      if test -f "$tmp_nm" || test -f "$tmp_nm$ac_exeext"; then
+	# Check to see if the nm accepts a BSD-compat flag.
+ # Adding the 'sed 1q' prevents false positives on HP-UX, which says: + # nm: unknown option "B" ignored + # Tru64's nm complains that /dev/null is an invalid object file + # MSYS converts /dev/null to NUL, MinGW nm treats NUL as empty + case $build_os in + mingw*) lt_bad_file=conftest.nm/nofile ;; + *) lt_bad_file=/dev/null ;; + esac + case `"$tmp_nm" -B $lt_bad_file 2>&1 | sed '1q'` in + *$lt_bad_file* | *'Invalid file or object type'*) + lt_cv_path_NM="$tmp_nm -B" + break 2 + ;; + *) + case `"$tmp_nm" -p /dev/null 2>&1 | sed '1q'` in + */dev/null*) + lt_cv_path_NM="$tmp_nm -p" + break 2 + ;; + *) + lt_cv_path_NM=${lt_cv_path_NM="$tmp_nm"} # keep the first match, but + continue # so that we can try to find one that supports BSD flags + ;; + esac + ;; + esac + fi + done + IFS=$lt_save_ifs + done + : ${lt_cv_path_NM=no} +fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_path_NM" >&5 +$as_echo "$lt_cv_path_NM" >&6; } +if test no != "$lt_cv_path_NM"; then + NM=$lt_cv_path_NM +else + # Didn't find any BSD compatible name lister, look for dumpbin. + if test -n "$DUMPBIN"; then : + # Let the user override the test. + else + if test -n "$ac_tool_prefix"; then + for ac_prog in dumpbin "link -dump" + do + # Extract the first word of "$ac_tool_prefix$ac_prog", so it can be a program name with args. +set dummy $ac_tool_prefix$ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_DUMPBIN+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$DUMPBIN"; then + ac_cv_prog_DUMPBIN="$DUMPBIN" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_DUMPBIN="$ac_tool_prefix$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +DUMPBIN=$ac_cv_prog_DUMPBIN +if test -n "$DUMPBIN"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $DUMPBIN" >&5 +$as_echo "$DUMPBIN" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$DUMPBIN" && break + done +fi +if test -z "$DUMPBIN"; then + ac_ct_DUMPBIN=$DUMPBIN + for ac_prog in dumpbin "link -dump" +do + # Extract the first word of "$ac_prog", so it can be a program name with args. +set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_DUMPBIN+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_DUMPBIN"; then + ac_cv_prog_ac_ct_DUMPBIN="$ac_ct_DUMPBIN" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_DUMPBIN="$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_DUMPBIN=$ac_cv_prog_ac_ct_DUMPBIN +if test -n "$ac_ct_DUMPBIN"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_DUMPBIN" >&5 +$as_echo "$ac_ct_DUMPBIN" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$ac_ct_DUMPBIN" && break +done + + if test "x$ac_ct_DUMPBIN" = x; then + DUMPBIN=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + DUMPBIN=$ac_ct_DUMPBIN + fi +fi + + case `$DUMPBIN -symbols -headers /dev/null 2>&1 | sed '1q'` in + *COFF*) + DUMPBIN="$DUMPBIN -symbols -headers" + ;; + *) + DUMPBIN=: + ;; + esac + fi + + if test : != "$DUMPBIN"; then + NM=$DUMPBIN + fi +fi +test -z "$NM" && NM=nm + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking the name lister ($NM) interface" >&5 +$as_echo_n "checking the name lister ($NM) interface... " >&6; } +if ${lt_cv_nm_interface+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_nm_interface="BSD nm" + echo "int some_variable = 0;" > conftest.$ac_ext + (eval echo "\"\$as_me:$LINENO: $ac_compile\"" >&5) + (eval "$ac_compile" 2>conftest.err) + cat conftest.err >&5 + (eval echo "\"\$as_me:$LINENO: $NM \\\"conftest.$ac_objext\\\"\"" >&5) + (eval "$NM \"conftest.$ac_objext\"" 2>conftest.err > conftest.out) + cat conftest.err >&5 + (eval echo "\"\$as_me:$LINENO: output\"" >&5) + cat conftest.out >&5 + if $GREP 'External.*some_variable' conftest.out > /dev/null; then + lt_cv_nm_interface="MS dumpbin" + fi + rm -f conftest* +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_nm_interface" >&5 +$as_echo "$lt_cv_nm_interface" >&6; } + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether ln -s works" >&5 +$as_echo_n "checking whether ln -s works... " >&6; } +LN_S=$as_ln_s +if test "$LN_S" = "ln -s"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no, using $LN_S" >&5 +$as_echo "no, using $LN_S" >&6; } +fi + +# find the maximum length of command line arguments +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking the maximum length of command line arguments" >&5 +$as_echo_n "checking the maximum length of command line arguments... " >&6; } +if ${lt_cv_sys_max_cmd_len+:} false; then : + $as_echo_n "(cached) " >&6 +else + i=0 + teststring=ABCD + + case $build_os in + msdosdjgpp*) + # On DJGPP, this test can blow up pretty badly due to problems in libc + # (any single argument exceeding 2000 bytes causes a buffer overrun + # during glob expansion). Even if it were fixed, the result of this + # check would be larger than it should be. + lt_cv_sys_max_cmd_len=12288; # 12K is about right + ;; + + gnu*) + # Under GNU Hurd, this test is not required because there is + # no limit to the length of command line arguments. 
+ # Libtool will interpret -1 as no limit whatsoever + lt_cv_sys_max_cmd_len=-1; + ;; + + cygwin* | mingw* | cegcc*) + # On Win9x/ME, this test blows up -- it succeeds, but takes + # about 5 minutes as the teststring grows exponentially. + # Worse, since 9x/ME are not pre-emptively multitasking, + # you end up with a "frozen" computer, even though with patience + # the test eventually succeeds (with a max line length of 256k). + # Instead, let's just punt: use the minimum linelength reported by + # all of the supported platforms: 8192 (on NT/2K/XP). + lt_cv_sys_max_cmd_len=8192; + ;; + + mint*) + # On MiNT this can take a long time and run out of memory. + lt_cv_sys_max_cmd_len=8192; + ;; + + amigaos*) + # On AmigaOS with pdksh, this test takes hours, literally. + # So we just punt and use a minimum line length of 8192. + lt_cv_sys_max_cmd_len=8192; + ;; + + bitrig* | darwin* | dragonfly* | freebsd* | netbsd* | openbsd*) + # This has been around since 386BSD, at least. Likely further. + if test -x /sbin/sysctl; then + lt_cv_sys_max_cmd_len=`/sbin/sysctl -n kern.argmax` + elif test -x /usr/sbin/sysctl; then + lt_cv_sys_max_cmd_len=`/usr/sbin/sysctl -n kern.argmax` + else + lt_cv_sys_max_cmd_len=65536 # usable default for all BSDs + fi + # And add a safety zone + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \/ 4` + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \* 3` + ;; + + interix*) + # We know the value 262144 and hardcode it with a safety zone (like BSD) + lt_cv_sys_max_cmd_len=196608 + ;; + + os2*) + # The test takes a long time on OS/2. + lt_cv_sys_max_cmd_len=8192 + ;; + + osf*) + # Dr. Hans Ekkehard Plesser reports seeing a kernel panic running configure + # due to this test when exec_disable_arg_limit is 1 on Tru64. It is not + # nice to cause kernel panics so lets avoid the loop below. + # First set a reasonable default. + lt_cv_sys_max_cmd_len=16384 + # + if test -x /sbin/sysconfig; then + case `/sbin/sysconfig -q proc exec_disable_arg_limit` in + *1*) lt_cv_sys_max_cmd_len=-1 ;; + esac + fi + ;; + sco3.2v5*) + lt_cv_sys_max_cmd_len=102400 + ;; + sysv5* | sco5v6* | sysv4.2uw2*) + kargmax=`grep ARG_MAX /etc/conf/cf.d/stune 2>/dev/null` + if test -n "$kargmax"; then + lt_cv_sys_max_cmd_len=`echo $kargmax | sed 's/.*[ ]//'` + else + lt_cv_sys_max_cmd_len=32768 + fi + ;; + *) + lt_cv_sys_max_cmd_len=`(getconf ARG_MAX) 2> /dev/null` + if test -n "$lt_cv_sys_max_cmd_len" && \ + test undefined != "$lt_cv_sys_max_cmd_len"; then + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \/ 4` + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \* 3` + else + # Make teststring a little bigger before we do anything with it. + # a 1K string should be a reasonable start. + for i in 1 2 3 4 5 6 7 8; do + teststring=$teststring$teststring + done + SHELL=${SHELL-${CONFIG_SHELL-/bin/sh}} + # If test is not a shell built-in, we'll probably end up computing a + # maximum length that is only half of the actual maximum length, but + # we can't tell. + while { test X`env echo "$teststring$teststring" 2>/dev/null` \ + = "X$teststring$teststring"; } >/dev/null 2>&1 && + test 17 != "$i" # 1/2 MB should be enough + do + i=`expr $i + 1` + teststring=$teststring$teststring + done + # Only check the string length outside the loop. + lt_cv_sys_max_cmd_len=`expr "X$teststring" : ".*" 2>&1` + teststring= + # Add a significant safety factor because C++ compilers can tack on + # massive amounts of additional arguments before passing them to the + # linker. It appears as though 1/2 is a usable value. 
+ lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \/ 2` + fi + ;; + esac + +fi + +if test -n "$lt_cv_sys_max_cmd_len"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_sys_max_cmd_len" >&5 +$as_echo "$lt_cv_sys_max_cmd_len" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: none" >&5 +$as_echo "none" >&6; } +fi +max_cmd_len=$lt_cv_sys_max_cmd_len + + + + + + +: ${CP="cp -f"} +: ${MV="mv -f"} +: ${RM="rm -f"} + +if ( (MAIL=60; unset MAIL) || exit) >/dev/null 2>&1; then + lt_unset=unset +else + lt_unset=false +fi + + + + + +# test EBCDIC or ASCII +case `echo X|tr X '\101'` in + A) # ASCII based system + # \n is not interpreted correctly by Solaris 8 /usr/ucb/tr + lt_SP2NL='tr \040 \012' + lt_NL2SP='tr \015\012 \040\040' + ;; + *) # EBCDIC based system + lt_SP2NL='tr \100 \n' + lt_NL2SP='tr \r\n \100\100' + ;; +esac + + + + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking how to convert $build file names to $host format" >&5 +$as_echo_n "checking how to convert $build file names to $host format... " >&6; } +if ${lt_cv_to_host_file_cmd+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $host in + *-*-mingw* ) + case $build in + *-*-mingw* ) # actually msys + lt_cv_to_host_file_cmd=func_convert_file_msys_to_w32 + ;; + *-*-cygwin* ) + lt_cv_to_host_file_cmd=func_convert_file_cygwin_to_w32 + ;; + * ) # otherwise, assume *nix + lt_cv_to_host_file_cmd=func_convert_file_nix_to_w32 + ;; + esac + ;; + *-*-cygwin* ) + case $build in + *-*-mingw* ) # actually msys + lt_cv_to_host_file_cmd=func_convert_file_msys_to_cygwin + ;; + *-*-cygwin* ) + lt_cv_to_host_file_cmd=func_convert_file_noop + ;; + * ) # otherwise, assume *nix + lt_cv_to_host_file_cmd=func_convert_file_nix_to_cygwin + ;; + esac + ;; + * ) # unhandled hosts (and "normal" native builds) + lt_cv_to_host_file_cmd=func_convert_file_noop + ;; +esac + +fi + +to_host_file_cmd=$lt_cv_to_host_file_cmd +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_to_host_file_cmd" >&5 +$as_echo "$lt_cv_to_host_file_cmd" >&6; } + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking how to convert $build file names to toolchain format" >&5 +$as_echo_n "checking how to convert $build file names to toolchain format... " >&6; } +if ${lt_cv_to_tool_file_cmd+:} false; then : + $as_echo_n "(cached) " >&6 +else + #assume ordinary cross tools, or native build. +lt_cv_to_tool_file_cmd=func_convert_file_noop +case $host in + *-*-mingw* ) + case $build in + *-*-mingw* ) # actually msys + lt_cv_to_tool_file_cmd=func_convert_file_msys_to_w32 + ;; + esac + ;; +esac + +fi + +to_tool_file_cmd=$lt_cv_to_tool_file_cmd +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_to_tool_file_cmd" >&5 +$as_echo "$lt_cv_to_tool_file_cmd" >&6; } + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $LD option to reload object files" >&5 +$as_echo_n "checking for $LD option to reload object files... 
" >&6; } +if ${lt_cv_ld_reload_flag+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_ld_reload_flag='-r' +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_ld_reload_flag" >&5 +$as_echo "$lt_cv_ld_reload_flag" >&6; } +reload_flag=$lt_cv_ld_reload_flag +case $reload_flag in +"" | " "*) ;; +*) reload_flag=" $reload_flag" ;; +esac +reload_cmds='$LD$reload_flag -o $output$reload_objs' +case $host_os in + cygwin* | mingw* | pw32* | cegcc*) + if test yes != "$GCC"; then + reload_cmds=false + fi + ;; + darwin*) + if test yes = "$GCC"; then + reload_cmds='$LTCC $LTCFLAGS -nostdlib $wl-r -o $output$reload_objs' + else + reload_cmds='$LD$reload_flag -o $output$reload_objs' + fi + ;; +esac + + + + + + + + + +if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}objdump", so it can be a program name with args. +set dummy ${ac_tool_prefix}objdump; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_OBJDUMP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$OBJDUMP"; then + ac_cv_prog_OBJDUMP="$OBJDUMP" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_OBJDUMP="${ac_tool_prefix}objdump" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +OBJDUMP=$ac_cv_prog_OBJDUMP +if test -n "$OBJDUMP"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $OBJDUMP" >&5 +$as_echo "$OBJDUMP" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_OBJDUMP"; then + ac_ct_OBJDUMP=$OBJDUMP + # Extract the first word of "objdump", so it can be a program name with args. +set dummy objdump; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_OBJDUMP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_OBJDUMP"; then + ac_cv_prog_ac_ct_OBJDUMP="$ac_ct_OBJDUMP" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_OBJDUMP="objdump" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_OBJDUMP=$ac_cv_prog_ac_ct_OBJDUMP +if test -n "$ac_ct_OBJDUMP"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_OBJDUMP" >&5 +$as_echo "$ac_ct_OBJDUMP" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_OBJDUMP" = x; then + OBJDUMP="false" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + OBJDUMP=$ac_ct_OBJDUMP + fi +else + OBJDUMP="$ac_cv_prog_OBJDUMP" +fi + +test -z "$OBJDUMP" && OBJDUMP=objdump + + + + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking how to recognize dependent libraries" >&5 +$as_echo_n "checking how to recognize dependent libraries... " >&6; } +if ${lt_cv_deplibs_check_method+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_file_magic_cmd='$MAGIC_CMD' +lt_cv_file_magic_test_file= +lt_cv_deplibs_check_method='unknown' +# Need to set the preceding variable on all platforms that support +# interlibrary dependencies. +# 'none' -- dependencies not supported. +# 'unknown' -- same as none, but documents that we really don't know. +# 'pass_all' -- all dependencies passed with no checks. +# 'test_compile' -- check by making test program. +# 'file_magic [[regex]]' -- check by looking for files in library path +# that responds to the $file_magic_cmd with a given extended regex. +# If you have 'file' or equivalent on your system and you're not sure +# whether 'pass_all' will *always* work, you probably want this one. + +case $host_os in +aix[4-9]*) + lt_cv_deplibs_check_method=pass_all + ;; + +beos*) + lt_cv_deplibs_check_method=pass_all + ;; + +bsdi[45]*) + lt_cv_deplibs_check_method='file_magic ELF [0-9][0-9]*-bit [ML]SB (shared object|dynamic lib)' + lt_cv_file_magic_cmd='/usr/bin/file -L' + lt_cv_file_magic_test_file=/shlib/libc.so + ;; + +cygwin*) + # func_win32_libid is a shell function defined in ltmain.sh + lt_cv_deplibs_check_method='file_magic ^x86 archive import|^x86 DLL' + lt_cv_file_magic_cmd='func_win32_libid' + ;; + +mingw* | pw32*) + # Base MSYS/MinGW do not provide the 'file' command needed by + # func_win32_libid shell function, so use a weaker test based on 'objdump', + # unless we find 'file', for example because we are cross-compiling. + if ( file / ) >/dev/null 2>&1; then + lt_cv_deplibs_check_method='file_magic ^x86 archive import|^x86 DLL' + lt_cv_file_magic_cmd='func_win32_libid' + else + # Keep this pattern in sync with the one in func_win32_libid. + lt_cv_deplibs_check_method='file_magic file format (pei*-i386(.*architecture: i386)?|pe-arm-wince|pe-x86-64)' + lt_cv_file_magic_cmd='$OBJDUMP -f' + fi + ;; + +cegcc*) + # use the weaker test based on 'objdump'. See mingw*. + lt_cv_deplibs_check_method='file_magic file format pe-arm-.*little(.*architecture: arm)?' 
+ lt_cv_file_magic_cmd='$OBJDUMP -f' + ;; + +darwin* | rhapsody*) + lt_cv_deplibs_check_method=pass_all + ;; + +freebsd* | dragonfly*) + if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then + case $host_cpu in + i*86 ) + # Not sure whether the presence of OpenBSD here was a mistake. + # Let's accept both of them until this is cleared up. + lt_cv_deplibs_check_method='file_magic (FreeBSD|OpenBSD|DragonFly)/i[3-9]86 (compact )?demand paged shared library' + lt_cv_file_magic_cmd=/usr/bin/file + lt_cv_file_magic_test_file=`echo /usr/lib/libc.so.*` + ;; + esac + else + lt_cv_deplibs_check_method=pass_all + fi + ;; + +haiku*) + lt_cv_deplibs_check_method=pass_all + ;; + +hpux10.20* | hpux11*) + lt_cv_file_magic_cmd=/usr/bin/file + case $host_cpu in + ia64*) + lt_cv_deplibs_check_method='file_magic (s[0-9][0-9][0-9]|ELF-[0-9][0-9]) shared object file - IA64' + lt_cv_file_magic_test_file=/usr/lib/hpux32/libc.so + ;; + hppa*64*) + lt_cv_deplibs_check_method='file_magic (s[0-9][0-9][0-9]|ELF[ -][0-9][0-9])(-bit)?( [LM]SB)? shared object( file)?[, -]* PA-RISC [0-9]\.[0-9]' + lt_cv_file_magic_test_file=/usr/lib/pa20_64/libc.sl + ;; + *) + lt_cv_deplibs_check_method='file_magic (s[0-9][0-9][0-9]|PA-RISC[0-9]\.[0-9]) shared library' + lt_cv_file_magic_test_file=/usr/lib/libc.sl + ;; + esac + ;; + +interix[3-9]*) + # PIC code is broken on Interix 3.x, that's why |\.a not |_pic\.a here + lt_cv_deplibs_check_method='match_pattern /lib[^/]+(\.so|\.a)$' + ;; + +irix5* | irix6* | nonstopux*) + case $LD in + *-32|*"-32 ") libmagic=32-bit;; + *-n32|*"-n32 ") libmagic=N32;; + *-64|*"-64 ") libmagic=64-bit;; + *) libmagic=never-match;; + esac + lt_cv_deplibs_check_method=pass_all + ;; + +# This must be glibc/ELF. +linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + lt_cv_deplibs_check_method=pass_all + ;; + +netbsd* | netbsdelf*-gnu) + if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then + lt_cv_deplibs_check_method='match_pattern /lib[^/]+(\.so\.[0-9]+\.[0-9]+|_pic\.a)$' + else + lt_cv_deplibs_check_method='match_pattern /lib[^/]+(\.so|_pic\.a)$' + fi + ;; + +newos6*) + lt_cv_deplibs_check_method='file_magic ELF [0-9][0-9]*-bit [ML]SB (executable|dynamic lib)' + lt_cv_file_magic_cmd=/usr/bin/file + lt_cv_file_magic_test_file=/usr/lib/libnls.so + ;; + +*nto* | *qnx*) + lt_cv_deplibs_check_method=pass_all + ;; + +openbsd* | bitrig*) + if test -z "`echo __ELF__ | $CC -E - | $GREP __ELF__`"; then + lt_cv_deplibs_check_method='match_pattern /lib[^/]+(\.so\.[0-9]+\.[0-9]+|\.so|_pic\.a)$' + else + lt_cv_deplibs_check_method='match_pattern /lib[^/]+(\.so\.[0-9]+\.[0-9]+|_pic\.a)$' + fi + ;; + +osf3* | osf4* | osf5*) + lt_cv_deplibs_check_method=pass_all + ;; + +rdos*) + lt_cv_deplibs_check_method=pass_all + ;; + +solaris*) + lt_cv_deplibs_check_method=pass_all + ;; + +sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX* | sysv4*uw2*) + lt_cv_deplibs_check_method=pass_all + ;; + +sysv4 | sysv4.3*) + case $host_vendor in + motorola) + lt_cv_deplibs_check_method='file_magic ELF [0-9][0-9]*-bit [ML]SB (shared object|dynamic lib) M[0-9][0-9]* Version [0-9]' + lt_cv_file_magic_test_file=`echo /usr/lib/libc.so*` + ;; + ncr) + lt_cv_deplibs_check_method=pass_all + ;; + sequent) + lt_cv_file_magic_cmd='/bin/file' + lt_cv_deplibs_check_method='file_magic ELF [0-9][0-9]*-bit [LM]SB (shared object|dynamic lib )' + ;; + sni) + lt_cv_file_magic_cmd='/bin/file' + lt_cv_deplibs_check_method="file_magic ELF [0-9][0-9]*-bit [LM]SB dynamic lib" + lt_cv_file_magic_test_file=/lib/libc.so + ;; + siemens) + 
lt_cv_deplibs_check_method=pass_all + ;; + pc) + lt_cv_deplibs_check_method=pass_all + ;; + esac + ;; + +tpf*) + lt_cv_deplibs_check_method=pass_all + ;; +os2*) + lt_cv_deplibs_check_method=pass_all + ;; +esac + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_deplibs_check_method" >&5 +$as_echo "$lt_cv_deplibs_check_method" >&6; } + +file_magic_glob= +want_nocaseglob=no +if test "$build" = "$host"; then + case $host_os in + mingw* | pw32*) + if ( shopt | grep nocaseglob ) >/dev/null 2>&1; then + want_nocaseglob=yes + else + file_magic_glob=`echo aAbBcCdDeEfFgGhHiIjJkKlLmMnNoOpPqQrRsStTuUvVwWxXyYzZ | $SED -e "s/\(..\)/s\/[\1]\/[\1]\/g;/g"` + fi + ;; + esac +fi + +file_magic_cmd=$lt_cv_file_magic_cmd +deplibs_check_method=$lt_cv_deplibs_check_method +test -z "$deplibs_check_method" && deplibs_check_method=unknown + + + + + + + + + + + + + + + + + + + + + + +if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}dlltool", so it can be a program name with args. +set dummy ${ac_tool_prefix}dlltool; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_DLLTOOL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$DLLTOOL"; then + ac_cv_prog_DLLTOOL="$DLLTOOL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_DLLTOOL="${ac_tool_prefix}dlltool" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +DLLTOOL=$ac_cv_prog_DLLTOOL +if test -n "$DLLTOOL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $DLLTOOL" >&5 +$as_echo "$DLLTOOL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_DLLTOOL"; then + ac_ct_DLLTOOL=$DLLTOOL + # Extract the first word of "dlltool", so it can be a program name with args. +set dummy dlltool; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_DLLTOOL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_DLLTOOL"; then + ac_cv_prog_ac_ct_DLLTOOL="$ac_ct_DLLTOOL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_DLLTOOL="dlltool" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_DLLTOOL=$ac_cv_prog_ac_ct_DLLTOOL +if test -n "$ac_ct_DLLTOOL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_DLLTOOL" >&5 +$as_echo "$ac_ct_DLLTOOL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_DLLTOOL" = x; then + DLLTOOL="false" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + DLLTOOL=$ac_ct_DLLTOOL + fi +else + DLLTOOL="$ac_cv_prog_DLLTOOL" +fi + +test -z "$DLLTOOL" && DLLTOOL=dlltool + + + + + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking how to associate runtime and link libraries" >&5 +$as_echo_n "checking how to associate runtime and link libraries... " >&6; } +if ${lt_cv_sharedlib_from_linklib_cmd+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_sharedlib_from_linklib_cmd='unknown' + +case $host_os in +cygwin* | mingw* | pw32* | cegcc*) + # two different shell functions defined in ltmain.sh; + # decide which one to use based on capabilities of $DLLTOOL + case `$DLLTOOL --help 2>&1` in + *--identify-strict*) + lt_cv_sharedlib_from_linklib_cmd=func_cygming_dll_for_implib + ;; + *) + lt_cv_sharedlib_from_linklib_cmd=func_cygming_dll_for_implib_fallback + ;; + esac + ;; +*) + # fallback: assume linklib IS sharedlib + lt_cv_sharedlib_from_linklib_cmd=$ECHO + ;; +esac + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_sharedlib_from_linklib_cmd" >&5 +$as_echo "$lt_cv_sharedlib_from_linklib_cmd" >&6; } +sharedlib_from_linklib_cmd=$lt_cv_sharedlib_from_linklib_cmd +test -z "$sharedlib_from_linklib_cmd" && sharedlib_from_linklib_cmd=$ECHO + + + + + + + +if test -n "$ac_tool_prefix"; then + for ac_prog in ar + do + # Extract the first word of "$ac_tool_prefix$ac_prog", so it can be a program name with args. +set dummy $ac_tool_prefix$ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_AR+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$AR"; then + ac_cv_prog_AR="$AR" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_AR="$ac_tool_prefix$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +AR=$ac_cv_prog_AR +if test -n "$AR"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $AR" >&5 +$as_echo "$AR" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$AR" && break + done +fi +if test -z "$AR"; then + ac_ct_AR=$AR + for ac_prog in ar +do + # Extract the first word of "$ac_prog", so it can be a program name with args. 
+set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_AR+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_AR"; then + ac_cv_prog_ac_ct_AR="$ac_ct_AR" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_AR="$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_AR=$ac_cv_prog_ac_ct_AR +if test -n "$ac_ct_AR"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_AR" >&5 +$as_echo "$ac_ct_AR" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$ac_ct_AR" && break +done + + if test "x$ac_ct_AR" = x; then + AR="false" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + AR=$ac_ct_AR + fi +fi + +: ${AR=ar} +: ${AR_FLAGS=cru} + + + + + + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for archiver @FILE support" >&5 +$as_echo_n "checking for archiver @FILE support... " >&6; } +if ${lt_cv_ar_at_file+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_ar_at_file=no + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + echo conftest.$ac_objext > conftest.lst + lt_ar_try='$AR $AR_FLAGS libconftest.a @conftest.lst >&5' + { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$lt_ar_try\""; } >&5 + (eval $lt_ar_try) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + if test 0 -eq "$ac_status"; then + # Ensure the archiver fails upon bogus file names. + rm -f conftest.$ac_objext libconftest.a + { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$lt_ar_try\""; } >&5 + (eval $lt_ar_try) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + if test 0 -ne "$ac_status"; then + lt_cv_ar_at_file=@ + fi + fi + rm -f conftest.* libconftest.a + +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_ar_at_file" >&5 +$as_echo "$lt_cv_ar_at_file" >&6; } + +if test no = "$lt_cv_ar_at_file"; then + archiver_list_spec= +else + archiver_list_spec=$lt_cv_ar_at_file +fi + + + + + + + +if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}strip", so it can be a program name with args. +set dummy ${ac_tool_prefix}strip; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_STRIP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$STRIP"; then + ac_cv_prog_STRIP="$STRIP" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_STRIP="${ac_tool_prefix}strip" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +STRIP=$ac_cv_prog_STRIP +if test -n "$STRIP"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $STRIP" >&5 +$as_echo "$STRIP" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_STRIP"; then + ac_ct_STRIP=$STRIP + # Extract the first word of "strip", so it can be a program name with args. +set dummy strip; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_STRIP+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_STRIP"; then + ac_cv_prog_ac_ct_STRIP="$ac_ct_STRIP" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_STRIP="strip" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_STRIP=$ac_cv_prog_ac_ct_STRIP +if test -n "$ac_ct_STRIP"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_STRIP" >&5 +$as_echo "$ac_ct_STRIP" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_STRIP" = x; then + STRIP=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + STRIP=$ac_ct_STRIP + fi +else + STRIP="$ac_cv_prog_STRIP" +fi + +test -z "$STRIP" && STRIP=: + + + + + + +if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}ranlib", so it can be a program name with args. +set dummy ${ac_tool_prefix}ranlib; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_RANLIB+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$RANLIB"; then + ac_cv_prog_RANLIB="$RANLIB" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_RANLIB="${ac_tool_prefix}ranlib" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +RANLIB=$ac_cv_prog_RANLIB +if test -n "$RANLIB"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $RANLIB" >&5 +$as_echo "$RANLIB" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_RANLIB"; then + ac_ct_RANLIB=$RANLIB + # Extract the first word of "ranlib", so it can be a program name with args. +set dummy ranlib; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... 
" >&6; } +if ${ac_cv_prog_ac_ct_RANLIB+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_RANLIB"; then + ac_cv_prog_ac_ct_RANLIB="$ac_ct_RANLIB" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_RANLIB="ranlib" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_RANLIB=$ac_cv_prog_ac_ct_RANLIB +if test -n "$ac_ct_RANLIB"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_RANLIB" >&5 +$as_echo "$ac_ct_RANLIB" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_RANLIB" = x; then + RANLIB=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + RANLIB=$ac_ct_RANLIB + fi +else + RANLIB="$ac_cv_prog_RANLIB" +fi + +test -z "$RANLIB" && RANLIB=: + + + + + + +# Determine commands to create old-style static archives. +old_archive_cmds='$AR $AR_FLAGS $oldlib$oldobjs' +old_postinstall_cmds='chmod 644 $oldlib' +old_postuninstall_cmds= + +if test -n "$RANLIB"; then + case $host_os in + bitrig* | openbsd*) + old_postinstall_cmds="$old_postinstall_cmds~\$RANLIB -t \$tool_oldlib" + ;; + *) + old_postinstall_cmds="$old_postinstall_cmds~\$RANLIB \$tool_oldlib" + ;; + esac + old_archive_cmds="$old_archive_cmds~\$RANLIB \$tool_oldlib" +fi + +case $host_os in + darwin*) + lock_old_archive_extraction=yes ;; + *) + lock_old_archive_extraction=no ;; +esac + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +# If no C compiler was specified, use CC. +LTCC=${LTCC-"$CC"} + +# If no C compiler flags were specified, use CFLAGS. +LTCFLAGS=${LTCFLAGS-"$CFLAGS"} + +# Allow CC to be a program name with arguments. +compiler=$CC + + +# Check for command to grab the raw symbol name followed by C symbol from nm. +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking command to parse $NM output from $compiler object" >&5 +$as_echo_n "checking command to parse $NM output from $compiler object... " >&6; } +if ${lt_cv_sys_global_symbol_pipe+:} false; then : + $as_echo_n "(cached) " >&6 +else + +# These are sane defaults that work on at least a few old systems. +# [They come from Ultrix. What could be older than Ultrix?!! ;)] + +# Character class describing NM global symbol codes. +symcode='[BCDEGRST]' + +# Regexp to match symbols that can be accessed directly from C. +sympat='\([_A-Za-z][_A-Za-z0-9]*\)' + +# Define system-specific variables. +case $host_os in +aix*) + symcode='[BCDT]' + ;; +cygwin* | mingw* | pw32* | cegcc*) + symcode='[ABCDGISTW]' + ;; +hpux*) + if test ia64 = "$host_cpu"; then + symcode='[ABCDEGRST]' + fi + ;; +irix* | nonstopux*) + symcode='[BCDEGRST]' + ;; +osf*) + symcode='[BCDEGQRST]' + ;; +solaris*) + symcode='[BDRT]' + ;; +sco3.2v5*) + symcode='[DT]' + ;; +sysv4.2uw2*) + symcode='[DT]' + ;; +sysv5* | sco5v6* | unixware* | OpenUNIX*) + symcode='[ABDT]' + ;; +sysv4) + symcode='[DFNSTU]' + ;; +esac + +# If we're using GNU nm, then use its standard symbol codes. 
+case `$NM -V 2>&1` in +*GNU* | *'with BFD'*) + symcode='[ABCDGIRSTW]' ;; +esac + +if test "$lt_cv_nm_interface" = "MS dumpbin"; then + # Gets list of data symbols to import. + lt_cv_sys_global_symbol_to_import="sed -n -e 's/^I .* \(.*\)$/\1/p'" + # Adjust the below global symbol transforms to fixup imported variables. + lt_cdecl_hook=" -e 's/^I .* \(.*\)$/extern __declspec(dllimport) char \1;/p'" + lt_c_name_hook=" -e 's/^I .* \(.*\)$/ {\"\1\", (void *) 0},/p'" + lt_c_name_lib_hook="\ + -e 's/^I .* \(lib.*\)$/ {\"\1\", (void *) 0},/p'\ + -e 's/^I .* \(.*\)$/ {\"lib\1\", (void *) 0},/p'" +else + # Disable hooks by default. + lt_cv_sys_global_symbol_to_import= + lt_cdecl_hook= + lt_c_name_hook= + lt_c_name_lib_hook= +fi + +# Transform an extracted symbol line into a proper C declaration. +# Some systems (esp. on ia64) link data and code symbols differently, +# so use this general approach. +lt_cv_sys_global_symbol_to_cdecl="sed -n"\ +$lt_cdecl_hook\ +" -e 's/^T .* \(.*\)$/extern int \1();/p'"\ +" -e 's/^$symcode$symcode* .* \(.*\)$/extern char \1;/p'" + +# Transform an extracted symbol line into symbol name and symbol address +lt_cv_sys_global_symbol_to_c_name_address="sed -n"\ +$lt_c_name_hook\ +" -e 's/^: \(.*\) .*$/ {\"\1\", (void *) 0},/p'"\ +" -e 's/^$symcode$symcode* .* \(.*\)$/ {\"\1\", (void *) \&\1},/p'" + +# Transform an extracted symbol line into symbol name with lib prefix and +# symbol address. +lt_cv_sys_global_symbol_to_c_name_address_lib_prefix="sed -n"\ +$lt_c_name_lib_hook\ +" -e 's/^: \(.*\) .*$/ {\"\1\", (void *) 0},/p'"\ +" -e 's/^$symcode$symcode* .* \(lib.*\)$/ {\"\1\", (void *) \&\1},/p'"\ +" -e 's/^$symcode$symcode* .* \(.*\)$/ {\"lib\1\", (void *) \&\1},/p'" + +# Handle CRLF in mingw tool chain +opt_cr= +case $build_os in +mingw*) + opt_cr=`$ECHO 'x\{0,1\}' | tr x '\015'` # option cr in regexp + ;; +esac + +# Try without a prefix underscore, then with it. +for ac_symprfx in "" "_"; do + + # Transform symcode, sympat, and symprfx into a raw symbol and a C symbol. + symxfrm="\\1 $ac_symprfx\\2 \\2" + + # Write the raw and C identifiers. + if test "$lt_cv_nm_interface" = "MS dumpbin"; then + # Fake it for dumpbin and say T for any non-static function, + # D for any global variable and I for any imported variable. + # Also find C++ and __fastcall symbols from MSVC++, + # which start with @ or ?. + lt_cv_sys_global_symbol_pipe="$AWK '"\ +" {last_section=section; section=\$ 3};"\ +" /^COFF SYMBOL TABLE/{for(i in hide) delete hide[i]};"\ +" /Section length .*#relocs.*(pick any)/{hide[last_section]=1};"\ +" /^ *Symbol name *: /{split(\$ 0,sn,\":\"); si=substr(sn[2],2)};"\ +" /^ *Type *: code/{print \"T\",si,substr(si,length(prfx))};"\ +" /^ *Type *: data/{print \"I\",si,substr(si,length(prfx))};"\ +" \$ 0!~/External *\|/{next};"\ +" / 0+ UNDEF /{next}; / UNDEF \([^|]\)*()/{next};"\ +" {if(hide[section]) next};"\ +" {f=\"D\"}; \$ 0~/\(\).*\|/{f=\"T\"};"\ +" {split(\$ 0,a,/\||\r/); split(a[2],s)};"\ +" s[1]~/^[@?]/{print f,s[1],s[1]; next};"\ +" s[1]~prfx {split(s[1],t,\"@\"); print f,t[1],substr(t[1],length(prfx))}"\ +" ' prfx=^$ac_symprfx" + else + lt_cv_sys_global_symbol_pipe="sed -n -e 's/^.*[ ]\($symcode$symcode*\)[ ][ ]*$ac_symprfx$sympat$opt_cr$/$symxfrm/p'" + fi + lt_cv_sys_global_symbol_pipe="$lt_cv_sys_global_symbol_pipe | sed '/ __gnu_lto/d'" + + # Check to see that the pipe works correctly. 
+ pipe_works=no + + rm -f conftest* + cat > conftest.$ac_ext <<_LT_EOF +#ifdef __cplusplus +extern "C" { +#endif +char nm_test_var; +void nm_test_func(void); +void nm_test_func(void){} +#ifdef __cplusplus +} +#endif +int main(){nm_test_var='a';nm_test_func();return(0);} +_LT_EOF + + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_compile\""; } >&5 + (eval $ac_compile) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + # Now try to grab the symbols. + nlist=conftest.nm + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist\""; } >&5 + (eval $NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } && test -s "$nlist"; then + # Try sorting and uniquifying the output. + if sort "$nlist" | uniq > "$nlist"T; then + mv -f "$nlist"T "$nlist" + else + rm -f "$nlist"T + fi + + # Make sure that we snagged all the symbols we need. + if $GREP ' nm_test_var$' "$nlist" >/dev/null; then + if $GREP ' nm_test_func$' "$nlist" >/dev/null; then + cat <<_LT_EOF > conftest.$ac_ext +/* Keep this code in sync between libtool.m4, ltmain, lt_system.h, and tests. */ +#if defined _WIN32 || defined __CYGWIN__ || defined _WIN32_WCE +/* DATA imports from DLLs on WIN32 can't be const, because runtime + relocations are performed -- see ld's documentation on pseudo-relocs. */ +# define LT_DLSYM_CONST +#elif defined __osf__ +/* This system does not cope well with relocations in const data. */ +# define LT_DLSYM_CONST +#else +# define LT_DLSYM_CONST const +#endif + +#ifdef __cplusplus +extern "C" { +#endif + +_LT_EOF + # Now generate the symbol file. + eval "$lt_cv_sys_global_symbol_to_cdecl"' < "$nlist" | $GREP -v main >> conftest.$ac_ext' + + cat <<_LT_EOF >> conftest.$ac_ext + +/* The mapping between symbol names and symbols. */ +LT_DLSYM_CONST struct { + const char *name; + void *address; +} +lt__PROGRAM__LTX_preloaded_symbols[] = +{ + { "@PROGRAM@", (void *) 0 }, +_LT_EOF + $SED "s/^$symcode$symcode* .* \(.*\)$/ {\"\1\", (void *) \&\1},/" < "$nlist" | $GREP -v main >> conftest.$ac_ext + cat <<\_LT_EOF >> conftest.$ac_ext + {0, (void *) 0} +}; + +/* This works around a problem in FreeBSD linker */ +#ifdef FREEBSD_WORKAROUND +static const void *lt_preloaded_setup() { + return lt__PROGRAM__LTX_preloaded_symbols; +} +#endif + +#ifdef __cplusplus +} +#endif +_LT_EOF + # Now try linking the two files. + mv conftest.$ac_objext conftstm.$ac_objext + lt_globsym_save_LIBS=$LIBS + lt_globsym_save_CFLAGS=$CFLAGS + LIBS=conftstm.$ac_objext + CFLAGS="$CFLAGS$lt_prog_compiler_no_builtin_flag" + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_link\""; } >&5 + (eval $ac_link) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } && test -s conftest$ac_exeext; then + pipe_works=yes + fi + LIBS=$lt_globsym_save_LIBS + CFLAGS=$lt_globsym_save_CFLAGS + else + echo "cannot find nm_test_func in $nlist" >&5 + fi + else + echo "cannot find nm_test_var in $nlist" >&5 + fi + else + echo "cannot run $lt_cv_sys_global_symbol_pipe" >&5 + fi + else + echo "$progname: failed program was:" >&5 + cat conftest.$ac_ext >&5 + fi + rm -rf conftest* conftst* + + # Do not use the global_symbol_pipe unless it works. 
+ if test yes = "$pipe_works"; then + break + else + lt_cv_sys_global_symbol_pipe= + fi +done + +fi + +if test -z "$lt_cv_sys_global_symbol_pipe"; then + lt_cv_sys_global_symbol_to_cdecl= +fi +if test -z "$lt_cv_sys_global_symbol_pipe$lt_cv_sys_global_symbol_to_cdecl"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: failed" >&5 +$as_echo "failed" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: ok" >&5 +$as_echo "ok" >&6; } +fi + +# Response file support. +if test "$lt_cv_nm_interface" = "MS dumpbin"; then + nm_file_list_spec='@' +elif $NM --help 2>/dev/null | grep '[@]FILE' >/dev/null; then + nm_file_list_spec='@' +fi + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for sysroot" >&5 +$as_echo_n "checking for sysroot... " >&6; } + +# Check whether --with-sysroot was given. +if test "${with_sysroot+set}" = set; then : + withval=$with_sysroot; +else + with_sysroot=no +fi + + +lt_sysroot= +case $with_sysroot in #( + yes) + if test yes = "$GCC"; then + lt_sysroot=`$CC --print-sysroot 2>/dev/null` + fi + ;; #( + /*) + lt_sysroot=`echo "$with_sysroot" | sed -e "$sed_quote_subst"` + ;; #( + no|'') + ;; #( + *) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $with_sysroot" >&5 +$as_echo "$with_sysroot" >&6; } + as_fn_error $? "The sysroot must be an absolute path." "$LINENO" 5 + ;; +esac + + { $as_echo "$as_me:${as_lineno-$LINENO}: result: ${lt_sysroot:-no}" >&5 +$as_echo "${lt_sysroot:-no}" >&6; } + + + + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for a working dd" >&5 +$as_echo_n "checking for a working dd... " >&6; } +if ${ac_cv_path_lt_DD+:} false; then : + $as_echo_n "(cached) " >&6 +else + printf 0123456789abcdef0123456789abcdef >conftest.i +cat conftest.i conftest.i >conftest2.i +: ${lt_DD:=$DD} +if test -z "$lt_DD"; then + ac_path_lt_DD_found=false + # Loop through the user's path and test for each of PROGNAME-LIST + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_prog in dd; do + for ac_exec_ext in '' $ac_executable_extensions; do + ac_path_lt_DD="$as_dir/$ac_prog$ac_exec_ext" + as_fn_executable_p "$ac_path_lt_DD" || continue +if "$ac_path_lt_DD" bs=32 count=1 conftest.out 2>/dev/null; then + cmp -s conftest.i conftest.out \ + && ac_cv_path_lt_DD="$ac_path_lt_DD" ac_path_lt_DD_found=: +fi + $ac_path_lt_DD_found && break 3 + done + done + done +IFS=$as_save_IFS + if test -z "$ac_cv_path_lt_DD"; then + : + fi +else + ac_cv_path_lt_DD=$lt_DD +fi + +rm -f conftest.i conftest2.i conftest.out +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_path_lt_DD" >&5 +$as_echo "$ac_cv_path_lt_DD" >&6; } + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking how to truncate binary pipes" >&5 +$as_echo_n "checking how to truncate binary pipes... " >&6; } +if ${lt_cv_truncate_bin+:} false; then : + $as_echo_n "(cached) " >&6 +else + printf 0123456789abcdef0123456789abcdef >conftest.i +cat conftest.i conftest.i >conftest2.i +lt_cv_truncate_bin= +if "$ac_cv_path_lt_DD" bs=32 count=1 conftest.out 2>/dev/null; then + cmp -s conftest.i conftest.out \ + && lt_cv_truncate_bin="$ac_cv_path_lt_DD bs=4096 count=1" +fi +rm -f conftest.i conftest2.i conftest.out +test -z "$lt_cv_truncate_bin" && lt_cv_truncate_bin="$SED -e 4q" +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_truncate_bin" >&5 +$as_echo "$lt_cv_truncate_bin" >&6; } + + + + + + + +# Calculate cc_basename. 
Skip known compiler wrappers and cross-prefix. +func_cc_basename () +{ + for cc_temp in $*""; do + case $cc_temp in + compile | *[\\/]compile | ccache | *[\\/]ccache ) ;; + distcc | *[\\/]distcc | purify | *[\\/]purify ) ;; + \-*) ;; + *) break;; + esac + done + func_cc_basename_result=`$ECHO "$cc_temp" | $SED "s%.*/%%; s%^$host_alias-%%"` +} + +# Check whether --enable-libtool-lock was given. +if test "${enable_libtool_lock+set}" = set; then : + enableval=$enable_libtool_lock; +fi + +test no = "$enable_libtool_lock" || enable_libtool_lock=yes + +# Some flags need to be propagated to the compiler or linker for good +# libtool support. +case $host in +ia64-*-hpux*) + # Find out what ABI is being produced by ac_compile, and set mode + # options accordingly. + echo 'int i;' > conftest.$ac_ext + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_compile\""; } >&5 + (eval $ac_compile) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + case `/usr/bin/file conftest.$ac_objext` in + *ELF-32*) + HPUX_IA64_MODE=32 + ;; + *ELF-64*) + HPUX_IA64_MODE=64 + ;; + esac + fi + rm -rf conftest* + ;; +*-*-irix6*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. + echo '#line '$LINENO' "configure"' > conftest.$ac_ext + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_compile\""; } >&5 + (eval $ac_compile) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + if test yes = "$lt_cv_prog_gnu_ld"; then + case `/usr/bin/file conftest.$ac_objext` in + *32-bit*) + LD="${LD-ld} -melf32bsmip" + ;; + *N32*) + LD="${LD-ld} -melf32bmipn32" + ;; + *64-bit*) + LD="${LD-ld} -melf64bmip" + ;; + esac + else + case `/usr/bin/file conftest.$ac_objext` in + *32-bit*) + LD="${LD-ld} -32" + ;; + *N32*) + LD="${LD-ld} -n32" + ;; + *64-bit*) + LD="${LD-ld} -64" + ;; + esac + fi + fi + rm -rf conftest* + ;; + +mips64*-*linux*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. + echo '#line '$LINENO' "configure"' > conftest.$ac_ext + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_compile\""; } >&5 + (eval $ac_compile) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + emul=elf + case `/usr/bin/file conftest.$ac_objext` in + *32-bit*) + emul="${emul}32" + ;; + *64-bit*) + emul="${emul}64" + ;; + esac + case `/usr/bin/file conftest.$ac_objext` in + *MSB*) + emul="${emul}btsmip" + ;; + *LSB*) + emul="${emul}ltsmip" + ;; + esac + case `/usr/bin/file conftest.$ac_objext` in + *N32*) + emul="${emul}n32" + ;; + esac + LD="${LD-ld} -m $emul" + fi + rm -rf conftest* + ;; + +x86_64-*kfreebsd*-gnu|x86_64-*linux*|powerpc*-*linux*| \ +s390*-*linux*|s390*-*tpf*|sparc*-*linux*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. Note that the listed cases only cover the + # situations where additional linker options are needed (such as when + # doing 32-bit compilation for a host where ld defaults to 64-bit, or + # vice versa); the common cases where no linker options are needed do + # not appear in the list. + echo 'int i;' > conftest.$ac_ext + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_compile\""; } >&5 + (eval $ac_compile) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5 + test $ac_status = 0; }; then + case `/usr/bin/file conftest.o` in + *32-bit*) + case $host in + x86_64-*kfreebsd*-gnu) + LD="${LD-ld} -m elf_i386_fbsd" + ;; + x86_64-*linux*) + case `/usr/bin/file conftest.o` in + *x86-64*) + LD="${LD-ld} -m elf32_x86_64" + ;; + *) + LD="${LD-ld} -m elf_i386" + ;; + esac + ;; + powerpc64le-*linux*) + LD="${LD-ld} -m elf32lppclinux" + ;; + powerpc64-*linux*) + LD="${LD-ld} -m elf32ppclinux" + ;; + s390x-*linux*) + LD="${LD-ld} -m elf_s390" + ;; + sparc64-*linux*) + LD="${LD-ld} -m elf32_sparc" + ;; + esac + ;; + *64-bit*) + case $host in + x86_64-*kfreebsd*-gnu) + LD="${LD-ld} -m elf_x86_64_fbsd" + ;; + x86_64-*linux*) + LD="${LD-ld} -m elf_x86_64" + ;; + powerpcle-*linux*) + LD="${LD-ld} -m elf64lppc" + ;; + powerpc-*linux*) + LD="${LD-ld} -m elf64ppc" + ;; + s390*-*linux*|s390*-*tpf*) + LD="${LD-ld} -m elf64_s390" + ;; + sparc*-*linux*) + LD="${LD-ld} -m elf64_sparc" + ;; + esac + ;; + esac + fi + rm -rf conftest* + ;; + +*-*-sco3.2v5*) + # On SCO OpenServer 5, we need -belf to get full-featured binaries. + SAVE_CFLAGS=$CFLAGS + CFLAGS="$CFLAGS -belf" + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether the C compiler needs -belf" >&5 +$as_echo_n "checking whether the C compiler needs -belf... " >&6; } +if ${lt_cv_cc_needs_belf+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + lt_cv_cc_needs_belf=yes +else + lt_cv_cc_needs_belf=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_cc_needs_belf" >&5 +$as_echo "$lt_cv_cc_needs_belf" >&6; } + if test yes != "$lt_cv_cc_needs_belf"; then + # this is probably gcc 2.8.0, egcs 1.0 or newer; no need for -belf + CFLAGS=$SAVE_CFLAGS + fi + ;; +*-*solaris*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. + echo 'int i;' > conftest.$ac_ext + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_compile\""; } >&5 + (eval $ac_compile) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + case `/usr/bin/file conftest.o` in + *64-bit*) + case $lt_cv_prog_gnu_ld in + yes*) + case $host in + i?86-*-solaris*|x86_64-*-solaris*) + LD="${LD-ld} -m elf_x86_64" + ;; + sparc*-*-solaris*) + LD="${LD-ld} -m elf64_sparc" + ;; + esac + # GNU ld 2.21 introduced _sol2 emulations. Use them if available. + if ${LD-ld} -V | grep _sol2 >/dev/null 2>&1; then + LD=${LD-ld}_sol2 + fi + ;; + *) + if ${LD-ld} -64 -r -o conftest2.o conftest.o >/dev/null 2>&1; then + LD="${LD-ld} -64" + fi + ;; + esac + ;; + esac + fi + rm -rf conftest* + ;; +esac + +need_locks=$enable_libtool_lock + +if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}mt", so it can be a program name with args. 
+set dummy ${ac_tool_prefix}mt; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_MANIFEST_TOOL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$MANIFEST_TOOL"; then + ac_cv_prog_MANIFEST_TOOL="$MANIFEST_TOOL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_MANIFEST_TOOL="${ac_tool_prefix}mt" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +MANIFEST_TOOL=$ac_cv_prog_MANIFEST_TOOL +if test -n "$MANIFEST_TOOL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $MANIFEST_TOOL" >&5 +$as_echo "$MANIFEST_TOOL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_MANIFEST_TOOL"; then + ac_ct_MANIFEST_TOOL=$MANIFEST_TOOL + # Extract the first word of "mt", so it can be a program name with args. +set dummy mt; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_MANIFEST_TOOL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_MANIFEST_TOOL"; then + ac_cv_prog_ac_ct_MANIFEST_TOOL="$ac_ct_MANIFEST_TOOL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_MANIFEST_TOOL="mt" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_MANIFEST_TOOL=$ac_cv_prog_ac_ct_MANIFEST_TOOL +if test -n "$ac_ct_MANIFEST_TOOL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_MANIFEST_TOOL" >&5 +$as_echo "$ac_ct_MANIFEST_TOOL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_MANIFEST_TOOL" = x; then + MANIFEST_TOOL=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + MANIFEST_TOOL=$ac_ct_MANIFEST_TOOL + fi +else + MANIFEST_TOOL="$ac_cv_prog_MANIFEST_TOOL" +fi + +test -z "$MANIFEST_TOOL" && MANIFEST_TOOL=mt +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking if $MANIFEST_TOOL is a manifest tool" >&5 +$as_echo_n "checking if $MANIFEST_TOOL is a manifest tool... " >&6; } +if ${lt_cv_path_mainfest_tool+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_path_mainfest_tool=no + echo "$as_me:$LINENO: $MANIFEST_TOOL '-?'" >&5 + $MANIFEST_TOOL '-?' 
2>conftest.err > conftest.out + cat conftest.err >&5 + if $GREP 'Manifest Tool' conftest.out > /dev/null; then + lt_cv_path_mainfest_tool=yes + fi + rm -f conftest* +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_path_mainfest_tool" >&5 +$as_echo "$lt_cv_path_mainfest_tool" >&6; } +if test yes != "$lt_cv_path_mainfest_tool"; then + MANIFEST_TOOL=: +fi + + + + + + + case $host_os in + rhapsody* | darwin*) + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}dsymutil", so it can be a program name with args. +set dummy ${ac_tool_prefix}dsymutil; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_DSYMUTIL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$DSYMUTIL"; then + ac_cv_prog_DSYMUTIL="$DSYMUTIL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_DSYMUTIL="${ac_tool_prefix}dsymutil" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +DSYMUTIL=$ac_cv_prog_DSYMUTIL +if test -n "$DSYMUTIL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $DSYMUTIL" >&5 +$as_echo "$DSYMUTIL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_DSYMUTIL"; then + ac_ct_DSYMUTIL=$DSYMUTIL + # Extract the first word of "dsymutil", so it can be a program name with args. +set dummy dsymutil; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_DSYMUTIL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_DSYMUTIL"; then + ac_cv_prog_ac_ct_DSYMUTIL="$ac_ct_DSYMUTIL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_DSYMUTIL="dsymutil" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_DSYMUTIL=$ac_cv_prog_ac_ct_DSYMUTIL +if test -n "$ac_ct_DSYMUTIL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_DSYMUTIL" >&5 +$as_echo "$ac_ct_DSYMUTIL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_DSYMUTIL" = x; then + DSYMUTIL=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + DSYMUTIL=$ac_ct_DSYMUTIL + fi +else + DSYMUTIL="$ac_cv_prog_DSYMUTIL" +fi + + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}nmedit", so it can be a program name with args. +set dummy ${ac_tool_prefix}nmedit; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... 
" >&6; } +if ${ac_cv_prog_NMEDIT+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$NMEDIT"; then + ac_cv_prog_NMEDIT="$NMEDIT" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_NMEDIT="${ac_tool_prefix}nmedit" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +NMEDIT=$ac_cv_prog_NMEDIT +if test -n "$NMEDIT"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $NMEDIT" >&5 +$as_echo "$NMEDIT" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_NMEDIT"; then + ac_ct_NMEDIT=$NMEDIT + # Extract the first word of "nmedit", so it can be a program name with args. +set dummy nmedit; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_NMEDIT+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_NMEDIT"; then + ac_cv_prog_ac_ct_NMEDIT="$ac_ct_NMEDIT" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_NMEDIT="nmedit" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_NMEDIT=$ac_cv_prog_ac_ct_NMEDIT +if test -n "$ac_ct_NMEDIT"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_NMEDIT" >&5 +$as_echo "$ac_ct_NMEDIT" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_NMEDIT" = x; then + NMEDIT=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + NMEDIT=$ac_ct_NMEDIT + fi +else + NMEDIT="$ac_cv_prog_NMEDIT" +fi + + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}lipo", so it can be a program name with args. +set dummy ${ac_tool_prefix}lipo; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_LIPO+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$LIPO"; then + ac_cv_prog_LIPO="$LIPO" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_LIPO="${ac_tool_prefix}lipo" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +LIPO=$ac_cv_prog_LIPO +if test -n "$LIPO"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $LIPO" >&5 +$as_echo "$LIPO" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_LIPO"; then + ac_ct_LIPO=$LIPO + # Extract the first word of "lipo", so it can be a program name with args. +set dummy lipo; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_LIPO+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_LIPO"; then + ac_cv_prog_ac_ct_LIPO="$ac_ct_LIPO" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_LIPO="lipo" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_LIPO=$ac_cv_prog_ac_ct_LIPO +if test -n "$ac_ct_LIPO"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_LIPO" >&5 +$as_echo "$ac_ct_LIPO" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_LIPO" = x; then + LIPO=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + LIPO=$ac_ct_LIPO + fi +else + LIPO="$ac_cv_prog_LIPO" +fi + + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}otool", so it can be a program name with args. +set dummy ${ac_tool_prefix}otool; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_OTOOL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$OTOOL"; then + ac_cv_prog_OTOOL="$OTOOL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_OTOOL="${ac_tool_prefix}otool" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +OTOOL=$ac_cv_prog_OTOOL +if test -n "$OTOOL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $OTOOL" >&5 +$as_echo "$OTOOL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_OTOOL"; then + ac_ct_OTOOL=$OTOOL + # Extract the first word of "otool", so it can be a program name with args. +set dummy otool; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... 
" >&6; } +if ${ac_cv_prog_ac_ct_OTOOL+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_OTOOL"; then + ac_cv_prog_ac_ct_OTOOL="$ac_ct_OTOOL" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_OTOOL="otool" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_OTOOL=$ac_cv_prog_ac_ct_OTOOL +if test -n "$ac_ct_OTOOL"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_OTOOL" >&5 +$as_echo "$ac_ct_OTOOL" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_OTOOL" = x; then + OTOOL=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + OTOOL=$ac_ct_OTOOL + fi +else + OTOOL="$ac_cv_prog_OTOOL" +fi + + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}otool64", so it can be a program name with args. +set dummy ${ac_tool_prefix}otool64; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_OTOOL64+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$OTOOL64"; then + ac_cv_prog_OTOOL64="$OTOOL64" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_OTOOL64="${ac_tool_prefix}otool64" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +OTOOL64=$ac_cv_prog_OTOOL64 +if test -n "$OTOOL64"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $OTOOL64" >&5 +$as_echo "$OTOOL64" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_prog_OTOOL64"; then + ac_ct_OTOOL64=$OTOOL64 + # Extract the first word of "otool64", so it can be a program name with args. +set dummy otool64; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_ac_ct_OTOOL64+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$ac_ct_OTOOL64"; then + ac_cv_prog_ac_ct_OTOOL64="$ac_ct_OTOOL64" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_ac_ct_OTOOL64="otool64" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +ac_ct_OTOOL64=$ac_cv_prog_ac_ct_OTOOL64 +if test -n "$ac_ct_OTOOL64"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_ct_OTOOL64" >&5 +$as_echo "$ac_ct_OTOOL64" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_ct_OTOOL64" = x; then + OTOOL64=":" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + OTOOL64=$ac_ct_OTOOL64 + fi +else + OTOOL64="$ac_cv_prog_OTOOL64" +fi + + + + + + + + + + + + + + + + + + + + + + + + + + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for -single_module linker flag" >&5 +$as_echo_n "checking for -single_module linker flag... " >&6; } +if ${lt_cv_apple_cc_single_mod+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_apple_cc_single_mod=no + if test -z "$LT_MULTI_MODULE"; then + # By default we will add the -single_module flag. You can override + # by either setting the environment variable LT_MULTI_MODULE + # non-empty at configure time, or by adding -multi_module to the + # link flags. + rm -rf libconftest.dylib* + echo "int foo(void){return 1;}" > conftest.c + echo "$LTCC $LTCFLAGS $LDFLAGS -o libconftest.dylib \ +-dynamiclib -Wl,-single_module conftest.c" >&5 + $LTCC $LTCFLAGS $LDFLAGS -o libconftest.dylib \ + -dynamiclib -Wl,-single_module conftest.c 2>conftest.err + _lt_result=$? + # If there is a non-empty error log, and "single_module" + # appears in it, assume the flag caused a linker warning + if test -s conftest.err && $GREP single_module conftest.err; then + cat conftest.err >&5 + # Otherwise, if the output was created with a 0 exit code from + # the compiler, it worked. + elif test -f libconftest.dylib && test 0 = "$_lt_result"; then + lt_cv_apple_cc_single_mod=yes + else + cat conftest.err >&5 + fi + rm -rf libconftest.dylib* + rm -f conftest.* + fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_apple_cc_single_mod" >&5 +$as_echo "$lt_cv_apple_cc_single_mod" >&6; } + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for -exported_symbols_list linker flag" >&5 +$as_echo_n "checking for -exported_symbols_list linker flag... " >&6; } +if ${lt_cv_ld_exported_symbols_list+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_ld_exported_symbols_list=no + save_LDFLAGS=$LDFLAGS + echo "_main" > conftest.sym + LDFLAGS="$LDFLAGS -Wl,-exported_symbols_list,conftest.sym" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + lt_cv_ld_exported_symbols_list=yes +else + lt_cv_ld_exported_symbols_list=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + LDFLAGS=$save_LDFLAGS + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_ld_exported_symbols_list" >&5 +$as_echo "$lt_cv_ld_exported_symbols_list" >&6; } + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for -force_load linker flag" >&5 +$as_echo_n "checking for -force_load linker flag... 
" >&6; } +if ${lt_cv_ld_force_load+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_ld_force_load=no + cat > conftest.c << _LT_EOF +int forced_loaded() { return 2;} +_LT_EOF + echo "$LTCC $LTCFLAGS -c -o conftest.o conftest.c" >&5 + $LTCC $LTCFLAGS -c -o conftest.o conftest.c 2>&5 + echo "$AR cru libconftest.a conftest.o" >&5 + $AR cru libconftest.a conftest.o 2>&5 + echo "$RANLIB libconftest.a" >&5 + $RANLIB libconftest.a 2>&5 + cat > conftest.c << _LT_EOF +int main() { return 0;} +_LT_EOF + echo "$LTCC $LTCFLAGS $LDFLAGS -o conftest conftest.c -Wl,-force_load,./libconftest.a" >&5 + $LTCC $LTCFLAGS $LDFLAGS -o conftest conftest.c -Wl,-force_load,./libconftest.a 2>conftest.err + _lt_result=$? + if test -s conftest.err && $GREP force_load conftest.err; then + cat conftest.err >&5 + elif test -f conftest && test 0 = "$_lt_result" && $GREP forced_load conftest >/dev/null 2>&1; then + lt_cv_ld_force_load=yes + else + cat conftest.err >&5 + fi + rm -f conftest.err libconftest.a conftest conftest.c + rm -rf conftest.dSYM + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_ld_force_load" >&5 +$as_echo "$lt_cv_ld_force_load" >&6; } + case $host_os in + rhapsody* | darwin1.[012]) + _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;; + darwin1.*) + _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;; + darwin*) # darwin 5.x on + # if running on 10.5 or later, the deployment target defaults + # to the OS version, if on x86, and 10.4, the deployment + # target defaults to 10.4. Don't you love it? + case ${MACOSX_DEPLOYMENT_TARGET-10.0},$host in + 10.0,*86*-darwin8*|10.0,*-darwin[91]*) + _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;; + 10.[012][,.]*) + _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;; + 10.*) + _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;; + esac + ;; + esac + if test yes = "$lt_cv_apple_cc_single_mod"; then + _lt_dar_single_mod='$single_module' + fi + if test yes = "$lt_cv_ld_exported_symbols_list"; then + _lt_dar_export_syms=' $wl-exported_symbols_list,$output_objdir/$libname-symbols.expsym' + else + _lt_dar_export_syms='~$NMEDIT -s $output_objdir/$libname-symbols.expsym $lib' + fi + if test : != "$DSYMUTIL" && test no = "$lt_cv_ld_force_load"; then + _lt_dsymutil='~$DSYMUTIL $lib || :' + else + _lt_dsymutil= + fi + ;; + esac + +# func_munge_path_list VARIABLE PATH +# ----------------------------------- +# VARIABLE is name of variable containing _space_ separated list of +# directories to be munged by the contents of PATH, which is string +# having a format: +# "DIR[:DIR]:" +# string "DIR[ DIR]" will be prepended to VARIABLE +# ":DIR[:DIR]" +# string "DIR[ DIR]" will be appended to VARIABLE +# "DIRP[:DIRP]::[DIRA:]DIRA" +# string "DIRP[ DIRP]" will be prepended to VARIABLE and string +# "DIRA[ DIRA]" will be appended to VARIABLE +# "DIR[:DIR]" +# VARIABLE will be replaced by "DIR[ DIR]" +func_munge_path_list () +{ + case x$2 in + x) + ;; + *:) + eval $1=\"`$ECHO $2 | $SED 's/:/ /g'` \$$1\" + ;; + x:*) + eval $1=\"\$$1 `$ECHO $2 | $SED 's/:/ /g'`\" + ;; + *::*) + eval $1=\"\$$1\ `$ECHO $2 | $SED -e 's/.*:://' -e 's/:/ /g'`\" + eval $1=\"`$ECHO $2 | $SED -e 's/::.*//' -e 's/:/ /g'`\ \$$1\" + ;; + *) + eval $1=\"`$ECHO $2 | $SED 's/:/ /g'`\" + ;; + esac +} + +for ac_header in dlfcn.h +do : + ac_fn_c_check_header_compile "$LINENO" "dlfcn.h" "ac_cv_header_dlfcn_h" "$ac_includes_default +" +if test "x$ac_cv_header_dlfcn_h" = xyes; then : + cat >>confdefs.h <<_ACEOF +#define 
HAVE_DLFCN_H 1 +_ACEOF + +fi + +done + + + + + +# Set options + + + + enable_dlopen=no + + + enable_win32_dll=no + + + # Check whether --enable-shared was given. +if test "${enable_shared+set}" = set; then : + enableval=$enable_shared; p=${PACKAGE-default} + case $enableval in + yes) enable_shared=yes ;; + no) enable_shared=no ;; + *) + enable_shared=no + # Look at the argument we got. We use all the common list separators. + lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for pkg in $enableval; do + IFS=$lt_save_ifs + if test "X$pkg" = "X$p"; then + enable_shared=yes + fi + done + IFS=$lt_save_ifs + ;; + esac +else + enable_shared=yes +fi + + + + + + + + + + # Check whether --enable-static was given. +if test "${enable_static+set}" = set; then : + enableval=$enable_static; p=${PACKAGE-default} + case $enableval in + yes) enable_static=yes ;; + no) enable_static=no ;; + *) + enable_static=no + # Look at the argument we got. We use all the common list separators. + lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for pkg in $enableval; do + IFS=$lt_save_ifs + if test "X$pkg" = "X$p"; then + enable_static=yes + fi + done + IFS=$lt_save_ifs + ;; + esac +else + enable_static=yes +fi + + + + + + + + + + +# Check whether --with-pic was given. +if test "${with_pic+set}" = set; then : + withval=$with_pic; lt_p=${PACKAGE-default} + case $withval in + yes|no) pic_mode=$withval ;; + *) + pic_mode=default + # Look at the argument we got. We use all the common list separators. + lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for lt_pkg in $withval; do + IFS=$lt_save_ifs + if test "X$lt_pkg" = "X$lt_p"; then + pic_mode=yes + fi + done + IFS=$lt_save_ifs + ;; + esac +else + pic_mode=default +fi + + + + + + + + + # Check whether --enable-fast-install was given. +if test "${enable_fast_install+set}" = set; then : + enableval=$enable_fast_install; p=${PACKAGE-default} + case $enableval in + yes) enable_fast_install=yes ;; + no) enable_fast_install=no ;; + *) + enable_fast_install=no + # Look at the argument we got. We use all the common list separators. + lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for pkg in $enableval; do + IFS=$lt_save_ifs + if test "X$pkg" = "X$p"; then + enable_fast_install=yes + fi + done + IFS=$lt_save_ifs + ;; + esac +else + enable_fast_install=yes +fi + + + + + + + + + shared_archive_member_spec= +case $host,$enable_shared in +power*-*-aix[5-9]*,yes) + { $as_echo "$as_me:${as_lineno-$LINENO}: checking which variant of shared library versioning to provide" >&5 +$as_echo_n "checking which variant of shared library versioning to provide... " >&6; } + +# Check whether --with-aix-soname was given. +if test "${with_aix_soname+set}" = set; then : + withval=$with_aix_soname; case $withval in + aix|svr4|both) + ;; + *) + as_fn_error $? "Unknown argument to --with-aix-soname" "$LINENO" 5 + ;; + esac + lt_cv_with_aix_soname=$with_aix_soname +else + if ${lt_cv_with_aix_soname+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_with_aix_soname=aix +fi + + with_aix_soname=$lt_cv_with_aix_soname +fi + + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $with_aix_soname" >&5 +$as_echo "$with_aix_soname" >&6; } + if test aix != "$with_aix_soname"; then + # For the AIX way of multilib, we name the shared archive member + # based on the bitwidth used, traditionally 'shr.o' or 'shr_64.o', + # and 'shr.imp' or 'shr_64.imp', respectively, for the Import File. + # Even when GNU compilers ignore OBJECT_MODE but need '-maix64' flag, + # the AIX toolchain works better with OBJECT_MODE set (default 32). 
+ if test 64 = "${OBJECT_MODE-32}"; then + shared_archive_member_spec=shr_64 + else + shared_archive_member_spec=shr + fi + fi + ;; +*) + with_aix_soname=aix + ;; +esac + + + + + + + + + + +# This can be used to rebuild libtool when needed +LIBTOOL_DEPS=$ltmain + +# Always use our own libtool. +LIBTOOL='$(SHELL) $(top_builddir)/libtool' + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +test -z "$LN_S" && LN_S="ln -s" + + + + + + + + + + + + + + +if test -n "${ZSH_VERSION+set}"; then + setopt NO_GLOB_SUBST +fi + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for objdir" >&5 +$as_echo_n "checking for objdir... " >&6; } +if ${lt_cv_objdir+:} false; then : + $as_echo_n "(cached) " >&6 +else + rm -f .libs 2>/dev/null +mkdir .libs 2>/dev/null +if test -d .libs; then + lt_cv_objdir=.libs +else + # MS-DOS does not allow filenames that begin with a dot. + lt_cv_objdir=_libs +fi +rmdir .libs 2>/dev/null +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_objdir" >&5 +$as_echo "$lt_cv_objdir" >&6; } +objdir=$lt_cv_objdir + + + + + +cat >>confdefs.h <<_ACEOF +#define LT_OBJDIR "$lt_cv_objdir/" +_ACEOF + + + + +case $host_os in +aix3*) + # AIX sometimes has problems with the GCC collect2 program. For some + # reason, if we set the COLLECT_NAMES environment variable, the problems + # vanish in a puff of smoke. + if test set != "${COLLECT_NAMES+set}"; then + COLLECT_NAMES= + export COLLECT_NAMES + fi + ;; +esac + +# Global variables: +ofile=libtool +can_build_shared=yes + +# All known linkers require a '.a' archive for static linking (except MSVC, +# which needs '.lib'). +libext=a + +with_gnu_ld=$lt_cv_prog_gnu_ld + +old_CC=$CC +old_CFLAGS=$CFLAGS + +# Set sane defaults for various variables +test -z "$CC" && CC=cc +test -z "$LTCC" && LTCC=$CC +test -z "$LTCFLAGS" && LTCFLAGS=$CFLAGS +test -z "$LD" && LD=ld +test -z "$ac_objext" && ac_objext=o + +func_cc_basename $compiler +cc_basename=$func_cc_basename_result + + +# Only perform the check for file, if the check method requires it +test -z "$MAGIC_CMD" && MAGIC_CMD=file +case $deplibs_check_method in +file_magic*) + if test "$file_magic_cmd" = '$MAGIC_CMD'; then + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for ${ac_tool_prefix}file" >&5 +$as_echo_n "checking for ${ac_tool_prefix}file... " >&6; } +if ${lt_cv_path_MAGIC_CMD+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $MAGIC_CMD in +[\\/*] | ?:[\\/]*) + lt_cv_path_MAGIC_CMD=$MAGIC_CMD # Let the user override the test with a path. + ;; +*) + lt_save_MAGIC_CMD=$MAGIC_CMD + lt_save_ifs=$IFS; IFS=$PATH_SEPARATOR + ac_dummy="/usr/bin$PATH_SEPARATOR$PATH" + for ac_dir in $ac_dummy; do + IFS=$lt_save_ifs + test -z "$ac_dir" && ac_dir=. + if test -f "$ac_dir/${ac_tool_prefix}file"; then + lt_cv_path_MAGIC_CMD=$ac_dir/"${ac_tool_prefix}file" + if test -n "$file_magic_test_file"; then + case $deplibs_check_method in + "file_magic "*) + file_magic_regex=`expr "$deplibs_check_method" : "file_magic \(.*\)"` + MAGIC_CMD=$lt_cv_path_MAGIC_CMD + if eval $file_magic_cmd \$file_magic_test_file 2> /dev/null | + $EGREP "$file_magic_regex" > /dev/null; then + : + else + cat <<_LT_EOF 1>&2 + +*** Warning: the command libtool uses to detect shared libraries, +*** $file_magic_cmd, produces output that libtool cannot recognize. +*** The result is that libtool may fail to recognize shared libraries +*** as such. 
This will affect the creation of libtool libraries that +*** depend on shared libraries, but programs linked with such libtool +*** libraries will work regardless of this problem. Nevertheless, you +*** may want to report the problem to your system manager and/or to +*** bug-libtool@gnu.org + +_LT_EOF + fi ;; + esac + fi + break + fi + done + IFS=$lt_save_ifs + MAGIC_CMD=$lt_save_MAGIC_CMD + ;; +esac +fi + +MAGIC_CMD=$lt_cv_path_MAGIC_CMD +if test -n "$MAGIC_CMD"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $MAGIC_CMD" >&5 +$as_echo "$MAGIC_CMD" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + + + +if test -z "$lt_cv_path_MAGIC_CMD"; then + if test -n "$ac_tool_prefix"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for file" >&5 +$as_echo_n "checking for file... " >&6; } +if ${lt_cv_path_MAGIC_CMD+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $MAGIC_CMD in +[\\/*] | ?:[\\/]*) + lt_cv_path_MAGIC_CMD=$MAGIC_CMD # Let the user override the test with a path. + ;; +*) + lt_save_MAGIC_CMD=$MAGIC_CMD + lt_save_ifs=$IFS; IFS=$PATH_SEPARATOR + ac_dummy="/usr/bin$PATH_SEPARATOR$PATH" + for ac_dir in $ac_dummy; do + IFS=$lt_save_ifs + test -z "$ac_dir" && ac_dir=. + if test -f "$ac_dir/file"; then + lt_cv_path_MAGIC_CMD=$ac_dir/"file" + if test -n "$file_magic_test_file"; then + case $deplibs_check_method in + "file_magic "*) + file_magic_regex=`expr "$deplibs_check_method" : "file_magic \(.*\)"` + MAGIC_CMD=$lt_cv_path_MAGIC_CMD + if eval $file_magic_cmd \$file_magic_test_file 2> /dev/null | + $EGREP "$file_magic_regex" > /dev/null; then + : + else + cat <<_LT_EOF 1>&2 + +*** Warning: the command libtool uses to detect shared libraries, +*** $file_magic_cmd, produces output that libtool cannot recognize. +*** The result is that libtool may fail to recognize shared libraries +*** as such. This will affect the creation of libtool libraries that +*** depend on shared libraries, but programs linked with such libtool +*** libraries will work regardless of this problem. Nevertheless, you +*** may want to report the problem to your system manager and/or to +*** bug-libtool@gnu.org + +_LT_EOF + fi ;; + esac + fi + break + fi + done + IFS=$lt_save_ifs + MAGIC_CMD=$lt_save_MAGIC_CMD + ;; +esac +fi + +MAGIC_CMD=$lt_cv_path_MAGIC_CMD +if test -n "$MAGIC_CMD"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $MAGIC_CMD" >&5 +$as_echo "$MAGIC_CMD" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + else + MAGIC_CMD=: + fi +fi + + fi + ;; +esac + +# Use C for the default configuration in the libtool script + +lt_save_CC=$CC +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + +# Source file extension for C test sources. +ac_ext=c + +# Object file extension for compiled C test sources. +objext=o +objext=$objext + +# Code to be used in simple compile tests +lt_simple_compile_test_code="int some_variable = 0;" + +# Code to be used in simple link tests +lt_simple_link_test_code='int main(){return(0);}' + + + + + + + +# If no C compiler was specified, use CC. +LTCC=${LTCC-"$CC"} + +# If no C compiler flags were specified, use CFLAGS. +LTCFLAGS=${LTCFLAGS-"$CFLAGS"} + +# Allow CC to be a program name with arguments. 
+compiler=$CC + +# Save the default compiler, since it gets overwritten when the other +# tags are being tested, and _LT_TAGVAR(compiler, []) is a NOP. +compiler_DEFAULT=$CC + +# save warnings/boilerplate of simple test code +ac_outfile=conftest.$ac_objext +echo "$lt_simple_compile_test_code" >conftest.$ac_ext +eval "$ac_compile" 2>&1 >/dev/null | $SED '/^$/d; /^ *+/d' >conftest.err +_lt_compiler_boilerplate=`cat conftest.err` +$RM conftest* + +ac_outfile=conftest.$ac_objext +echo "$lt_simple_link_test_code" >conftest.$ac_ext +eval "$ac_link" 2>&1 >/dev/null | $SED '/^$/d; /^ *+/d' >conftest.err +_lt_linker_boilerplate=`cat conftest.err` +$RM -r conftest* + + +## CAVEAT EMPTOR: +## There is no encapsulation within the following macros, do not change +## the running order or otherwise move them around unless you know exactly +## what you are doing... +if test -n "$compiler"; then + +lt_prog_compiler_no_builtin_flag= + +if test yes = "$GCC"; then + case $cc_basename in + nvcc*) + lt_prog_compiler_no_builtin_flag=' -Xcompiler -fno-builtin' ;; + *) + lt_prog_compiler_no_builtin_flag=' -fno-builtin' ;; + esac + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if $compiler supports -fno-rtti -fno-exceptions" >&5 +$as_echo_n "checking if $compiler supports -fno-rtti -fno-exceptions... " >&6; } +if ${lt_cv_prog_compiler_rtti_exceptions+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_prog_compiler_rtti_exceptions=no + ac_outfile=conftest.$ac_objext + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + lt_compiler_flag="-fno-rtti -fno-exceptions" ## exclude from sc_useless_quotes_in_assignment + # Insert the option either (1) after the last *FLAGS variable, or + # (2) before a word containing "conftest.", or (3) at the end. + # Note that $ac_compile itself does not contain backslashes and begins + # with a dollar sign (not a hyphen), so the echo should work correctly. + # The option is referenced via a variable to avoid confusing sed. + lt_compile=`echo "$ac_compile" | $SED \ + -e 's:.*FLAGS}\{0,1\} :&$lt_compiler_flag :; t' \ + -e 's: [^ ]*conftest\.: $lt_compiler_flag&:; t' \ + -e 's:$: $lt_compiler_flag:'` + (eval echo "\"\$as_me:$LINENO: $lt_compile\"" >&5) + (eval "$lt_compile" 2>conftest.err) + ac_status=$? + cat conftest.err >&5 + echo "$as_me:$LINENO: \$? = $ac_status" >&5 + if (exit $ac_status) && test -s "$ac_outfile"; then + # The compiler can only warn and ignore the option if not recognized + # So say no if there are warnings other than the usual output. + $ECHO "$_lt_compiler_boilerplate" | $SED '/^$/d' >conftest.exp + $SED '/^$/d; /^ *+/d' conftest.err >conftest.er2 + if test ! -s conftest.er2 || diff conftest.exp conftest.er2 >/dev/null; then + lt_cv_prog_compiler_rtti_exceptions=yes + fi + fi + $RM conftest* + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_compiler_rtti_exceptions" >&5 +$as_echo "$lt_cv_prog_compiler_rtti_exceptions" >&6; } + +if test yes = "$lt_cv_prog_compiler_rtti_exceptions"; then + lt_prog_compiler_no_builtin_flag="$lt_prog_compiler_no_builtin_flag -fno-rtti -fno-exceptions" +else + : +fi + +fi + + + + + + + lt_prog_compiler_wl= +lt_prog_compiler_pic= +lt_prog_compiler_static= + + + if test yes = "$GCC"; then + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_static='-static' + + case $host_os in + aix*) + # All AIX code is PIC. 
+ if test ia64 = "$host_cpu"; then + # AIX 5 now supports IA64 processor + lt_prog_compiler_static='-Bstatic' + fi + lt_prog_compiler_pic='-fPIC' + ;; + + amigaos*) + case $host_cpu in + powerpc) + # see comment about AmigaOS4 .so support + lt_prog_compiler_pic='-fPIC' + ;; + m68k) + # FIXME: we need at least 68020 code to build shared libraries, but + # adding the '-m68020' flag to GCC prevents building anything better, + # like '-m68040'. + lt_prog_compiler_pic='-m68020 -resident32 -malways-restore-a4' + ;; + esac + ;; + + beos* | irix5* | irix6* | nonstopux* | osf3* | osf4* | osf5*) + # PIC is the default for these OSes. + ;; + + mingw* | cygwin* | pw32* | os2* | cegcc*) + # This hack is so that the source file can tell whether it is being + # built for inclusion in a dll (and should export symbols for example). + # Although the cygwin gcc ignores -fPIC, still need this for old-style + # (--disable-auto-import) libraries + lt_prog_compiler_pic='-DDLL_EXPORT' + case $host_os in + os2*) + lt_prog_compiler_static='$wl-static' + ;; + esac + ;; + + darwin* | rhapsody*) + # PIC is the default on this platform + # Common symbols not allowed in MH_DYLIB files + lt_prog_compiler_pic='-fno-common' + ;; + + haiku*) + # PIC is the default for Haiku. + # The "-static" flag exists, but is broken. + lt_prog_compiler_static= + ;; + + hpux*) + # PIC is the default for 64-bit PA HP-UX, but not for 32-bit + # PA HP-UX. On IA64 HP-UX, PIC is the default but the pic flag + # sets the default TLS model and affects inlining. + case $host_cpu in + hppa*64*) + # +Z the default + ;; + *) + lt_prog_compiler_pic='-fPIC' + ;; + esac + ;; + + interix[3-9]*) + # Interix 3.x gcc -fpic/-fPIC options generate broken code. + # Instead, we relocate shared libraries at runtime. + ;; + + msdosdjgpp*) + # Just because we use GCC doesn't mean we suddenly get shared libraries + # on systems that don't support them. + lt_prog_compiler_can_build_shared=no + enable_shared=no + ;; + + *nto* | *qnx*) + # QNX uses GNU C++, but need to define -shared option too, otherwise + # it will coredump. + lt_prog_compiler_pic='-fPIC -shared' + ;; + + sysv4*MP*) + if test -d /usr/nec; then + lt_prog_compiler_pic=-Kconform_pic + fi + ;; + + *) + lt_prog_compiler_pic='-fPIC' + ;; + esac + + case $cc_basename in + nvcc*) # Cuda Compiler Driver 2.2 + lt_prog_compiler_wl='-Xlinker ' + if test -n "$lt_prog_compiler_pic"; then + lt_prog_compiler_pic="-Xcompiler $lt_prog_compiler_pic" + fi + ;; + esac + else + # PORTME Check for flag to pass linker flags through the system compiler. + case $host_os in + aix*) + lt_prog_compiler_wl='-Wl,' + if test ia64 = "$host_cpu"; then + # AIX 5 now supports IA64 processor + lt_prog_compiler_static='-Bstatic' + else + lt_prog_compiler_static='-bnso -bI:/lib/syscalls.exp' + fi + ;; + + darwin* | rhapsody*) + # PIC is the default on this platform + # Common symbols not allowed in MH_DYLIB files + lt_prog_compiler_pic='-fno-common' + case $cc_basename in + nagfor*) + # NAG Fortran compiler + lt_prog_compiler_wl='-Wl,-Wl,,' + lt_prog_compiler_pic='-PIC' + lt_prog_compiler_static='-Bstatic' + ;; + esac + ;; + + mingw* | cygwin* | pw32* | os2* | cegcc*) + # This hack is so that the source file can tell whether it is being + # built for inclusion in a dll (and should export symbols for example). 
+ lt_prog_compiler_pic='-DDLL_EXPORT' + case $host_os in + os2*) + lt_prog_compiler_static='$wl-static' + ;; + esac + ;; + + hpux9* | hpux10* | hpux11*) + lt_prog_compiler_wl='-Wl,' + # PIC is the default for IA64 HP-UX and 64-bit HP-UX, but + # not for PA HP-UX. + case $host_cpu in + hppa*64*|ia64*) + # +Z the default + ;; + *) + lt_prog_compiler_pic='+Z' + ;; + esac + # Is there a better lt_prog_compiler_static that works with the bundled CC? + lt_prog_compiler_static='$wl-a ${wl}archive' + ;; + + irix5* | irix6* | nonstopux*) + lt_prog_compiler_wl='-Wl,' + # PIC (with -KPIC) is the default. + lt_prog_compiler_static='-non_shared' + ;; + + linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + case $cc_basename in + # old Intel for x86_64, which still supported -KPIC. + ecc*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-static' + ;; + # icc used to be incompatible with GCC. + # ICC 10 doesn't accept -KPIC any more. + icc* | ifort*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-fPIC' + lt_prog_compiler_static='-static' + ;; + # Lahey Fortran 8.1. + lf95*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='--shared' + lt_prog_compiler_static='--static' + ;; + nagfor*) + # NAG Fortran compiler + lt_prog_compiler_wl='-Wl,-Wl,,' + lt_prog_compiler_pic='-PIC' + lt_prog_compiler_static='-Bstatic' + ;; + tcc*) + # Fabrice Bellard et al's Tiny C Compiler + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-fPIC' + lt_prog_compiler_static='-static' + ;; + pgcc* | pgf77* | pgf90* | pgf95* | pgfortran*) + # Portland Group compilers (*not* the Pentium gcc compiler, + # which looks to be a dead project) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-fpic' + lt_prog_compiler_static='-Bstatic' + ;; + ccc*) + lt_prog_compiler_wl='-Wl,' + # All Alpha code is PIC. + lt_prog_compiler_static='-non_shared' + ;; + xl* | bgxl* | bgf* | mpixl*) + # IBM XL C 8.0/Fortran 10.1, 11.1 on PPC and BlueGene + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-qpic' + lt_prog_compiler_static='-qstaticlink' + ;; + *) + case `$CC -V 2>&1 | sed 5q` in + *Sun\ Ceres\ Fortran* | *Sun*Fortran*\ [1-7].* | *Sun*Fortran*\ 8.[0-3]*) + # Sun Fortran 8.3 passes all unrecognized flags to the linker + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-Bstatic' + lt_prog_compiler_wl='' + ;; + *Sun\ F* | *Sun*Fortran*) + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-Bstatic' + lt_prog_compiler_wl='-Qoption ld ' + ;; + *Sun\ C*) + # Sun C 5.9 + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-Bstatic' + lt_prog_compiler_wl='-Wl,' + ;; + *Intel*\ [CF]*Compiler*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-fPIC' + lt_prog_compiler_static='-static' + ;; + *Portland\ Group*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-fpic' + lt_prog_compiler_static='-Bstatic' + ;; + esac + ;; + esac + ;; + + newsos6) + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-Bstatic' + ;; + + *nto* | *qnx*) + # QNX uses GNU C++, but need to define -shared option too, otherwise + # it will coredump. + lt_prog_compiler_pic='-fPIC -shared' + ;; + + osf3* | osf4* | osf5*) + lt_prog_compiler_wl='-Wl,' + # All OSF/1 code is PIC. 
+ lt_prog_compiler_static='-non_shared' + ;; + + rdos*) + lt_prog_compiler_static='-non_shared' + ;; + + solaris*) + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-Bstatic' + case $cc_basename in + f77* | f90* | f95* | sunf77* | sunf90* | sunf95*) + lt_prog_compiler_wl='-Qoption ld ';; + *) + lt_prog_compiler_wl='-Wl,';; + esac + ;; + + sunos4*) + lt_prog_compiler_wl='-Qoption ld ' + lt_prog_compiler_pic='-PIC' + lt_prog_compiler_static='-Bstatic' + ;; + + sysv4 | sysv4.2uw2* | sysv4.3*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-Bstatic' + ;; + + sysv4*MP*) + if test -d /usr/nec; then + lt_prog_compiler_pic='-Kconform_pic' + lt_prog_compiler_static='-Bstatic' + fi + ;; + + sysv5* | unixware* | sco3.2v5* | sco5v6* | OpenUNIX*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-KPIC' + lt_prog_compiler_static='-Bstatic' + ;; + + unicos*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_can_build_shared=no + ;; + + uts4*) + lt_prog_compiler_pic='-pic' + lt_prog_compiler_static='-Bstatic' + ;; + + *) + lt_prog_compiler_can_build_shared=no + ;; + esac + fi + +case $host_os in + # For platforms that do not support PIC, -DPIC is meaningless: + *djgpp*) + lt_prog_compiler_pic= + ;; + *) + lt_prog_compiler_pic="$lt_prog_compiler_pic -DPIC" + ;; +esac + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $compiler option to produce PIC" >&5 +$as_echo_n "checking for $compiler option to produce PIC... " >&6; } +if ${lt_cv_prog_compiler_pic+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_prog_compiler_pic=$lt_prog_compiler_pic +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_compiler_pic" >&5 +$as_echo "$lt_cv_prog_compiler_pic" >&6; } +lt_prog_compiler_pic=$lt_cv_prog_compiler_pic + +# +# Check to make sure the PIC flag actually works. +# +if test -n "$lt_prog_compiler_pic"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if $compiler PIC flag $lt_prog_compiler_pic works" >&5 +$as_echo_n "checking if $compiler PIC flag $lt_prog_compiler_pic works... " >&6; } +if ${lt_cv_prog_compiler_pic_works+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_prog_compiler_pic_works=no + ac_outfile=conftest.$ac_objext + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + lt_compiler_flag="$lt_prog_compiler_pic -DPIC" ## exclude from sc_useless_quotes_in_assignment + # Insert the option either (1) after the last *FLAGS variable, or + # (2) before a word containing "conftest.", or (3) at the end. + # Note that $ac_compile itself does not contain backslashes and begins + # with a dollar sign (not a hyphen), so the echo should work correctly. + # The option is referenced via a variable to avoid confusing sed. + lt_compile=`echo "$ac_compile" | $SED \ + -e 's:.*FLAGS}\{0,1\} :&$lt_compiler_flag :; t' \ + -e 's: [^ ]*conftest\.: $lt_compiler_flag&:; t' \ + -e 's:$: $lt_compiler_flag:'` + (eval echo "\"\$as_me:$LINENO: $lt_compile\"" >&5) + (eval "$lt_compile" 2>conftest.err) + ac_status=$? + cat conftest.err >&5 + echo "$as_me:$LINENO: \$? = $ac_status" >&5 + if (exit $ac_status) && test -s "$ac_outfile"; then + # The compiler can only warn and ignore the option if not recognized + # So say no if there are warnings other than the usual output. + $ECHO "$_lt_compiler_boilerplate" | $SED '/^$/d' >conftest.exp + $SED '/^$/d; /^ *+/d' conftest.err >conftest.er2 + if test ! 
-s conftest.er2 || diff conftest.exp conftest.er2 >/dev/null; then + lt_cv_prog_compiler_pic_works=yes + fi + fi + $RM conftest* + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_compiler_pic_works" >&5 +$as_echo "$lt_cv_prog_compiler_pic_works" >&6; } + +if test yes = "$lt_cv_prog_compiler_pic_works"; then + case $lt_prog_compiler_pic in + "" | " "*) ;; + *) lt_prog_compiler_pic=" $lt_prog_compiler_pic" ;; + esac +else + lt_prog_compiler_pic= + lt_prog_compiler_can_build_shared=no +fi + +fi + + + + + + + + + + + +# +# Check to make sure the static flag actually works. +# +wl=$lt_prog_compiler_wl eval lt_tmp_static_flag=\"$lt_prog_compiler_static\" +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking if $compiler static flag $lt_tmp_static_flag works" >&5 +$as_echo_n "checking if $compiler static flag $lt_tmp_static_flag works... " >&6; } +if ${lt_cv_prog_compiler_static_works+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_prog_compiler_static_works=no + save_LDFLAGS=$LDFLAGS + LDFLAGS="$LDFLAGS $lt_tmp_static_flag" + echo "$lt_simple_link_test_code" > conftest.$ac_ext + if (eval $ac_link 2>conftest.err) && test -s conftest$ac_exeext; then + # The linker can only warn and ignore the option if not recognized + # So say no if there are warnings + if test -s conftest.err; then + # Append any errors to the config.log. + cat conftest.err 1>&5 + $ECHO "$_lt_linker_boilerplate" | $SED '/^$/d' > conftest.exp + $SED '/^$/d; /^ *+/d' conftest.err >conftest.er2 + if diff conftest.exp conftest.er2 >/dev/null; then + lt_cv_prog_compiler_static_works=yes + fi + else + lt_cv_prog_compiler_static_works=yes + fi + fi + $RM -r conftest* + LDFLAGS=$save_LDFLAGS + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_compiler_static_works" >&5 +$as_echo "$lt_cv_prog_compiler_static_works" >&6; } + +if test yes = "$lt_cv_prog_compiler_static_works"; then + : +else + lt_prog_compiler_static= +fi + + + + + + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if $compiler supports -c -o file.$ac_objext" >&5 +$as_echo_n "checking if $compiler supports -c -o file.$ac_objext... " >&6; } +if ${lt_cv_prog_compiler_c_o+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_prog_compiler_c_o=no + $RM -r conftest 2>/dev/null + mkdir conftest + cd conftest + mkdir out + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + + lt_compiler_flag="-o out/conftest2.$ac_objext" + # Insert the option either (1) after the last *FLAGS variable, or + # (2) before a word containing "conftest.", or (3) at the end. + # Note that $ac_compile itself does not contain backslashes and begins + # with a dollar sign (not a hyphen), so the echo should work correctly. + lt_compile=`echo "$ac_compile" | $SED \ + -e 's:.*FLAGS}\{0,1\} :&$lt_compiler_flag :; t' \ + -e 's: [^ ]*conftest\.: $lt_compiler_flag&:; t' \ + -e 's:$: $lt_compiler_flag:'` + (eval echo "\"\$as_me:$LINENO: $lt_compile\"" >&5) + (eval "$lt_compile" 2>out/conftest.err) + ac_status=$? + cat out/conftest.err >&5 + echo "$as_me:$LINENO: \$? = $ac_status" >&5 + if (exit $ac_status) && test -s out/conftest2.$ac_objext + then + # The compiler can only warn and ignore the option if not recognized + # So say no if there are warnings + $ECHO "$_lt_compiler_boilerplate" | $SED '/^$/d' > out/conftest.exp + $SED '/^$/d; /^ *+/d' out/conftest.err >out/conftest.er2 + if test ! -s out/conftest.er2 || diff out/conftest.exp out/conftest.er2 >/dev/null; then + lt_cv_prog_compiler_c_o=yes + fi + fi + chmod u+w . 
2>&5 + $RM conftest* + # SGI C++ compiler will create directory out/ii_files/ for + # template instantiation + test -d out/ii_files && $RM out/ii_files/* && rmdir out/ii_files + $RM out/* && rmdir out + cd .. + $RM -r conftest + $RM conftest* + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_compiler_c_o" >&5 +$as_echo "$lt_cv_prog_compiler_c_o" >&6; } + + + + + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if $compiler supports -c -o file.$ac_objext" >&5 +$as_echo_n "checking if $compiler supports -c -o file.$ac_objext... " >&6; } +if ${lt_cv_prog_compiler_c_o+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_prog_compiler_c_o=no + $RM -r conftest 2>/dev/null + mkdir conftest + cd conftest + mkdir out + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + + lt_compiler_flag="-o out/conftest2.$ac_objext" + # Insert the option either (1) after the last *FLAGS variable, or + # (2) before a word containing "conftest.", or (3) at the end. + # Note that $ac_compile itself does not contain backslashes and begins + # with a dollar sign (not a hyphen), so the echo should work correctly. + lt_compile=`echo "$ac_compile" | $SED \ + -e 's:.*FLAGS}\{0,1\} :&$lt_compiler_flag :; t' \ + -e 's: [^ ]*conftest\.: $lt_compiler_flag&:; t' \ + -e 's:$: $lt_compiler_flag:'` + (eval echo "\"\$as_me:$LINENO: $lt_compile\"" >&5) + (eval "$lt_compile" 2>out/conftest.err) + ac_status=$? + cat out/conftest.err >&5 + echo "$as_me:$LINENO: \$? = $ac_status" >&5 + if (exit $ac_status) && test -s out/conftest2.$ac_objext + then + # The compiler can only warn and ignore the option if not recognized + # So say no if there are warnings + $ECHO "$_lt_compiler_boilerplate" | $SED '/^$/d' > out/conftest.exp + $SED '/^$/d; /^ *+/d' out/conftest.err >out/conftest.er2 + if test ! -s out/conftest.er2 || diff out/conftest.exp out/conftest.er2 >/dev/null; then + lt_cv_prog_compiler_c_o=yes + fi + fi + chmod u+w . 2>&5 + $RM conftest* + # SGI C++ compiler will create directory out/ii_files/ for + # template instantiation + test -d out/ii_files && $RM out/ii_files/* && rmdir out/ii_files + $RM out/* && rmdir out + cd .. + $RM -r conftest + $RM conftest* + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_compiler_c_o" >&5 +$as_echo "$lt_cv_prog_compiler_c_o" >&6; } + + + + +hard_links=nottested +if test no = "$lt_cv_prog_compiler_c_o" && test no != "$need_locks"; then + # do not overwrite the value of need_locks provided by the user + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if we can lock with hard links" >&5 +$as_echo_n "checking if we can lock with hard links... " >&6; } + hard_links=yes + $RM conftest* + ln conftest.a conftest.b 2>/dev/null && hard_links=no + touch conftest.a + ln conftest.a conftest.b 2>&5 || hard_links=no + ln conftest.a conftest.b 2>/dev/null && hard_links=no + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $hard_links" >&5 +$as_echo "$hard_links" >&6; } + if test no = "$hard_links"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: '$CC' does not support '-c -o', so 'make -j' may be unsafe" >&5 +$as_echo "$as_me: WARNING: '$CC' does not support '-c -o', so 'make -j' may be unsafe" >&2;} + need_locks=warn + fi +else + need_locks=no +fi + + + + + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether the $compiler linker ($LD) supports shared libraries" >&5 +$as_echo_n "checking whether the $compiler linker ($LD) supports shared libraries... 
" >&6; } + + runpath_var= + allow_undefined_flag= + always_export_symbols=no + archive_cmds= + archive_expsym_cmds= + compiler_needs_object=no + enable_shared_with_static_runtimes=no + export_dynamic_flag_spec= + export_symbols_cmds='$NM $libobjs $convenience | $global_symbol_pipe | $SED '\''s/.* //'\'' | sort | uniq > $export_symbols' + hardcode_automatic=no + hardcode_direct=no + hardcode_direct_absolute=no + hardcode_libdir_flag_spec= + hardcode_libdir_separator= + hardcode_minus_L=no + hardcode_shlibpath_var=unsupported + inherit_rpath=no + link_all_deplibs=unknown + module_cmds= + module_expsym_cmds= + old_archive_from_new_cmds= + old_archive_from_expsyms_cmds= + thread_safe_flag_spec= + whole_archive_flag_spec= + # include_expsyms should be a list of space-separated symbols to be *always* + # included in the symbol list + include_expsyms= + # exclude_expsyms can be an extended regexp of symbols to exclude + # it will be wrapped by ' (' and ')$', so one must not match beginning or + # end of line. Example: 'a|bc|.*d.*' will exclude the symbols 'a' and 'bc', + # as well as any symbol that contains 'd'. + exclude_expsyms='_GLOBAL_OFFSET_TABLE_|_GLOBAL__F[ID]_.*' + # Although _GLOBAL_OFFSET_TABLE_ is a valid symbol C name, most a.out + # platforms (ab)use it in PIC code, but their linkers get confused if + # the symbol is explicitly referenced. Since portable code cannot + # rely on this symbol name, it's probably fine to never include it in + # preloaded symbol tables. + # Exclude shared library initialization/finalization symbols. + extract_expsyms_cmds= + + case $host_os in + cygwin* | mingw* | pw32* | cegcc*) + # FIXME: the MSVC++ port hasn't been tested in a loooong time + # When not using gcc, we currently assume that we are using + # Microsoft Visual C++. + if test yes != "$GCC"; then + with_gnu_ld=no + fi + ;; + interix*) + # we just hope/assume this is gcc and not c89 (= MSVC++) + with_gnu_ld=yes + ;; + openbsd* | bitrig*) + with_gnu_ld=no + ;; + linux* | k*bsd*-gnu | gnu*) + link_all_deplibs=no + ;; + esac + + ld_shlibs=yes + + # On some targets, GNU ld is compatible enough with the native linker + # that we're better off using the native interface for both. + lt_use_gnu_ld_interface=no + if test yes = "$with_gnu_ld"; then + case $host_os in + aix*) + # The AIX port of GNU ld has always aspired to compatibility + # with the native linker. However, as the warning in the GNU ld + # block says, versions before 2.19.5* couldn't really create working + # shared libraries, regardless of the interface used. + case `$LD -v 2>&1` in + *\ \(GNU\ Binutils\)\ 2.19.5*) ;; + *\ \(GNU\ Binutils\)\ 2.[2-9]*) ;; + *\ \(GNU\ Binutils\)\ [3-9]*) ;; + *) + lt_use_gnu_ld_interface=yes + ;; + esac + ;; + *) + lt_use_gnu_ld_interface=yes + ;; + esac + fi + + if test yes = "$lt_use_gnu_ld_interface"; then + # If archive_cmds runs LD, not CC, wlarc should be empty + wlarc='$wl' + + # Set some defaults for GNU ld with shared library support. These + # are reset later if shared libraries are not supported. Putting them + # here allows them to be overridden if necessary. + runpath_var=LD_RUN_PATH + hardcode_libdir_flag_spec='$wl-rpath $wl$libdir' + export_dynamic_flag_spec='$wl--export-dynamic' + # ancient GNU ld didn't support --whole-archive et. al. 
+ if $LD --help 2>&1 | $GREP 'no-whole-archive' > /dev/null; then + whole_archive_flag_spec=$wlarc'--whole-archive$convenience '$wlarc'--no-whole-archive' + else + whole_archive_flag_spec= + fi + supports_anon_versioning=no + case `$LD -v | $SED -e 's/(^)\+)\s\+//' 2>&1` in + *GNU\ gold*) supports_anon_versioning=yes ;; + *\ [01].* | *\ 2.[0-9].* | *\ 2.10.*) ;; # catch versions < 2.11 + *\ 2.11.93.0.2\ *) supports_anon_versioning=yes ;; # RH7.3 ... + *\ 2.11.92.0.12\ *) supports_anon_versioning=yes ;; # Mandrake 8.2 ... + *\ 2.11.*) ;; # other 2.11 versions + *) supports_anon_versioning=yes ;; + esac + + # See if GNU ld supports shared libraries. + case $host_os in + aix[3-9]*) + # On AIX/PPC, the GNU linker is very broken + if test ia64 != "$host_cpu"; then + ld_shlibs=no + cat <<_LT_EOF 1>&2 + +*** Warning: the GNU linker, at least up to release 2.19, is reported +*** to be unable to reliably create shared libraries on AIX. +*** Therefore, libtool is disabling shared libraries support. If you +*** really care for shared libraries, you may want to install binutils +*** 2.20 or above, or modify your PATH so that a non-GNU linker is found. +*** You will then need to restart the configuration process. + +_LT_EOF + fi + ;; + + amigaos*) + case $host_cpu in + powerpc) + # see comment about AmigaOS4 .so support + archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + archive_expsym_cmds='' + ;; + m68k) + archive_cmds='$RM $output_objdir/a2ixlibrary.data~$ECHO "#define NAME $libname" > $output_objdir/a2ixlibrary.data~$ECHO "#define LIBRARY_ID 1" >> $output_objdir/a2ixlibrary.data~$ECHO "#define VERSION $major" >> $output_objdir/a2ixlibrary.data~$ECHO "#define REVISION $revision" >> $output_objdir/a2ixlibrary.data~$AR $AR_FLAGS $lib $libobjs~$RANLIB $lib~(cd $output_objdir && a2ixlibrary -32)' + hardcode_libdir_flag_spec='-L$libdir' + hardcode_minus_L=yes + ;; + esac + ;; + + beos*) + if $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + allow_undefined_flag=unsupported + # Joseph Beckenbach says some releases of gcc + # support --undefined. This deserves some investigation. FIXME + archive_cmds='$CC -nostart $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + else + ld_shlibs=no + fi + ;; + + cygwin* | mingw* | pw32* | cegcc*) + # _LT_TAGVAR(hardcode_libdir_flag_spec, ) is actually meaningless, + # as there is no search path for DLLs. + hardcode_libdir_flag_spec='-L$libdir' + export_dynamic_flag_spec='$wl--export-all-symbols' + allow_undefined_flag=unsupported + always_export_symbols=no + enable_shared_with_static_runtimes=yes + export_symbols_cmds='$NM $libobjs $convenience | $global_symbol_pipe | $SED -e '\''/^[BCDGRS][ ]/s/.*[ ]\([^ ]*\)/\1 DATA/;s/^.*[ ]__nm__\([^ ]*\)[ ][^ ]*/\1 DATA/;/^I[ ]/d;/^[AITW][ ]/s/.* //'\'' | sort | uniq > $export_symbols' + exclude_expsyms='[_]+GLOBAL_OFFSET_TABLE_|[_]+GLOBAL__[FID]_.*|[_]+head_[A-Za-z0-9_]+_dll|[A-Za-z0-9_]+_dll_iname' + + if $LD --help 2>&1 | $GREP 'auto-import' > /dev/null; then + archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags -o $output_objdir/$soname $wl--enable-auto-image-base -Xlinker --out-implib -Xlinker $lib' + # If the export-symbols file already is a .def file, use it as + # is; otherwise, prepend EXPORTS... 
+ archive_expsym_cmds='if test DEF = "`$SED -n -e '\''s/^[ ]*//'\'' -e '\''/^\(;.*\)*$/d'\'' -e '\''s/^\(EXPORTS\|LIBRARY\)\([ ].*\)*$/DEF/p'\'' -e q $export_symbols`" ; then + cp $export_symbols $output_objdir/$soname.def; + else + echo EXPORTS > $output_objdir/$soname.def; + cat $export_symbols >> $output_objdir/$soname.def; + fi~ + $CC -shared $output_objdir/$soname.def $libobjs $deplibs $compiler_flags -o $output_objdir/$soname $wl--enable-auto-image-base -Xlinker --out-implib -Xlinker $lib' + else + ld_shlibs=no + fi + ;; + + haiku*) + archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + link_all_deplibs=yes + ;; + + os2*) + hardcode_libdir_flag_spec='-L$libdir' + hardcode_minus_L=yes + allow_undefined_flag=unsupported + shrext_cmds=.dll + archive_cmds='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + emxexp $libobjs | $SED /"_DLL_InitTerm"/d >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + archive_expsym_cmds='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + prefix_cmds="$SED"~ + if test EXPORTS = "`$SED 1q $export_symbols`"; then + prefix_cmds="$prefix_cmds -e 1d"; + fi~ + prefix_cmds="$prefix_cmds -e \"s/^\(.*\)$/_\1/g\""~ + cat $export_symbols | $prefix_cmds >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + old_archive_From_new_cmds='emximp -o $output_objdir/${libname}_dll.a $output_objdir/$libname.def' + enable_shared_with_static_runtimes=yes + ;; + + interix[3-9]*) + hardcode_direct=no + hardcode_shlibpath_var=no + hardcode_libdir_flag_spec='$wl-rpath,$libdir' + export_dynamic_flag_spec='$wl-E' + # Hack: On Interix 3.x, we cannot compile PIC because of a broken gcc. + # Instead, shared libraries are loaded at an image base (0x10000000 by + # default) and relocated if they conflict, which is a slow very memory + # consuming and fragmenting process. To avoid this, we pick a random, + # 256 KiB-aligned image base between 0x50000000 and 0x6FFC0000 at link + # time. Moving up from 0x10000000 also allows more sbrk(2) space. 
+ archive_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-h,$soname $wl--image-base,`expr ${RANDOM-$$} % 4096 / 2 \* 262144 + 1342177280` -o $lib' + archive_expsym_cmds='sed "s|^|_|" $export_symbols >$output_objdir/$soname.expsym~$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-h,$soname $wl--retain-symbols-file,$output_objdir/$soname.expsym $wl--image-base,`expr ${RANDOM-$$} % 4096 / 2 \* 262144 + 1342177280` -o $lib' + ;; + + gnu* | linux* | tpf* | k*bsd*-gnu | kopensolaris*-gnu) + tmp_diet=no + if test linux-dietlibc = "$host_os"; then + case $cc_basename in + diet\ *) tmp_diet=yes;; # linux-dietlibc with static linking (!diet-dyn) + esac + fi + if $LD --help 2>&1 | $EGREP ': supported targets:.* elf' > /dev/null \ + && test no = "$tmp_diet" + then + tmp_addflag=' $pic_flag' + tmp_sharedflag='-shared' + case $cc_basename,$host_cpu in + pgcc*) # Portland Group C compiler + whole_archive_flag_spec='$wl--whole-archive`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + tmp_addflag=' $pic_flag' + ;; + pgf77* | pgf90* | pgf95* | pgfortran*) + # Portland Group f77 and f90 compilers + whole_archive_flag_spec='$wl--whole-archive`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + tmp_addflag=' $pic_flag -Mnomain' ;; + ecc*,ia64* | icc*,ia64*) # Intel C compiler on ia64 + tmp_addflag=' -i_dynamic' ;; + efc*,ia64* | ifort*,ia64*) # Intel Fortran compiler on ia64 + tmp_addflag=' -i_dynamic -nofor_main' ;; + ifc* | ifort*) # Intel Fortran compiler + tmp_addflag=' -nofor_main' ;; + lf95*) # Lahey Fortran 8.1 + whole_archive_flag_spec= + tmp_sharedflag='--shared' ;; + nagfor*) # NAGFOR 5.3 + tmp_sharedflag='-Wl,-shared' ;; + xl[cC]* | bgxl[cC]* | mpixl[cC]*) # IBM XL C 8.0 on PPC (deal with xlf below) + tmp_sharedflag='-qmkshrobj' + tmp_addflag= ;; + nvcc*) # Cuda Compiler Driver 2.2 + whole_archive_flag_spec='$wl--whole-archive`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + compiler_needs_object=yes + ;; + esac + case `$CC -V 2>&1 | sed 5q` in + *Sun\ C*) # Sun C 5.9 + whole_archive_flag_spec='$wl--whole-archive`new_convenience=; for conv in $convenience\"\"; do test -z \"$conv\" || new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + compiler_needs_object=yes + tmp_sharedflag='-G' ;; + *Sun\ F*) # Sun Fortran 8.3 + tmp_sharedflag='-G' ;; + esac + archive_cmds='$CC '"$tmp_sharedflag""$tmp_addflag"' $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + + if test yes = "$supports_anon_versioning"; then + archive_expsym_cmds='echo "{ global:" > $output_objdir/$libname.ver~ + cat $export_symbols | sed -e "s/\(.*\)/\1;/" >> $output_objdir/$libname.ver~ + echo "local: *; };" >> $output_objdir/$libname.ver~ + $CC '"$tmp_sharedflag""$tmp_addflag"' $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-version-script $wl$output_objdir/$libname.ver -o $lib' + fi + + case $cc_basename in + tcc*) + export_dynamic_flag_spec='-rdynamic' + ;; + xlf* | bgf* | bgxlf* | mpixlf*) + # IBM XL Fortran 10.1 on PPC cannot create shared libs itself + whole_archive_flag_spec='--whole-archive$convenience --no-whole-archive' + hardcode_libdir_flag_spec='$wl-rpath $wl$libdir' + 
archive_cmds='$LD -shared $libobjs $deplibs $linker_flags -soname $soname -o $lib' + if test yes = "$supports_anon_versioning"; then + archive_expsym_cmds='echo "{ global:" > $output_objdir/$libname.ver~ + cat $export_symbols | sed -e "s/\(.*\)/\1;/" >> $output_objdir/$libname.ver~ + echo "local: *; };" >> $output_objdir/$libname.ver~ + $LD -shared $libobjs $deplibs $linker_flags -soname $soname -version-script $output_objdir/$libname.ver -o $lib' + fi + ;; + esac + else + ld_shlibs=no + fi + ;; + + netbsd* | netbsdelf*-gnu) + if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then + archive_cmds='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' + wlarc= + else + archive_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + archive_expsym_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + fi + ;; + + solaris*) + if $LD -v 2>&1 | $GREP 'BFD 2\.8' > /dev/null; then + ld_shlibs=no + cat <<_LT_EOF 1>&2 + +*** Warning: The releases 2.8.* of the GNU linker cannot reliably +*** create shared libraries on Solaris systems. Therefore, libtool +*** is disabling shared libraries support. We urge you to upgrade GNU +*** binutils to release 2.9.1 or newer. Another option is to modify +*** your PATH or compiler configuration so that the native linker is +*** used, and then restart. + +_LT_EOF + elif $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + archive_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + archive_expsym_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + else + ld_shlibs=no + fi + ;; + + sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX*) + case `$LD -v 2>&1` in + *\ [01].* | *\ 2.[0-9].* | *\ 2.1[0-5].*) + ld_shlibs=no + cat <<_LT_EOF 1>&2 + +*** Warning: Releases of the GNU linker prior to 2.16.91.0.3 cannot +*** reliably create shared libraries on SCO systems. Therefore, libtool +*** is disabling shared libraries support. We urge you to upgrade GNU +*** binutils to release 2.16.91.0.3 or newer. Another option is to modify +*** your PATH or compiler configuration so that the native linker is +*** used, and then restart. + +_LT_EOF + ;; + *) + # For security reasons, it is highly recommended that you always + # use absolute paths for naming shared libraries, and exclude the + # DT_RUNPATH tag from executables and libraries. But doing so + # requires that you compile everything twice, which is a pain. 
+ if $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + hardcode_libdir_flag_spec='$wl-rpath $wl$libdir' + archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + archive_expsym_cmds='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + else + ld_shlibs=no + fi + ;; + esac + ;; + + sunos4*) + archive_cmds='$LD -assert pure-text -Bshareable -o $lib $libobjs $deplibs $linker_flags' + wlarc= + hardcode_direct=yes + hardcode_shlibpath_var=no + ;; + + *) + if $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + archive_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + archive_expsym_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + else + ld_shlibs=no + fi + ;; + esac + + if test no = "$ld_shlibs"; then + runpath_var= + hardcode_libdir_flag_spec= + export_dynamic_flag_spec= + whole_archive_flag_spec= + fi + else + # PORTME fill in a description of your system's linker (not GNU ld) + case $host_os in + aix3*) + allow_undefined_flag=unsupported + always_export_symbols=yes + archive_expsym_cmds='$LD -o $output_objdir/$soname $libobjs $deplibs $linker_flags -bE:$export_symbols -T512 -H512 -bM:SRE~$AR $AR_FLAGS $lib $output_objdir/$soname' + # Note: this linker hardcodes the directories in LIBPATH if there + # are no directories specified by -L. + hardcode_minus_L=yes + if test yes = "$GCC" && test -z "$lt_prog_compiler_static"; then + # Neither direct hardcoding nor static linking is supported with a + # broken collect2. + hardcode_direct=unsupported + fi + ;; + + aix[4-9]*) + if test ia64 = "$host_cpu"; then + # On IA64, the linker does run time linking by default, so we don't + # have to do anything special. + aix_use_runtimelinking=no + exp_sym_flag='-Bexport' + no_entry_flag= + else + # If we're using GNU nm, then we don't want the "-C" option. + # -C means demangle to GNU nm, but means don't demangle to AIX nm. + # Without the "-l" option, or with the "-B" option, AIX nm treats + # weak defined symbols like other global defined symbols, whereas + # GNU nm marks them as "W". + # While the 'weak' keyword is ignored in the Export File, we need + # it in the Import File for the 'aix-soname' feature, so we have + # to replace the "-B" option with "-P" for AIX nm. + if $NM -V 2>&1 | $GREP 'GNU' > /dev/null; then + export_symbols_cmds='$NM -Bpg $libobjs $convenience | awk '\''{ if (((\$ 2 == "T") || (\$ 2 == "D") || (\$ 2 == "B") || (\$ 2 == "W")) && (substr(\$ 3,1,1) != ".")) { if (\$ 2 == "W") { print \$ 3 " weak" } else { print \$ 3 } } }'\'' | sort -u > $export_symbols' + else + export_symbols_cmds='`func_echo_all $NM | $SED -e '\''s/B\([^B]*\)$/P\1/'\''` -PCpgl $libobjs $convenience | awk '\''{ if (((\$ 2 == "T") || (\$ 2 == "D") || (\$ 2 == "B") || (\$ 2 == "W") || (\$ 2 == "V") || (\$ 2 == "Z")) && (substr(\$ 1,1,1) != ".")) { if ((\$ 2 == "W") || (\$ 2 == "V") || (\$ 2 == "Z")) { print \$ 1 " weak" } else { print \$ 1 } } }'\'' | sort -u > $export_symbols' + fi + aix_use_runtimelinking=no + + # Test if we are trying to use run time linking or normal + # AIX style linking. If -brtl is somewhere in LDFLAGS, we + # have runtime linking enabled, and use it for executables. 
+ # For shared libraries, we enable/disable runtime linking + # depending on the kind of the shared library created - + # when "with_aix_soname,aix_use_runtimelinking" is: + # "aix,no" lib.a(lib.so.V) shared, rtl:no, for executables + # "aix,yes" lib.so shared, rtl:yes, for executables + # lib.a static archive + # "both,no" lib.so.V(shr.o) shared, rtl:yes + # lib.a(lib.so.V) shared, rtl:no, for executables + # "both,yes" lib.so.V(shr.o) shared, rtl:yes, for executables + # lib.a(lib.so.V) shared, rtl:no + # "svr4,*" lib.so.V(shr.o) shared, rtl:yes, for executables + # lib.a static archive + case $host_os in aix4.[23]|aix4.[23].*|aix[5-9]*) + for ld_flag in $LDFLAGS; do + if (test x-brtl = "x$ld_flag" || test x-Wl,-brtl = "x$ld_flag"); then + aix_use_runtimelinking=yes + break + fi + done + if test svr4,no = "$with_aix_soname,$aix_use_runtimelinking"; then + # With aix-soname=svr4, we create the lib.so.V shared archives only, + # so we don't have lib.a shared libs to link our executables. + # We have to force runtime linking in this case. + aix_use_runtimelinking=yes + LDFLAGS="$LDFLAGS -Wl,-brtl" + fi + ;; + esac + + exp_sym_flag='-bexport' + no_entry_flag='-bnoentry' + fi + + # When large executables or shared objects are built, AIX ld can + # have problems creating the table of contents. If linking a library + # or program results in "error TOC overflow" add -mminimal-toc to + # CXXFLAGS/CFLAGS for g++/gcc. In the cases where that is not + # enough to fix the problem, add -Wl,-bbigtoc to LDFLAGS. + + archive_cmds='' + hardcode_direct=yes + hardcode_direct_absolute=yes + hardcode_libdir_separator=':' + link_all_deplibs=yes + file_list_spec='$wl-f,' + case $with_aix_soname,$aix_use_runtimelinking in + aix,*) ;; # traditional, no import file + svr4,* | *,yes) # use import file + # The Import File defines what to hardcode. + hardcode_direct=no + hardcode_direct_absolute=no + ;; + esac + + if test yes = "$GCC"; then + case $host_os in aix4.[012]|aix4.[012].*) + # We only want to do this on AIX 4.2 and lower, the check + # below for broken collect2 doesn't work under 4.3+ + collect2name=`$CC -print-prog-name=collect2` + if test -f "$collect2name" && + strings "$collect2name" | $GREP resolve_lib_name >/dev/null + then + # We have reworked collect2 + : + else + # We have old collect2 + hardcode_direct=unsupported + # It fails to find uninstalled libraries when the uninstalled + # path is not listed in the libpath. Setting hardcode_minus_L + # to unsupported forces relinking + hardcode_minus_L=yes + hardcode_libdir_flag_spec='-L$libdir' + hardcode_libdir_separator= + fi + ;; + esac + shared_flag='-shared' + if test yes = "$aix_use_runtimelinking"; then + shared_flag="$shared_flag "'$wl-G' + fi + # Need to ensure runtime linking is disabled for the traditional + # shared library, or the linker may eventually find shared libraries + # /with/ Import File - we do not want to mix them. + shared_flag_aix='-shared' + shared_flag_svr4='-shared $wl-G' + else + # not using gcc + if test ia64 = "$host_cpu"; then + # VisualAge C++, Version 5.5 for AIX 5L for IA-64, Beta 3 Release + # chokes on -Wl,-G. 
The following line is correct: + shared_flag='-G' + else + if test yes = "$aix_use_runtimelinking"; then + shared_flag='$wl-G' + else + shared_flag='$wl-bM:SRE' + fi + shared_flag_aix='$wl-bM:SRE' + shared_flag_svr4='$wl-G' + fi + fi + + export_dynamic_flag_spec='$wl-bexpall' + # It seems that -bexpall does not export symbols beginning with + # underscore (_), so it is better to generate a list of symbols to export. + always_export_symbols=yes + if test aix,yes = "$with_aix_soname,$aix_use_runtimelinking"; then + # Warning - without using the other runtime loading flags (-brtl), + # -berok will link without error, but may produce a broken library. + allow_undefined_flag='-berok' + # Determine the default libpath from the value encoded in an + # empty executable. + if test set = "${lt_cv_aix_libpath+set}"; then + aix_libpath=$lt_cv_aix_libpath +else + if ${lt_cv_aix_libpath_+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + + lt_aix_libpath_sed=' + /Import File Strings/,/^$/ { + /^0/ { + s/^0 *\([^ ]*\) *$/\1/ + p + } + }' + lt_cv_aix_libpath_=`dump -H conftest$ac_exeext 2>/dev/null | $SED -n -e "$lt_aix_libpath_sed"` + # Check for a 64-bit object if we didn't find anything. + if test -z "$lt_cv_aix_libpath_"; then + lt_cv_aix_libpath_=`dump -HX64 conftest$ac_exeext 2>/dev/null | $SED -n -e "$lt_aix_libpath_sed"` + fi +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + if test -z "$lt_cv_aix_libpath_"; then + lt_cv_aix_libpath_=/usr/lib:/lib + fi + +fi + + aix_libpath=$lt_cv_aix_libpath_ +fi + + hardcode_libdir_flag_spec='$wl-blibpath:$libdir:'"$aix_libpath" + archive_expsym_cmds='$CC -o $output_objdir/$soname $libobjs $deplibs $wl'$no_entry_flag' $compiler_flags `if test -n "$allow_undefined_flag"; then func_echo_all "$wl$allow_undefined_flag"; else :; fi` $wl'$exp_sym_flag:\$export_symbols' '$shared_flag + else + if test ia64 = "$host_cpu"; then + hardcode_libdir_flag_spec='$wl-R $libdir:/usr/lib:/lib' + allow_undefined_flag="-z nodefs" + archive_expsym_cmds="\$CC $shared_flag"' -o $output_objdir/$soname $libobjs $deplibs '"\$wl$no_entry_flag"' $compiler_flags $wl$allow_undefined_flag '"\$wl$exp_sym_flag:\$export_symbols" + else + # Determine the default libpath from the value encoded in an + # empty executable. + if test set = "${lt_cv_aix_libpath+set}"; then + aix_libpath=$lt_cv_aix_libpath +else + if ${lt_cv_aix_libpath_+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + + lt_aix_libpath_sed=' + /Import File Strings/,/^$/ { + /^0/ { + s/^0 *\([^ ]*\) *$/\1/ + p + } + }' + lt_cv_aix_libpath_=`dump -H conftest$ac_exeext 2>/dev/null | $SED -n -e "$lt_aix_libpath_sed"` + # Check for a 64-bit object if we didn't find anything. 
+ if test -z "$lt_cv_aix_libpath_"; then + lt_cv_aix_libpath_=`dump -HX64 conftest$ac_exeext 2>/dev/null | $SED -n -e "$lt_aix_libpath_sed"` + fi +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + if test -z "$lt_cv_aix_libpath_"; then + lt_cv_aix_libpath_=/usr/lib:/lib + fi + +fi + + aix_libpath=$lt_cv_aix_libpath_ +fi + + hardcode_libdir_flag_spec='$wl-blibpath:$libdir:'"$aix_libpath" + # Warning - without using the other run time loading flags, + # -berok will link without error, but may produce a broken library. + no_undefined_flag=' $wl-bernotok' + allow_undefined_flag=' $wl-berok' + if test yes = "$with_gnu_ld"; then + # We only use this code for GNU lds that support --whole-archive. + whole_archive_flag_spec='$wl--whole-archive$convenience $wl--no-whole-archive' + else + # Exported symbols can be pulled into shared objects from archives + whole_archive_flag_spec='$convenience' + fi + archive_cmds_need_lc=yes + archive_expsym_cmds='$RM -r $output_objdir/$realname.d~$MKDIR $output_objdir/$realname.d' + # -brtl affects multiple linker settings, -berok does not and is overridden later + compiler_flags_filtered='`func_echo_all "$compiler_flags " | $SED -e "s%-brtl\\([, ]\\)%-berok\\1%g"`' + if test svr4 != "$with_aix_soname"; then + # This is similar to how AIX traditionally builds its shared libraries. + archive_expsym_cmds="$archive_expsym_cmds"'~$CC '$shared_flag_aix' -o $output_objdir/$realname.d/$soname $libobjs $deplibs $wl-bnoentry '$compiler_flags_filtered'$wl-bE:$export_symbols$allow_undefined_flag~$AR $AR_FLAGS $output_objdir/$libname$release.a $output_objdir/$realname.d/$soname' + fi + if test aix != "$with_aix_soname"; then + archive_expsym_cmds="$archive_expsym_cmds"'~$CC '$shared_flag_svr4' -o $output_objdir/$realname.d/$shared_archive_member_spec.o $libobjs $deplibs $wl-bnoentry '$compiler_flags_filtered'$wl-bE:$export_symbols$allow_undefined_flag~$STRIP -e $output_objdir/$realname.d/$shared_archive_member_spec.o~( func_echo_all "#! $soname($shared_archive_member_spec.o)"; if test shr_64 = "$shared_archive_member_spec"; then func_echo_all "# 64"; else func_echo_all "# 32"; fi; cat $export_symbols ) > $output_objdir/$realname.d/$shared_archive_member_spec.imp~$AR $AR_FLAGS $output_objdir/$soname $output_objdir/$realname.d/$shared_archive_member_spec.o $output_objdir/$realname.d/$shared_archive_member_spec.imp' + else + # used by -dlpreopen to get the symbols + archive_expsym_cmds="$archive_expsym_cmds"'~$MV $output_objdir/$realname.d/$soname $output_objdir' + fi + archive_expsym_cmds="$archive_expsym_cmds"'~$RM -r $output_objdir/$realname.d' + fi + fi + ;; + + amigaos*) + case $host_cpu in + powerpc) + # see comment about AmigaOS4 .so support + archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + archive_expsym_cmds='' + ;; + m68k) + archive_cmds='$RM $output_objdir/a2ixlibrary.data~$ECHO "#define NAME $libname" > $output_objdir/a2ixlibrary.data~$ECHO "#define LIBRARY_ID 1" >> $output_objdir/a2ixlibrary.data~$ECHO "#define VERSION $major" >> $output_objdir/a2ixlibrary.data~$ECHO "#define REVISION $revision" >> $output_objdir/a2ixlibrary.data~$AR $AR_FLAGS $lib $libobjs~$RANLIB $lib~(cd $output_objdir && a2ixlibrary -32)' + hardcode_libdir_flag_spec='-L$libdir' + hardcode_minus_L=yes + ;; + esac + ;; + + bsdi[45]*) + export_dynamic_flag_spec=-rdynamic + ;; + + cygwin* | mingw* | pw32* | cegcc*) + # When not using gcc, we currently assume that we are using + # Microsoft Visual C++. 
+ # hardcode_libdir_flag_spec is actually meaningless, as there is + # no search path for DLLs. + case $cc_basename in + cl*) + # Native MSVC + hardcode_libdir_flag_spec=' ' + allow_undefined_flag=unsupported + always_export_symbols=yes + file_list_spec='@' + # Tell ltmain to make .lib files, not .a files. + libext=lib + # Tell ltmain to make .dll files, not .so files. + shrext_cmds=.dll + # FIXME: Setting linknames here is a bad hack. + archive_cmds='$CC -o $output_objdir/$soname $libobjs $compiler_flags $deplibs -Wl,-DLL,-IMPLIB:"$tool_output_objdir$libname.dll.lib"~linknames=' + archive_expsym_cmds='if test DEF = "`$SED -n -e '\''s/^[ ]*//'\'' -e '\''/^\(;.*\)*$/d'\'' -e '\''s/^\(EXPORTS\|LIBRARY\)\([ ].*\)*$/DEF/p'\'' -e q $export_symbols`" ; then + cp "$export_symbols" "$output_objdir/$soname.def"; + echo "$tool_output_objdir$soname.def" > "$output_objdir/$soname.exp"; + else + $SED -e '\''s/^/-link -EXPORT:/'\'' < $export_symbols > $output_objdir/$soname.exp; + fi~ + $CC -o $tool_output_objdir$soname $libobjs $compiler_flags $deplibs "@$tool_output_objdir$soname.exp" -Wl,-DLL,-IMPLIB:"$tool_output_objdir$libname.dll.lib"~ + linknames=' + # The linker will not automatically build a static lib if we build a DLL. + # _LT_TAGVAR(old_archive_from_new_cmds, )='true' + enable_shared_with_static_runtimes=yes + exclude_expsyms='_NULL_IMPORT_DESCRIPTOR|_IMPORT_DESCRIPTOR_.*' + export_symbols_cmds='$NM $libobjs $convenience | $global_symbol_pipe | $SED -e '\''/^[BCDGRS][ ]/s/.*[ ]\([^ ]*\)/\1,DATA/'\'' | $SED -e '\''/^[AITW][ ]/s/.*[ ]//'\'' | sort | uniq > $export_symbols' + # Don't use ranlib + old_postinstall_cmds='chmod 644 $oldlib' + postlink_cmds='lt_outputfile="@OUTPUT@"~ + lt_tool_outputfile="@TOOL_OUTPUT@"~ + case $lt_outputfile in + *.exe|*.EXE) ;; + *) + lt_outputfile=$lt_outputfile.exe + lt_tool_outputfile=$lt_tool_outputfile.exe + ;; + esac~ + if test : != "$MANIFEST_TOOL" && test -f "$lt_outputfile.manifest"; then + $MANIFEST_TOOL -manifest "$lt_tool_outputfile.manifest" -outputresource:"$lt_tool_outputfile" || exit 1; + $RM "$lt_outputfile.manifest"; + fi' + ;; + *) + # Assume MSVC wrapper + hardcode_libdir_flag_spec=' ' + allow_undefined_flag=unsupported + # Tell ltmain to make .lib files, not .a files. + libext=lib + # Tell ltmain to make .dll files, not .so files. + shrext_cmds=.dll + # FIXME: Setting linknames here is a bad hack. + archive_cmds='$CC -o $lib $libobjs $compiler_flags `func_echo_all "$deplibs" | $SED '\''s/ -lc$//'\''` -link -dll~linknames=' + # The linker will automatically build a .lib file if we build a DLL. + old_archive_from_new_cmds='true' + # FIXME: Should let the user specify the lib program. 
+ old_archive_cmds='lib -OUT:$oldlib$oldobjs$old_deplibs' + enable_shared_with_static_runtimes=yes + ;; + esac + ;; + + darwin* | rhapsody*) + + + archive_cmds_need_lc=no + hardcode_direct=no + hardcode_automatic=yes + hardcode_shlibpath_var=unsupported + if test yes = "$lt_cv_ld_force_load"; then + whole_archive_flag_spec='`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience $wl-force_load,$conv\"; done; func_echo_all \"$new_convenience\"`' + + else + whole_archive_flag_spec='' + fi + link_all_deplibs=yes + allow_undefined_flag=$_lt_dar_allow_undefined + case $cc_basename in + ifort*|nagfor*) _lt_dar_can_shared=yes ;; + *) _lt_dar_can_shared=$GCC ;; + esac + if test yes = "$_lt_dar_can_shared"; then + output_verbose_link_cmd=func_echo_all + archive_cmds="\$CC -dynamiclib \$allow_undefined_flag -o \$lib \$libobjs \$deplibs \$compiler_flags -install_name \$rpath/\$soname \$verstring $_lt_dar_single_mod$_lt_dsymutil" + module_cmds="\$CC \$allow_undefined_flag -o \$lib -bundle \$libobjs \$deplibs \$compiler_flags$_lt_dsymutil" + archive_expsym_cmds="sed 's|^|_|' < \$export_symbols > \$output_objdir/\$libname-symbols.expsym~\$CC -dynamiclib \$allow_undefined_flag -o \$lib \$libobjs \$deplibs \$compiler_flags -install_name \$rpath/\$soname \$verstring $_lt_dar_single_mod$_lt_dar_export_syms$_lt_dsymutil" + module_expsym_cmds="sed -e 's|^|_|' < \$export_symbols > \$output_objdir/\$libname-symbols.expsym~\$CC \$allow_undefined_flag -o \$lib -bundle \$libobjs \$deplibs \$compiler_flags$_lt_dar_export_syms$_lt_dsymutil" + + else + ld_shlibs=no + fi + + ;; + + dgux*) + archive_cmds='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + hardcode_libdir_flag_spec='-L$libdir' + hardcode_shlibpath_var=no + ;; + + # FreeBSD 2.2.[012] allows us to include c++rt0.o to get C++ constructor + # support. Future versions do this automatically, but an explicit c++rt0.o + # does not break anything, and helps significantly (at the cost of a little + # extra space). + freebsd2.2*) + archive_cmds='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags /usr/lib/c++rt0.o' + hardcode_libdir_flag_spec='-R$libdir' + hardcode_direct=yes + hardcode_shlibpath_var=no + ;; + + # Unfortunately, older versions of FreeBSD 2 do not have this feature. + freebsd2.*) + archive_cmds='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' + hardcode_direct=yes + hardcode_minus_L=yes + hardcode_shlibpath_var=no + ;; + + # FreeBSD 3 and greater uses gcc -shared to do shared libraries. + freebsd* | dragonfly*) + archive_cmds='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + hardcode_libdir_flag_spec='-R$libdir' + hardcode_direct=yes + hardcode_shlibpath_var=no + ;; + + hpux9*) + if test yes = "$GCC"; then + archive_cmds='$RM $output_objdir/$soname~$CC -shared $pic_flag $wl+b $wl$install_libdir -o $output_objdir/$soname $libobjs $deplibs $compiler_flags~test "x$output_objdir/$soname" = "x$lib" || mv $output_objdir/$soname $lib' + else + archive_cmds='$RM $output_objdir/$soname~$LD -b +b $install_libdir -o $output_objdir/$soname $libobjs $deplibs $linker_flags~test "x$output_objdir/$soname" = "x$lib" || mv $output_objdir/$soname $lib' + fi + hardcode_libdir_flag_spec='$wl+b $wl$libdir' + hardcode_libdir_separator=: + hardcode_direct=yes + + # hardcode_minus_L: Not really in the search PATH, + # but as the default location of the library. 
+ hardcode_minus_L=yes + export_dynamic_flag_spec='$wl-E' + ;; + + hpux10*) + if test yes,no = "$GCC,$with_gnu_ld"; then + archive_cmds='$CC -shared $pic_flag $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $libobjs $deplibs $compiler_flags' + else + archive_cmds='$LD -b +h $soname +b $install_libdir -o $lib $libobjs $deplibs $linker_flags' + fi + if test no = "$with_gnu_ld"; then + hardcode_libdir_flag_spec='$wl+b $wl$libdir' + hardcode_libdir_separator=: + hardcode_direct=yes + hardcode_direct_absolute=yes + export_dynamic_flag_spec='$wl-E' + # hardcode_minus_L: Not really in the search PATH, + # but as the default location of the library. + hardcode_minus_L=yes + fi + ;; + + hpux11*) + if test yes,no = "$GCC,$with_gnu_ld"; then + case $host_cpu in + hppa*64*) + archive_cmds='$CC -shared $wl+h $wl$soname -o $lib $libobjs $deplibs $compiler_flags' + ;; + ia64*) + archive_cmds='$CC -shared $pic_flag $wl+h $wl$soname $wl+nodefaultrpath -o $lib $libobjs $deplibs $compiler_flags' + ;; + *) + archive_cmds='$CC -shared $pic_flag $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $libobjs $deplibs $compiler_flags' + ;; + esac + else + case $host_cpu in + hppa*64*) + archive_cmds='$CC -b $wl+h $wl$soname -o $lib $libobjs $deplibs $compiler_flags' + ;; + ia64*) + archive_cmds='$CC -b $wl+h $wl$soname $wl+nodefaultrpath -o $lib $libobjs $deplibs $compiler_flags' + ;; + *) + + # Older versions of the 11.00 compiler do not understand -b yet + # (HP92453-01 A.11.01.20 doesn't, HP92453-01 B.11.X.35175-35176.GP does) + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if $CC understands -b" >&5 +$as_echo_n "checking if $CC understands -b... " >&6; } +if ${lt_cv_prog_compiler__b+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_prog_compiler__b=no + save_LDFLAGS=$LDFLAGS + LDFLAGS="$LDFLAGS -b" + echo "$lt_simple_link_test_code" > conftest.$ac_ext + if (eval $ac_link 2>conftest.err) && test -s conftest$ac_exeext; then + # The linker can only warn and ignore the option if not recognized + # So say no if there are warnings + if test -s conftest.err; then + # Append any errors to the config.log. + cat conftest.err 1>&5 + $ECHO "$_lt_linker_boilerplate" | $SED '/^$/d' > conftest.exp + $SED '/^$/d; /^ *+/d' conftest.err >conftest.er2 + if diff conftest.exp conftest.er2 >/dev/null; then + lt_cv_prog_compiler__b=yes + fi + else + lt_cv_prog_compiler__b=yes + fi + fi + $RM -r conftest* + LDFLAGS=$save_LDFLAGS + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_prog_compiler__b" >&5 +$as_echo "$lt_cv_prog_compiler__b" >&6; } + +if test yes = "$lt_cv_prog_compiler__b"; then + archive_cmds='$CC -b $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $libobjs $deplibs $compiler_flags' +else + archive_cmds='$LD -b +h $soname +b $install_libdir -o $lib $libobjs $deplibs $linker_flags' +fi + + ;; + esac + fi + if test no = "$with_gnu_ld"; then + hardcode_libdir_flag_spec='$wl+b $wl$libdir' + hardcode_libdir_separator=: + + case $host_cpu in + hppa*64*|ia64*) + hardcode_direct=no + hardcode_shlibpath_var=no + ;; + *) + hardcode_direct=yes + hardcode_direct_absolute=yes + export_dynamic_flag_spec='$wl-E' + + # hardcode_minus_L: Not really in the search PATH, + # but as the default location of the library. 
+ hardcode_minus_L=yes + ;; + esac + fi + ;; + + irix5* | irix6* | nonstopux*) + if test yes = "$GCC"; then + archive_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + # Try to use the -exported_symbol ld option, if it does not + # work, assume that -exports_file does not work either and + # implicitly export all symbols. + # This should be the same for all languages, so no per-tag cache variable. + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether the $host_os linker accepts -exported_symbol" >&5 +$as_echo_n "checking whether the $host_os linker accepts -exported_symbol... " >&6; } +if ${lt_cv_irix_exported_symbol+:} false; then : + $as_echo_n "(cached) " >&6 +else + save_LDFLAGS=$LDFLAGS + LDFLAGS="$LDFLAGS -shared $wl-exported_symbol ${wl}foo $wl-update_registry $wl/dev/null" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +int foo (void) { return 0; } +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + lt_cv_irix_exported_symbol=yes +else + lt_cv_irix_exported_symbol=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + LDFLAGS=$save_LDFLAGS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_irix_exported_symbol" >&5 +$as_echo "$lt_cv_irix_exported_symbol" >&6; } + if test yes = "$lt_cv_irix_exported_symbol"; then + archive_expsym_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' + fi + link_all_deplibs=no + else + archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + archive_expsym_cmds='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' + fi + archive_cmds_need_lc='no' + hardcode_libdir_flag_spec='$wl-rpath $wl$libdir' + hardcode_libdir_separator=: + inherit_rpath=yes + link_all_deplibs=yes + ;; + + linux*) + case $cc_basename in + tcc*) + # Fabrice Bellard et al's Tiny C Compiler + ld_shlibs=yes + archive_cmds='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + ;; + esac + ;; + + netbsd* | netbsdelf*-gnu) + if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then + archive_cmds='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out + else + archive_cmds='$LD -shared -o $lib $libobjs $deplibs $linker_flags' # ELF + fi + hardcode_libdir_flag_spec='-R$libdir' + hardcode_direct=yes + hardcode_shlibpath_var=no + ;; + + newsos6) + archive_cmds='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + hardcode_direct=yes + hardcode_libdir_flag_spec='$wl-rpath $wl$libdir' + hardcode_libdir_separator=: + hardcode_shlibpath_var=no + ;; + + *nto* | *qnx*) + ;; + + openbsd* | bitrig*) + if test -f /usr/libexec/ld.so; then + hardcode_direct=yes + hardcode_shlibpath_var=no + hardcode_direct_absolute=yes + if test -z "`echo __ELF__ | $CC -E - | $GREP __ELF__`"; then + archive_cmds='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + archive_expsym_cmds='$CC -shared $pic_flag -o $lib $libobjs $deplibs 
$compiler_flags $wl-retain-symbols-file,$export_symbols' + hardcode_libdir_flag_spec='$wl-rpath,$libdir' + export_dynamic_flag_spec='$wl-E' + else + archive_cmds='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + hardcode_libdir_flag_spec='$wl-rpath,$libdir' + fi + else + ld_shlibs=no + fi + ;; + + os2*) + hardcode_libdir_flag_spec='-L$libdir' + hardcode_minus_L=yes + allow_undefined_flag=unsupported + shrext_cmds=.dll + archive_cmds='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + emxexp $libobjs | $SED /"_DLL_InitTerm"/d >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + archive_expsym_cmds='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + prefix_cmds="$SED"~ + if test EXPORTS = "`$SED 1q $export_symbols`"; then + prefix_cmds="$prefix_cmds -e 1d"; + fi~ + prefix_cmds="$prefix_cmds -e \"s/^\(.*\)$/_\1/g\""~ + cat $export_symbols | $prefix_cmds >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + old_archive_From_new_cmds='emximp -o $output_objdir/${libname}_dll.a $output_objdir/$libname.def' + enable_shared_with_static_runtimes=yes + ;; + + osf3*) + if test yes = "$GCC"; then + allow_undefined_flag=' $wl-expect_unresolved $wl\*' + archive_cmds='$CC -shared$allow_undefined_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + else + allow_undefined_flag=' -expect_unresolved \*' + archive_cmds='$CC -shared$allow_undefined_flag $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + fi + archive_cmds_need_lc='no' + hardcode_libdir_flag_spec='$wl-rpath $wl$libdir' + hardcode_libdir_separator=: + ;; + + osf4* | osf5*) # as osf3* with the addition of -msym flag + if test yes = "$GCC"; then + allow_undefined_flag=' $wl-expect_unresolved $wl\*' + archive_cmds='$CC -shared$allow_undefined_flag $pic_flag $libobjs $deplibs $compiler_flags $wl-msym $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + hardcode_libdir_flag_spec='$wl-rpath $wl$libdir' + else + allow_undefined_flag=' -expect_unresolved \*' + archive_cmds='$CC -shared$allow_undefined_flag $libobjs $deplibs $compiler_flags -msym -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + archive_expsym_cmds='for i in `cat $export_symbols`; do printf "%s %s\\n" -exported_symbol "\$i" >> $lib.exp; done; printf "%s\\n" "-hidden">> $lib.exp~ + $CC -shared$allow_undefined_flag $wl-input $wl$lib.exp $compiler_flags $libobjs $deplibs -soname $soname `test -n "$verstring" && $ECHO 
"-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib~$RM $lib.exp' + + # Both c and cxx compiler support -rpath directly + hardcode_libdir_flag_spec='-rpath $libdir' + fi + archive_cmds_need_lc='no' + hardcode_libdir_separator=: + ;; + + solaris*) + no_undefined_flag=' -z defs' + if test yes = "$GCC"; then + wlarc='$wl' + archive_cmds='$CC -shared $pic_flag $wl-z ${wl}text $wl-h $wl$soname -o $lib $libobjs $deplibs $compiler_flags' + archive_expsym_cmds='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $CC -shared $pic_flag $wl-z ${wl}text $wl-M $wl$lib.exp $wl-h $wl$soname -o $lib $libobjs $deplibs $compiler_flags~$RM $lib.exp' + else + case `$CC -V 2>&1` in + *"Compilers 5.0"*) + wlarc='' + archive_cmds='$LD -G$allow_undefined_flag -h $soname -o $lib $libobjs $deplibs $linker_flags' + archive_expsym_cmds='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $LD -G$allow_undefined_flag -M $lib.exp -h $soname -o $lib $libobjs $deplibs $linker_flags~$RM $lib.exp' + ;; + *) + wlarc='$wl' + archive_cmds='$CC -G$allow_undefined_flag -h $soname -o $lib $libobjs $deplibs $compiler_flags' + archive_expsym_cmds='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $CC -G$allow_undefined_flag -M $lib.exp -h $soname -o $lib $libobjs $deplibs $compiler_flags~$RM $lib.exp' + ;; + esac + fi + hardcode_libdir_flag_spec='-R$libdir' + hardcode_shlibpath_var=no + case $host_os in + solaris2.[0-5] | solaris2.[0-5].*) ;; + *) + # The compiler driver will combine and reorder linker options, + # but understands '-z linker_flag'. GCC discards it without '$wl', + # but is careful enough not to reorder. + # Supported since Solaris 2.6 (maybe 2.5.1?) + if test yes = "$GCC"; then + whole_archive_flag_spec='$wl-z ${wl}allextract$convenience $wl-z ${wl}defaultextract' + else + whole_archive_flag_spec='-z allextract$convenience -z defaultextract' + fi + ;; + esac + link_all_deplibs=yes + ;; + + sunos4*) + if test sequent = "$host_vendor"; then + # Use $CC to link under sequent, because it throws in some extra .o + # files that make .init and .fini sections work. + archive_cmds='$CC -G $wl-h $soname -o $lib $libobjs $deplibs $compiler_flags' + else + archive_cmds='$LD -assert pure-text -Bstatic -o $lib $libobjs $deplibs $linker_flags' + fi + hardcode_libdir_flag_spec='-L$libdir' + hardcode_direct=yes + hardcode_minus_L=yes + hardcode_shlibpath_var=no + ;; + + sysv4) + case $host_vendor in + sni) + archive_cmds='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + hardcode_direct=yes # is this really true??? + ;; + siemens) + ## LD is ld it makes a PLAMLIB + ## CC just makes a GrossModule. 
+ archive_cmds='$LD -G -o $lib $libobjs $deplibs $linker_flags' + reload_cmds='$CC -r -o $output$reload_objs' + hardcode_direct=no + ;; + motorola) + archive_cmds='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + hardcode_direct=no #Motorola manual says yes, but my tests say they lie + ;; + esac + runpath_var='LD_RUN_PATH' + hardcode_shlibpath_var=no + ;; + + sysv4.3*) + archive_cmds='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + hardcode_shlibpath_var=no + export_dynamic_flag_spec='-Bexport' + ;; + + sysv4*MP*) + if test -d /usr/nec; then + archive_cmds='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + hardcode_shlibpath_var=no + runpath_var=LD_RUN_PATH + hardcode_runpath_var=yes + ld_shlibs=yes + fi + ;; + + sysv4*uw2* | sysv5OpenUNIX* | sysv5UnixWare7.[01].[10]* | unixware7* | sco3.2v5.0.[024]*) + no_undefined_flag='$wl-z,text' + archive_cmds_need_lc=no + hardcode_shlibpath_var=no + runpath_var='LD_RUN_PATH' + + if test yes = "$GCC"; then + archive_cmds='$CC -shared $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + archive_expsym_cmds='$CC -shared $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + else + archive_cmds='$CC -G $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + archive_expsym_cmds='$CC -G $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + fi + ;; + + sysv5* | sco3.2v5* | sco5v6*) + # Note: We CANNOT use -z defs as we might desire, because we do not + # link with -lc, and that would cause any symbols used from libc to + # always be unresolved, which means just about no library would + # ever link correctly. If we're not using GNU ld we use -z text + # though, which does catch some bad symbols but isn't as heavy-handed + # as -z defs. + no_undefined_flag='$wl-z,text' + allow_undefined_flag='$wl-z,nodefs' + archive_cmds_need_lc=no + hardcode_shlibpath_var=no + hardcode_libdir_flag_spec='$wl-R,$libdir' + hardcode_libdir_separator=':' + link_all_deplibs=yes + export_dynamic_flag_spec='$wl-Bexport' + runpath_var='LD_RUN_PATH' + + if test yes = "$GCC"; then + archive_cmds='$CC -shared $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + archive_expsym_cmds='$CC -shared $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + else + archive_cmds='$CC -G $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + archive_expsym_cmds='$CC -G $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + fi + ;; + + uts4*) + archive_cmds='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + hardcode_libdir_flag_spec='-L$libdir' + hardcode_shlibpath_var=no + ;; + + *) + ld_shlibs=no + ;; + esac + + if test sni = "$host_vendor"; then + case $host in + sysv4 | sysv4.2uw2* | sysv4.3* | sysv5*) + export_dynamic_flag_spec='$wl-Blargedynsym' + ;; + esac + fi + fi + +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ld_shlibs" >&5 +$as_echo "$ld_shlibs" >&6; } +test no = "$ld_shlibs" && can_build_shared=no + +with_gnu_ld=$with_gnu_ld + + + + + + + + + + + + + + + +# +# Do we need to explicitly link libc? +# +case "x$archive_cmds_need_lc" in +x|xyes) + # Assume -lc should be added + archive_cmds_need_lc=yes + + if test yes,yes = "$GCC,$enable_shared"; then + case $archive_cmds in + *'~'*) + # FIXME: we may have to deal with multi-command sequences. + ;; + '$CC '*) + # Test whether the compiler implicitly links with -lc since on some + # systems, -lgcc has to come before -lc. 
If gcc already passes -lc + # to ld, don't add -lc before -lgcc. + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether -lc should be explicitly linked in" >&5 +$as_echo_n "checking whether -lc should be explicitly linked in... " >&6; } +if ${lt_cv_archive_cmds_need_lc+:} false; then : + $as_echo_n "(cached) " >&6 +else + $RM conftest* + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_compile\""; } >&5 + (eval $ac_compile) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } 2>conftest.err; then + soname=conftest + lib=conftest + libobjs=conftest.$ac_objext + deplibs= + wl=$lt_prog_compiler_wl + pic_flag=$lt_prog_compiler_pic + compiler_flags=-v + linker_flags=-v + verstring= + output_objdir=. + libname=conftest + lt_save_allow_undefined_flag=$allow_undefined_flag + allow_undefined_flag= + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$archive_cmds 2\>\&1 \| $GREP \" -lc \" \>/dev/null 2\>\&1\""; } >&5 + (eval $archive_cmds 2\>\&1 \| $GREP \" -lc \" \>/dev/null 2\>\&1) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + then + lt_cv_archive_cmds_need_lc=no + else + lt_cv_archive_cmds_need_lc=yes + fi + allow_undefined_flag=$lt_save_allow_undefined_flag + else + cat conftest.err 1>&5 + fi + $RM conftest* + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_archive_cmds_need_lc" >&5 +$as_echo "$lt_cv_archive_cmds_need_lc" >&6; } + archive_cmds_need_lc=$lt_cv_archive_cmds_need_lc + ;; + esac + fi + ;; +esac + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking dynamic linker characteristics" >&5 +$as_echo_n "checking dynamic linker characteristics... " >&6; } + +if test yes = "$GCC"; then + case $host_os in + darwin*) lt_awk_arg='/^libraries:/,/LR/' ;; + *) lt_awk_arg='/^libraries:/' ;; + esac + case $host_os in + mingw* | cegcc*) lt_sed_strip_eq='s|=\([A-Za-z]:\)|\1|g' ;; + *) lt_sed_strip_eq='s|=/|/|g' ;; + esac + lt_search_path_spec=`$CC -print-search-dirs | awk $lt_awk_arg | $SED -e "s/^libraries://" -e $lt_sed_strip_eq` + case $lt_search_path_spec in + *\;*) + # if the path contains ";" then we assume it to be the separator + # otherwise default to the standard path separator (i.e. ":") - it is + # assumed that no part of a normal pathname contains ";" but that should + # okay in the real world where ";" in dirpaths is itself problematic. + lt_search_path_spec=`$ECHO "$lt_search_path_spec" | $SED 's/;/ /g'` + ;; + *) + lt_search_path_spec=`$ECHO "$lt_search_path_spec" | $SED "s/$PATH_SEPARATOR/ /g"` + ;; + esac + # Ok, now we have the path, separated by spaces, we can step through it + # and add multilib dir if necessary... + lt_tmp_lt_search_path_spec= + lt_multi_os_dir=/`$CC $CPPFLAGS $CFLAGS $LDFLAGS -print-multi-os-directory 2>/dev/null` + # ...but if some path component already ends with the multilib dir we assume + # that all is fine and trust -print-search-dirs as is (GCC 4.2? or newer). 
+ case "$lt_multi_os_dir; $lt_search_path_spec " in + "/; "* | "/.; "* | "/./; "* | *"$lt_multi_os_dir "* | *"$lt_multi_os_dir/ "*) + lt_multi_os_dir= + ;; + esac + for lt_sys_path in $lt_search_path_spec; do + if test -d "$lt_sys_path$lt_multi_os_dir"; then + lt_tmp_lt_search_path_spec="$lt_tmp_lt_search_path_spec $lt_sys_path$lt_multi_os_dir" + elif test -n "$lt_multi_os_dir"; then + test -d "$lt_sys_path" && \ + lt_tmp_lt_search_path_spec="$lt_tmp_lt_search_path_spec $lt_sys_path" + fi + done + lt_search_path_spec=`$ECHO "$lt_tmp_lt_search_path_spec" | awk ' +BEGIN {RS = " "; FS = "/|\n";} { + lt_foo = ""; + lt_count = 0; + for (lt_i = NF; lt_i > 0; lt_i--) { + if ($lt_i != "" && $lt_i != ".") { + if ($lt_i == "..") { + lt_count++; + } else { + if (lt_count == 0) { + lt_foo = "/" $lt_i lt_foo; + } else { + lt_count--; + } + } + } + } + if (lt_foo != "") { lt_freq[lt_foo]++; } + if (lt_freq[lt_foo] == 1) { print lt_foo; } +}'` + # AWK program above erroneously prepends '/' to C:/dos/paths + # for these hosts. + case $host_os in + mingw* | cegcc*) lt_search_path_spec=`$ECHO "$lt_search_path_spec" |\ + $SED 's|/\([A-Za-z]:\)|\1|g'` ;; + esac + sys_lib_search_path_spec=`$ECHO "$lt_search_path_spec" | $lt_NL2SP` +else + sys_lib_search_path_spec="/lib /usr/lib /usr/local/lib" +fi +library_names_spec= +libname_spec='lib$name' +soname_spec= +shrext_cmds=.so +postinstall_cmds= +postuninstall_cmds= +finish_cmds= +finish_eval= +shlibpath_var= +shlibpath_overrides_runpath=unknown +version_type=none +dynamic_linker="$host_os ld.so" +sys_lib_dlsearch_path_spec="/lib /usr/lib" +need_lib_prefix=unknown +hardcode_into_libs=no + +# when you set need_version to no, make sure it does not cause -set_version +# flags to be left without arguments +need_version=unknown + + + +case $host_os in +aix3*) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname.a' + shlibpath_var=LIBPATH + + # AIX 3 has no versioning support, so we append a major version to the name. + soname_spec='$libname$release$shared_ext$major' + ;; + +aix[4-9]*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + hardcode_into_libs=yes + if test ia64 = "$host_cpu"; then + # AIX 5 supports IA64 + library_names_spec='$libname$release$shared_ext$major $libname$release$shared_ext$versuffix $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + else + # With GCC up to 2.95.x, collect2 would create an import file + # for dependence libraries. The import file would start with + # the line '#! .'. This would cause the generated library to + # depend on '.', always an invalid library. This was fixed in + # development snapshots of GCC prior to 3.0. + case $host_os in + aix4 | aix4.[01] | aix4.[01].*) + if { echo '#if __GNUC__ > 2 || (__GNUC__ == 2 && __GNUC_MINOR__ >= 97)' + echo ' yes ' + echo '#endif'; } | $CC -E - | $GREP yes > /dev/null; then + : + else + can_build_shared=no + fi + ;; + esac + # Using Import Files as archive members, it is possible to support + # filename-based versioning of shared library archives on AIX. While + # this would work for both with and without runtime linking, it will + # prevent static linking of such archives. So we do filename-based + # shared library versioning with .so extension only, which is used + # when both runtime linking and shared linking is enabled. 
+ # Unfortunately, runtime linking may impact performance, so we do + # not want this to be the default eventually. Also, we use the + # versioned .so libs for executables only if there is the -brtl + # linker flag in LDFLAGS as well, or --with-aix-soname=svr4 only. + # To allow for filename-based versioning support, we need to create + # libNAME.so.V as an archive file, containing: + # *) an Import File, referring to the versioned filename of the + # archive as well as the shared archive member, telling the + # bitwidth (32 or 64) of that shared object, and providing the + # list of exported symbols of that shared object, eventually + # decorated with the 'weak' keyword + # *) the shared object with the F_LOADONLY flag set, to really avoid + # it being seen by the linker. + # At run time we better use the real file rather than another symlink, + # but for link time we create the symlink libNAME.so -> libNAME.so.V + + case $with_aix_soname,$aix_use_runtimelinking in + # AIX (on Power*) has no versioning support, so currently we cannot hardcode correct + # soname into executable. Probably we can add versioning support to + # collect2, so additional links can be useful in future. + aix,yes) # traditional libtool + dynamic_linker='AIX unversionable lib.so' + # If using run time linking (on AIX 4.2 or later) use lib.so + # instead of lib.a to let people know that these are not + # typical AIX shared libraries. + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + ;; + aix,no) # traditional AIX only + dynamic_linker='AIX lib.a(lib.so.V)' + # We preserve .a as extension for shared libraries through AIX4.2 + # and later when we are not doing run time linking. + library_names_spec='$libname$release.a $libname.a' + soname_spec='$libname$release$shared_ext$major' + ;; + svr4,*) # full svr4 only + dynamic_linker="AIX lib.so.V($shared_archive_member_spec.o)" + library_names_spec='$libname$release$shared_ext$major $libname$shared_ext' + # We do not specify a path in Import Files, so LIBPATH fires. + shlibpath_overrides_runpath=yes + ;; + *,yes) # both, prefer svr4 + dynamic_linker="AIX lib.so.V($shared_archive_member_spec.o), lib.a(lib.so.V)" + library_names_spec='$libname$release$shared_ext$major $libname$shared_ext' + # unpreferred sharedlib libNAME.a needs extra handling + postinstall_cmds='test -n "$linkname" || linkname="$realname"~func_stripname "" ".so" "$linkname"~$install_shared_prog "$dir/$func_stripname_result.$libext" "$destdir/$func_stripname_result.$libext"~test -z "$tstripme" || test -z "$striplib" || $striplib "$destdir/$func_stripname_result.$libext"' + postuninstall_cmds='for n in $library_names $old_library; do :; done~func_stripname "" ".so" "$n"~test "$func_stripname_result" = "$n" || func_append rmfiles " $odir/$func_stripname_result.$libext"' + # We do not specify a path in Import Files, so LIBPATH fires. 
+ shlibpath_overrides_runpath=yes + ;; + *,no) # both, prefer aix + dynamic_linker="AIX lib.a(lib.so.V), lib.so.V($shared_archive_member_spec.o)" + library_names_spec='$libname$release.a $libname.a' + soname_spec='$libname$release$shared_ext$major' + # unpreferred sharedlib libNAME.so.V and symlink libNAME.so need extra handling + postinstall_cmds='test -z "$dlname" || $install_shared_prog $dir/$dlname $destdir/$dlname~test -z "$tstripme" || test -z "$striplib" || $striplib $destdir/$dlname~test -n "$linkname" || linkname=$realname~func_stripname "" ".a" "$linkname"~(cd "$destdir" && $LN_S -f $dlname $func_stripname_result.so)' + postuninstall_cmds='test -z "$dlname" || func_append rmfiles " $odir/$dlname"~for n in $old_library $library_names; do :; done~func_stripname "" ".a" "$n"~func_append rmfiles " $odir/$func_stripname_result.so"' + ;; + esac + shlibpath_var=LIBPATH + fi + ;; + +amigaos*) + case $host_cpu in + powerpc) + # Since July 2007 AmigaOS4 officially supports .so libraries. + # When compiling the executable, add -use-dynld -Lsobjs: to the compileline. + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + ;; + m68k) + library_names_spec='$libname.ixlibrary $libname.a' + # Create ${libname}_ixlibrary.a entries in /sys/libs. + finish_eval='for lib in `ls $libdir/*.ixlibrary 2>/dev/null`; do libname=`func_echo_all "$lib" | $SED '\''s%^.*/\([^/]*\)\.ixlibrary$%\1%'\''`; $RM /sys/libs/${libname}_ixlibrary.a; $show "cd /sys/libs && $LN_S $lib ${libname}_ixlibrary.a"; cd /sys/libs && $LN_S $lib ${libname}_ixlibrary.a || exit 1; done' + ;; + esac + ;; + +beos*) + library_names_spec='$libname$shared_ext' + dynamic_linker="$host_os ld.so" + shlibpath_var=LIBRARY_PATH + ;; + +bsdi[45]*) + version_type=linux # correct to gnu/linux during the next big refactor + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + finish_cmds='PATH="\$PATH:/sbin" ldconfig $libdir' + shlibpath_var=LD_LIBRARY_PATH + sys_lib_search_path_spec="/shlib /usr/lib /usr/X11/lib /usr/contrib/lib /lib /usr/local/lib" + sys_lib_dlsearch_path_spec="/shlib /usr/lib /usr/local/lib" + # the default ld.so.conf also contains /usr/contrib/lib and + # /usr/X11R6/lib (/usr/X11 is a link to /usr/X11R6), but let us allow + # libtool to hard-code these into programs + ;; + +cygwin* | mingw* | pw32* | cegcc*) + version_type=windows + shrext_cmds=.dll + need_version=no + need_lib_prefix=no + + case $GCC,$cc_basename in + yes,*) + # gcc + library_names_spec='$libname.dll.a' + # DLL is installed to $(libdir)/../bin by postinstall_cmds + postinstall_cmds='base_file=`basename \$file`~ + dlpath=`$SHELL 2>&1 -c '\''. $dir/'\''\$base_file'\''i; echo \$dlname'\''`~ + dldir=$destdir/`dirname \$dlpath`~ + test -d \$dldir || mkdir -p \$dldir~ + $install_prog $dir/$dlname \$dldir/$dlname~ + chmod a+x \$dldir/$dlname~ + if test -n '\''$stripme'\'' && test -n '\''$striplib'\''; then + eval '\''$striplib \$dldir/$dlname'\'' || exit \$?; + fi' + postuninstall_cmds='dldll=`$SHELL 2>&1 -c '\''. 
$file; echo \$dlname'\''`~ + dlpath=$dir/\$dldll~ + $RM \$dlpath' + shlibpath_overrides_runpath=yes + + case $host_os in + cygwin*) + # Cygwin DLLs use 'cyg' prefix rather than 'lib' + soname_spec='`echo $libname | sed -e 's/^lib/cyg/'``echo $release | $SED -e 's/[.]/-/g'`$versuffix$shared_ext' + + sys_lib_search_path_spec="$sys_lib_search_path_spec /usr/lib/w32api" + ;; + mingw* | cegcc*) + # MinGW DLLs use traditional 'lib' prefix + soname_spec='$libname`echo $release | $SED -e 's/[.]/-/g'`$versuffix$shared_ext' + ;; + pw32*) + # pw32 DLLs use 'pw' prefix rather than 'lib' + library_names_spec='`echo $libname | sed -e 's/^lib/pw/'``echo $release | $SED -e 's/[.]/-/g'`$versuffix$shared_ext' + ;; + esac + dynamic_linker='Win32 ld.exe' + ;; + + *,cl*) + # Native MSVC + libname_spec='$name' + soname_spec='$libname`echo $release | $SED -e 's/[.]/-/g'`$versuffix$shared_ext' + library_names_spec='$libname.dll.lib' + + case $build_os in + mingw*) + sys_lib_search_path_spec= + lt_save_ifs=$IFS + IFS=';' + for lt_path in $LIB + do + IFS=$lt_save_ifs + # Let DOS variable expansion print the short 8.3 style file name. + lt_path=`cd "$lt_path" 2>/dev/null && cmd //C "for %i in (".") do @echo %~si"` + sys_lib_search_path_spec="$sys_lib_search_path_spec $lt_path" + done + IFS=$lt_save_ifs + # Convert to MSYS style. + sys_lib_search_path_spec=`$ECHO "$sys_lib_search_path_spec" | sed -e 's|\\\\|/|g' -e 's| \\([a-zA-Z]\\):| /\\1|g' -e 's|^ ||'` + ;; + cygwin*) + # Convert to unix form, then to dos form, then back to unix form + # but this time dos style (no spaces!) so that the unix form looks + # like /cygdrive/c/PROGRA~1:/cygdr... + sys_lib_search_path_spec=`cygpath --path --unix "$LIB"` + sys_lib_search_path_spec=`cygpath --path --dos "$sys_lib_search_path_spec" 2>/dev/null` + sys_lib_search_path_spec=`cygpath --path --unix "$sys_lib_search_path_spec" | $SED -e "s/$PATH_SEPARATOR/ /g"` + ;; + *) + sys_lib_search_path_spec=$LIB + if $ECHO "$sys_lib_search_path_spec" | $GREP ';[c-zC-Z]:/' >/dev/null; then + # It is most probably a Windows format PATH. + sys_lib_search_path_spec=`$ECHO "$sys_lib_search_path_spec" | $SED -e 's/;/ /g'` + else + sys_lib_search_path_spec=`$ECHO "$sys_lib_search_path_spec" | $SED -e "s/$PATH_SEPARATOR/ /g"` + fi + # FIXME: find the short name or the path components, as spaces are + # common. (e.g. "Program Files" -> "PROGRA~1") + ;; + esac + + # DLL is installed to $(libdir)/../bin by postinstall_cmds + postinstall_cmds='base_file=`basename \$file`~ + dlpath=`$SHELL 2>&1 -c '\''. $dir/'\''\$base_file'\''i; echo \$dlname'\''`~ + dldir=$destdir/`dirname \$dlpath`~ + test -d \$dldir || mkdir -p \$dldir~ + $install_prog $dir/$dlname \$dldir/$dlname' + postuninstall_cmds='dldll=`$SHELL 2>&1 -c '\''. $file; echo \$dlname'\''`~ + dlpath=$dir/\$dldll~ + $RM \$dlpath' + shlibpath_overrides_runpath=yes + dynamic_linker='Win32 link.exe' + ;; + + *) + # Assume MSVC wrapper + library_names_spec='$libname`echo $release | $SED -e 's/[.]/-/g'`$versuffix$shared_ext $libname.lib' + dynamic_linker='Win32 ld.exe' + ;; + esac + # FIXME: first we should search . 
and the directory the executable is in + shlibpath_var=PATH + ;; + +darwin* | rhapsody*) + dynamic_linker="$host_os dyld" + version_type=darwin + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$major$shared_ext $libname$shared_ext' + soname_spec='$libname$release$major$shared_ext' + shlibpath_overrides_runpath=yes + shlibpath_var=DYLD_LIBRARY_PATH + shrext_cmds='`test .$module = .yes && echo .so || echo .dylib`' + + sys_lib_search_path_spec="$sys_lib_search_path_spec /usr/local/lib" + sys_lib_dlsearch_path_spec='/usr/local/lib /lib /usr/lib' + ;; + +dgux*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + ;; + +freebsd* | dragonfly*) + # DragonFly does not have aout. When/if they implement a new + # versioning mechanism, adjust this. + if test -x /usr/bin/objformat; then + objformat=`/usr/bin/objformat` + else + case $host_os in + freebsd[23].*) objformat=aout ;; + *) objformat=elf ;; + esac + fi + version_type=freebsd-$objformat + case $version_type in + freebsd-elf*) + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + need_version=no + need_lib_prefix=no + ;; + freebsd-*) + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + need_version=yes + ;; + esac + shlibpath_var=LD_LIBRARY_PATH + case $host_os in + freebsd2.*) + shlibpath_overrides_runpath=yes + ;; + freebsd3.[01]* | freebsdelf3.[01]*) + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + ;; + freebsd3.[2-9]* | freebsdelf3.[2-9]* | \ + freebsd4.[0-5] | freebsdelf4.[0-5] | freebsd4.1.1 | freebsdelf4.1.1) + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + ;; + *) # from 4.6 on, and DragonFly + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + ;; + esac + ;; + +haiku*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + dynamic_linker="$host_os runtime_loader" + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LIBRARY_PATH + shlibpath_overrides_runpath=no + sys_lib_dlsearch_path_spec='/boot/home/config/lib /boot/common/lib /boot/system/lib' + hardcode_into_libs=yes + ;; + +hpux9* | hpux10* | hpux11*) + # Give a soname corresponding to the major version so that dld.sl refuses to + # link against other versions. + version_type=sunos + need_lib_prefix=no + need_version=no + case $host_cpu in + ia64*) + shrext_cmds='.so' + hardcode_into_libs=yes + dynamic_linker="$host_os dld.so" + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes # Unless +noenvvar is specified. 
+ library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + if test 32 = "$HPUX_IA64_MODE"; then + sys_lib_search_path_spec="/usr/lib/hpux32 /usr/local/lib/hpux32 /usr/local/lib" + sys_lib_dlsearch_path_spec=/usr/lib/hpux32 + else + sys_lib_search_path_spec="/usr/lib/hpux64 /usr/local/lib/hpux64" + sys_lib_dlsearch_path_spec=/usr/lib/hpux64 + fi + ;; + hppa*64*) + shrext_cmds='.sl' + hardcode_into_libs=yes + dynamic_linker="$host_os dld.sl" + shlibpath_var=LD_LIBRARY_PATH # How should we handle SHLIB_PATH + shlibpath_overrides_runpath=yes # Unless +noenvvar is specified. + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + sys_lib_search_path_spec="/usr/lib/pa20_64 /usr/ccs/lib/pa20_64" + sys_lib_dlsearch_path_spec=$sys_lib_search_path_spec + ;; + *) + shrext_cmds='.sl' + dynamic_linker="$host_os dld.sl" + shlibpath_var=SHLIB_PATH + shlibpath_overrides_runpath=no # +s is required to enable SHLIB_PATH + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + ;; + esac + # HP-UX runs *really* slowly unless shared libraries are mode 555, ... + postinstall_cmds='chmod 555 $lib' + # or fails outright, so override atomically: + install_override_mode=555 + ;; + +interix[3-9]*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + dynamic_linker='Interix 3.x ld.so.1 (PE, like ELF)' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + ;; + +irix5* | irix6* | nonstopux*) + case $host_os in + nonstopux*) version_type=nonstopux ;; + *) + if test yes = "$lt_cv_prog_gnu_ld"; then + version_type=linux # correct to gnu/linux during the next big refactor + else + version_type=irix + fi ;; + esac + need_lib_prefix=no + need_version=no + soname_spec='$libname$release$shared_ext$major' + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$release$shared_ext $libname$shared_ext' + case $host_os in + irix5* | nonstopux*) + libsuff= shlibsuff= + ;; + *) + case $LD in # libtool.m4 will add one of these switches to LD + *-32|*"-32 "|*-melf32bsmip|*"-melf32bsmip ") + libsuff= shlibsuff= libmagic=32-bit;; + *-n32|*"-n32 "|*-melf32bmipn32|*"-melf32bmipn32 ") + libsuff=32 shlibsuff=N32 libmagic=N32;; + *-64|*"-64 "|*-melf64bmip|*"-melf64bmip ") + libsuff=64 shlibsuff=64 libmagic=64-bit;; + *) libsuff= shlibsuff= libmagic=never-match;; + esac + ;; + esac + shlibpath_var=LD_LIBRARY${shlibsuff}_PATH + shlibpath_overrides_runpath=no + sys_lib_search_path_spec="/usr/lib$libsuff /lib$libsuff /usr/local/lib$libsuff" + sys_lib_dlsearch_path_spec="/usr/lib$libsuff /lib$libsuff" + hardcode_into_libs=yes + ;; + +# No shared lib support for Linux oldld, aout, or coff. +linux*oldld* | linux*aout* | linux*coff*) + dynamic_linker=no + ;; + +linux*android*) + version_type=none # Android doesn't support versioned libraries. 
+ need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext' + soname_spec='$libname$release$shared_ext' + finish_cmds= + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + + # This implies no fast_install, which is unacceptable. + # Some rework will be needed to allow for fast_install + # before this can be enabled. + hardcode_into_libs=yes + + dynamic_linker='Android linker' + # Don't embed -rpath directories since the linker doesn't support them. + hardcode_libdir_flag_spec='-L$libdir' + ;; + +# This must be glibc/ELF. +linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + finish_cmds='PATH="\$PATH:/sbin" ldconfig -n $libdir' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + + # Some binutils ld are patched to set DT_RUNPATH + if ${lt_cv_shlibpath_overrides_runpath+:} false; then : + $as_echo_n "(cached) " >&6 +else + lt_cv_shlibpath_overrides_runpath=no + save_LDFLAGS=$LDFLAGS + save_libdir=$libdir + eval "libdir=/foo; wl=\"$lt_prog_compiler_wl\"; \ + LDFLAGS=\"\$LDFLAGS $hardcode_libdir_flag_spec\"" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + if ($OBJDUMP -p conftest$ac_exeext) 2>/dev/null | grep "RUNPATH.*$libdir" >/dev/null; then : + lt_cv_shlibpath_overrides_runpath=yes +fi +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + LDFLAGS=$save_LDFLAGS + libdir=$save_libdir + +fi + + shlibpath_overrides_runpath=$lt_cv_shlibpath_overrides_runpath + + # This implies no fast_install, which is unacceptable. + # Some rework will be needed to allow for fast_install + # before this can be enabled. + hardcode_into_libs=yes + + # Ideally, we could use ldconfig to report *all* directores which are + # searched for libraries, however this is still not possible. Aside from not + # being certain /sbin/ldconfig is available, command + # 'ldconfig -N -X -v | grep ^/' on 64bit Fedora does not report /usr/lib64, + # even though it is searched at run-time. Try to do the best guess by + # appending ld.so.conf contents (and includes) to the search path. + if test -f /etc/ld.so.conf; then + lt_ld_extra=`awk '/^include / { system(sprintf("cd /etc; cat %s 2>/dev/null", \$2)); skip = 1; } { if (!skip) print \$0; skip = 0; }' < /etc/ld.so.conf | $SED -e 's/#.*//;/^[ ]*hwcap[ ]/d;s/[:, ]/ /g;s/=[^=]*$//;s/=[^= ]* / /g;s/"//g;/^$/d' | tr '\n' ' '` + sys_lib_dlsearch_path_spec="/lib /usr/lib $lt_ld_extra" + fi + + # We used to test for /lib/ld.so.1 and disable shared libraries on + # powerpc, because MkLinux only supported shared libraries with the + # GNU dynamic linker. Since this was broken with cross compilers, + # most powerpc-linux boxes support dynamic linking these days and + # people can always --disable-shared, the test was removed, and we + # assume the GNU/Linux dynamic linker is in use. 
+ dynamic_linker='GNU/Linux ld.so' + ;; + +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + +netbsd*) + version_type=sunos + need_lib_prefix=no + need_version=no + if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + finish_cmds='PATH="\$PATH:/sbin" ldconfig -m $libdir' + dynamic_linker='NetBSD (a.out) ld.so' + else + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + dynamic_linker='NetBSD ld.elf_so' + fi + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + ;; + +newsos6) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + ;; + +*nto* | *qnx*) + version_type=qnx + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='ldqnx.so' + ;; + +openbsd* | bitrig*) + version_type=sunos + sys_lib_dlsearch_path_spec=/usr/lib + need_lib_prefix=no + if test -z "`echo __ELF__ | $CC -E - | $GREP __ELF__`"; then + need_version=no + else + need_version=yes + fi + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + finish_cmds='PATH="\$PATH:/sbin" ldconfig -m $libdir' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + ;; + +os2*) + libname_spec='$name' + version_type=windows + shrext_cmds=.dll + need_version=no + need_lib_prefix=no + # OS/2 can only load a DLL with a base name of 8 characters or less. + soname_spec='`test -n "$os2dllname" && libname="$os2dllname"; + v=$($ECHO $release$versuffix | tr -d .-); + n=$($ECHO $libname | cut -b -$((8 - ${#v})) | tr . _); + $ECHO $n$v`$shared_ext' + library_names_spec='${libname}_dll.$libext' + dynamic_linker='OS/2 ld.exe' + shlibpath_var=BEGINLIBPATH + sys_lib_search_path_spec="/lib /usr/lib /usr/local/lib" + sys_lib_dlsearch_path_spec=$sys_lib_search_path_spec + postinstall_cmds='base_file=`basename \$file`~ + dlpath=`$SHELL 2>&1 -c '\''. $dir/'\''\$base_file'\''i; $ECHO \$dlname'\''`~ + dldir=$destdir/`dirname \$dlpath`~ + test -d \$dldir || mkdir -p \$dldir~ + $install_prog $dir/$dlname \$dldir/$dlname~ + chmod a+x \$dldir/$dlname~ + if test -n '\''$stripme'\'' && test -n '\''$striplib'\''; then + eval '\''$striplib \$dldir/$dlname'\'' || exit \$?; + fi' + postuninstall_cmds='dldll=`$SHELL 2>&1 -c '\''. 
$file; $ECHO \$dlname'\''`~ + dlpath=$dir/\$dldll~ + $RM \$dlpath' + ;; + +osf3* | osf4* | osf5*) + version_type=osf + need_lib_prefix=no + need_version=no + soname_spec='$libname$release$shared_ext$major' + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + sys_lib_search_path_spec="/usr/shlib /usr/ccs/lib /usr/lib/cmplrs/cc /usr/lib /usr/local/lib /var/shlib" + sys_lib_dlsearch_path_spec=$sys_lib_search_path_spec + ;; + +rdos*) + dynamic_linker=no + ;; + +solaris*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + # ldd complains unless libraries are executable + postinstall_cmds='chmod +x $lib' + ;; + +sunos4*) + version_type=sunos + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + finish_cmds='PATH="\$PATH:/usr/etc" ldconfig $libdir' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + if test yes = "$with_gnu_ld"; then + need_lib_prefix=no + fi + need_version=yes + ;; + +sysv4 | sysv4.3*) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + case $host_vendor in + sni) + shlibpath_overrides_runpath=no + need_lib_prefix=no + runpath_var=LD_RUN_PATH + ;; + siemens) + need_lib_prefix=no + ;; + motorola) + need_lib_prefix=no + need_version=no + shlibpath_overrides_runpath=no + sys_lib_search_path_spec='/lib /usr/lib /usr/ccs/lib' + ;; + esac + ;; + +sysv4*MP*) + if test -d /usr/nec; then + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$shared_ext.$versuffix $libname$shared_ext.$major $libname$shared_ext' + soname_spec='$libname$shared_ext.$major' + shlibpath_var=LD_LIBRARY_PATH + fi + ;; + +sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX* | sysv4*uw2*) + version_type=sco + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + if test yes = "$with_gnu_ld"; then + sys_lib_search_path_spec='/usr/local/lib /usr/gnu/lib /usr/ccs/lib /usr/lib /lib' + else + sys_lib_search_path_spec='/usr/ccs/lib /usr/lib' + case $host_os in + sco3.2v5*) + sys_lib_search_path_spec="$sys_lib_search_path_spec /lib" + ;; + esac + fi + sys_lib_dlsearch_path_spec='/usr/lib' + ;; + +tpf*) + # TPF is a cross-target only. Preferred cross-host = GNU/Linux. 
+ version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + ;; + +uts4*) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + ;; + +*) + dynamic_linker=no + ;; +esac +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $dynamic_linker" >&5 +$as_echo "$dynamic_linker" >&6; } +test no = "$dynamic_linker" && can_build_shared=no + +variables_saved_for_relink="PATH $shlibpath_var $runpath_var" +if test yes = "$GCC"; then + variables_saved_for_relink="$variables_saved_for_relink GCC_EXEC_PREFIX COMPILER_PATH LIBRARY_PATH" +fi + +if test set = "${lt_cv_sys_lib_search_path_spec+set}"; then + sys_lib_search_path_spec=$lt_cv_sys_lib_search_path_spec +fi + +if test set = "${lt_cv_sys_lib_dlsearch_path_spec+set}"; then + sys_lib_dlsearch_path_spec=$lt_cv_sys_lib_dlsearch_path_spec +fi + +# remember unaugmented sys_lib_dlsearch_path content for libtool script decls... +configure_time_dlsearch_path=$sys_lib_dlsearch_path_spec + +# ... but it needs LT_SYS_LIBRARY_PATH munging for other configure-time code +func_munge_path_list sys_lib_dlsearch_path_spec "$LT_SYS_LIBRARY_PATH" + +# to be used as default LT_SYS_LIBRARY_PATH value in generated libtool +configure_time_lt_sys_library_path=$LT_SYS_LIBRARY_PATH + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking how to hardcode library paths into programs" >&5 +$as_echo_n "checking how to hardcode library paths into programs... " >&6; } +hardcode_action= +if test -n "$hardcode_libdir_flag_spec" || + test -n "$runpath_var" || + test yes = "$hardcode_automatic"; then + + # We can hardcode non-existent directories. + if test no != "$hardcode_direct" && + # If the only mechanism to avoid hardcoding is shlibpath_var, we + # have to relink, otherwise we might link with an installed library + # when we should be linking with a yet-to-be-installed one + ## test no != "$_LT_TAGVAR(hardcode_shlibpath_var, )" && + test no != "$hardcode_minus_L"; then + # Linking always hardcodes the temporary library directory. + hardcode_action=relink + else + # We can link without hardcoding, and we can hardcode nonexisting dirs. + hardcode_action=immediate + fi +else + # We cannot hardcode anything, or else we can only hardcode existing + # directories. 
+ hardcode_action=unsupported +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $hardcode_action" >&5 +$as_echo "$hardcode_action" >&6; } + +if test relink = "$hardcode_action" || + test yes = "$inherit_rpath"; then + # Fast installation is not supported + enable_fast_install=no +elif test yes = "$shlibpath_overrides_runpath" || + test no = "$enable_shared"; then + # Fast installation is not necessary + enable_fast_install=needless +fi + + + + + + + if test yes != "$enable_dlopen"; then + enable_dlopen=unknown + enable_dlopen_self=unknown + enable_dlopen_self_static=unknown +else + lt_cv_dlopen=no + lt_cv_dlopen_libs= + + case $host_os in + beos*) + lt_cv_dlopen=load_add_on + lt_cv_dlopen_libs= + lt_cv_dlopen_self=yes + ;; + + mingw* | pw32* | cegcc*) + lt_cv_dlopen=LoadLibrary + lt_cv_dlopen_libs= + ;; + + cygwin*) + lt_cv_dlopen=dlopen + lt_cv_dlopen_libs= + ;; + + darwin*) + # if libdl is installed we need to link against it + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for dlopen in -ldl" >&5 +$as_echo_n "checking for dlopen in -ldl... " >&6; } +if ${ac_cv_lib_dl_dlopen+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-ldl $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +#ifdef __cplusplus +extern "C" +#endif +char dlopen (); +int +main () +{ +return dlopen (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_dl_dlopen=yes +else + ac_cv_lib_dl_dlopen=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dl_dlopen" >&5 +$as_echo "$ac_cv_lib_dl_dlopen" >&6; } +if test "x$ac_cv_lib_dl_dlopen" = xyes; then : + lt_cv_dlopen=dlopen lt_cv_dlopen_libs=-ldl +else + + lt_cv_dlopen=dyld + lt_cv_dlopen_libs= + lt_cv_dlopen_self=yes + +fi + + ;; + + tpf*) + # Don't try to run any link tests for TPF. We know it's impossible + # because TPF is a cross-compiler, and we know how we open DSOs. + lt_cv_dlopen=dlopen + lt_cv_dlopen_libs= + lt_cv_dlopen_self=no + ;; + + *) + ac_fn_c_check_func "$LINENO" "shl_load" "ac_cv_func_shl_load" +if test "x$ac_cv_func_shl_load" = xyes; then : + lt_cv_dlopen=shl_load +else + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for shl_load in -ldld" >&5 +$as_echo_n "checking for shl_load in -ldld... " >&6; } +if ${ac_cv_lib_dld_shl_load+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-ldld $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. 
*/ +#ifdef __cplusplus +extern "C" +#endif +char shl_load (); +int +main () +{ +return shl_load (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_dld_shl_load=yes +else + ac_cv_lib_dld_shl_load=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dld_shl_load" >&5 +$as_echo "$ac_cv_lib_dld_shl_load" >&6; } +if test "x$ac_cv_lib_dld_shl_load" = xyes; then : + lt_cv_dlopen=shl_load lt_cv_dlopen_libs=-ldld +else + ac_fn_c_check_func "$LINENO" "dlopen" "ac_cv_func_dlopen" +if test "x$ac_cv_func_dlopen" = xyes; then : + lt_cv_dlopen=dlopen +else + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for dlopen in -ldl" >&5 +$as_echo_n "checking for dlopen in -ldl... " >&6; } +if ${ac_cv_lib_dl_dlopen+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-ldl $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +#ifdef __cplusplus +extern "C" +#endif +char dlopen (); +int +main () +{ +return dlopen (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_dl_dlopen=yes +else + ac_cv_lib_dl_dlopen=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dl_dlopen" >&5 +$as_echo "$ac_cv_lib_dl_dlopen" >&6; } +if test "x$ac_cv_lib_dl_dlopen" = xyes; then : + lt_cv_dlopen=dlopen lt_cv_dlopen_libs=-ldl +else + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for dlopen in -lsvld" >&5 +$as_echo_n "checking for dlopen in -lsvld... " >&6; } +if ${ac_cv_lib_svld_dlopen+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-lsvld $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +#ifdef __cplusplus +extern "C" +#endif +char dlopen (); +int +main () +{ +return dlopen (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_svld_dlopen=yes +else + ac_cv_lib_svld_dlopen=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_svld_dlopen" >&5 +$as_echo "$ac_cv_lib_svld_dlopen" >&6; } +if test "x$ac_cv_lib_svld_dlopen" = xyes; then : + lt_cv_dlopen=dlopen lt_cv_dlopen_libs=-lsvld +else + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for dld_link in -ldld" >&5 +$as_echo_n "checking for dld_link in -ldld... " >&6; } +if ${ac_cv_lib_dld_dld_link+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-ldld $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. 
*/ +#ifdef __cplusplus +extern "C" +#endif +char dld_link (); +int +main () +{ +return dld_link (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_dld_dld_link=yes +else + ac_cv_lib_dld_dld_link=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dld_dld_link" >&5 +$as_echo "$ac_cv_lib_dld_dld_link" >&6; } +if test "x$ac_cv_lib_dld_dld_link" = xyes; then : + lt_cv_dlopen=dld_link lt_cv_dlopen_libs=-ldld +fi + + +fi + + +fi + + +fi + + +fi + + +fi + + ;; + esac + + if test no = "$lt_cv_dlopen"; then + enable_dlopen=no + else + enable_dlopen=yes + fi + + case $lt_cv_dlopen in + dlopen) + save_CPPFLAGS=$CPPFLAGS + test yes = "$ac_cv_header_dlfcn_h" && CPPFLAGS="$CPPFLAGS -DHAVE_DLFCN_H" + + save_LDFLAGS=$LDFLAGS + wl=$lt_prog_compiler_wl eval LDFLAGS=\"\$LDFLAGS $export_dynamic_flag_spec\" + + save_LIBS=$LIBS + LIBS="$lt_cv_dlopen_libs $LIBS" + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether a program can dlopen itself" >&5 +$as_echo_n "checking whether a program can dlopen itself... " >&6; } +if ${lt_cv_dlopen_self+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test yes = "$cross_compiling"; then : + lt_cv_dlopen_self=cross +else + lt_dlunknown=0; lt_dlno_uscore=1; lt_dlneed_uscore=2 + lt_status=$lt_dlunknown + cat > conftest.$ac_ext <<_LT_EOF +#line $LINENO "configure" +#include "confdefs.h" + +#if HAVE_DLFCN_H +#include +#endif + +#include + +#ifdef RTLD_GLOBAL +# define LT_DLGLOBAL RTLD_GLOBAL +#else +# ifdef DL_GLOBAL +# define LT_DLGLOBAL DL_GLOBAL +# else +# define LT_DLGLOBAL 0 +# endif +#endif + +/* We may have to define LT_DLLAZY_OR_NOW in the command line if we + find out it does not work in some platform. */ +#ifndef LT_DLLAZY_OR_NOW +# ifdef RTLD_LAZY +# define LT_DLLAZY_OR_NOW RTLD_LAZY +# else +# ifdef DL_LAZY +# define LT_DLLAZY_OR_NOW DL_LAZY +# else +# ifdef RTLD_NOW +# define LT_DLLAZY_OR_NOW RTLD_NOW +# else +# ifdef DL_NOW +# define LT_DLLAZY_OR_NOW DL_NOW +# else +# define LT_DLLAZY_OR_NOW 0 +# endif +# endif +# endif +# endif +#endif + +/* When -fvisibility=hidden is used, assume the code has been annotated + correspondingly for the symbols needed. */ +#if defined __GNUC__ && (((__GNUC__ == 3) && (__GNUC_MINOR__ >= 3)) || (__GNUC__ > 3)) +int fnord () __attribute__((visibility("default"))); +#endif + +int fnord () { return 42; } +int main () +{ + void *self = dlopen (0, LT_DLGLOBAL|LT_DLLAZY_OR_NOW); + int status = $lt_dlunknown; + + if (self) + { + if (dlsym (self,"fnord")) status = $lt_dlno_uscore; + else + { + if (dlsym( self,"_fnord")) status = $lt_dlneed_uscore; + else puts (dlerror ()); + } + /* dlclose (self); */ + } + else + puts (dlerror ()); + + return status; +} +_LT_EOF + if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_link\""; } >&5 + (eval $ac_link) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } && test -s "conftest$ac_exeext" 2>/dev/null; then + (./conftest; exit; ) >&5 2>/dev/null + lt_status=$? 
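[Editor's note: the self-dlopen probe that configure compiles and runs above boils down to the standalone C program below. This is a minimal sketch, not the literal conftest source: it assumes the usual <dlfcn.h> and <stdio.h> headers and glibc-style RTLD_* flags, and it should be built roughly as `cc -rdynamic self.c -ldl`, which mirrors the $export_dynamic_flag_spec and $lt_cv_dlopen_libs handling in the fragment above. Exit status 1 means plain symbol names resolve through dlopen(NULL), 2 means a leading underscore is required, 0 means the probe could not tell.]

    /* Minimal sketch of libtool's "can a program dlopen itself?" probe.
     * dlopen(NULL, ...) yields a handle for the running executable; if
     * dlsym() can then find our own symbol "fnord" (or "_fnord"), the
     * program can dlopen itself.  Exit codes mirror lt_dlno_uscore=1,
     * lt_dlneed_uscore=2 and lt_dlunknown=0 used by configure above. */
    #include <dlfcn.h>
    #include <stdio.h>

    int fnord (void) { return 42; }

    int
    main (void)
    {
      void *self = dlopen (0, RTLD_GLOBAL | RTLD_LAZY);
      int status = 0;                      /* unknown */

      if (self)
        {
          if (dlsym (self, "fnord"))
            status = 1;                    /* no leading underscore needed */
          else if (dlsym (self, "_fnord"))
            status = 2;                    /* leading underscore needed */
          else
            puts (dlerror ());
          dlclose (self);
        }
      else
        puts (dlerror ());

      return status;
    }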
+  case x$lt_status in
+  x$lt_dlno_uscore) lt_cv_dlopen_self=yes ;;
+  x$lt_dlneed_uscore) lt_cv_dlopen_self=yes ;;
+  x$lt_dlunknown|x*) lt_cv_dlopen_self=no ;;
+  esac
+  else :
+    # compilation failed
+    lt_cv_dlopen_self=no
+  fi
+fi
+rm -fr conftest*
+
+
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_dlopen_self" >&5
+$as_echo "$lt_cv_dlopen_self" >&6; }
+
+    if test yes = "$lt_cv_dlopen_self"; then
+      wl=$lt_prog_compiler_wl eval LDFLAGS=\"\$LDFLAGS $lt_prog_compiler_static\"
+      { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether a statically linked program can dlopen itself" >&5
+$as_echo_n "checking whether a statically linked program can dlopen itself... " >&6; }
+if ${lt_cv_dlopen_self_static+:} false; then :
+  $as_echo_n "(cached) " >&6
+else
+  if test yes = "$cross_compiling"; then :
+  lt_cv_dlopen_self_static=cross
+else
+  lt_dlunknown=0; lt_dlno_uscore=1; lt_dlneed_uscore=2
+  lt_status=$lt_dlunknown
+  cat > conftest.$ac_ext <<_LT_EOF
+#line $LINENO "configure"
+#include "confdefs.h"
+
+#if HAVE_DLFCN_H
+#include <dlfcn.h>
+#endif
+
+#include <stdio.h>
+
+#ifdef RTLD_GLOBAL
+# define LT_DLGLOBAL RTLD_GLOBAL
+#else
+# ifdef DL_GLOBAL
+#  define LT_DLGLOBAL DL_GLOBAL
+# else
+#  define LT_DLGLOBAL 0
+# endif
+#endif
+
+/* We may have to define LT_DLLAZY_OR_NOW in the command line if we
+   find out it does not work in some platform. */
+#ifndef LT_DLLAZY_OR_NOW
+# ifdef RTLD_LAZY
+#  define LT_DLLAZY_OR_NOW RTLD_LAZY
+# else
+#  ifdef DL_LAZY
+#   define LT_DLLAZY_OR_NOW DL_LAZY
+#  else
+#   ifdef RTLD_NOW
+#    define LT_DLLAZY_OR_NOW RTLD_NOW
+#   else
+#    ifdef DL_NOW
+#     define LT_DLLAZY_OR_NOW DL_NOW
+#    else
+#     define LT_DLLAZY_OR_NOW 0
+#    endif
+#   endif
+#  endif
+# endif
+#endif
+
+/* When -fvisibility=hidden is used, assume the code has been annotated
+   correspondingly for the symbols needed. */
+#if defined __GNUC__ && (((__GNUC__ == 3) && (__GNUC_MINOR__ >= 3)) || (__GNUC__ > 3))
+int fnord () __attribute__((visibility("default")));
+#endif
+
+int fnord () { return 42; }
+int main ()
+{
+  void *self = dlopen (0, LT_DLGLOBAL|LT_DLLAZY_OR_NOW);
+  int status = $lt_dlunknown;
+
+  if (self)
+    {
+      if (dlsym (self,"fnord")) status = $lt_dlno_uscore;
+      else
+        {
+	  if (dlsym( self,"_fnord")) status = $lt_dlneed_uscore;
+          else puts (dlerror ());
+	}
+      /* dlclose (self); */
+    }
+  else
+    puts (dlerror ());
+
+  return status;
+}
+_LT_EOF
+  if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_link\""; } >&5
+  (eval $ac_link) 2>&5
+  ac_status=$?
+  $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5
+  test $ac_status = 0; } && test -s "conftest$ac_exeext" 2>/dev/null; then
+    (./conftest; exit; ) >&5 2>/dev/null
+    lt_status=$?
+ case x$lt_status in + x$lt_dlno_uscore) lt_cv_dlopen_self_static=yes ;; + x$lt_dlneed_uscore) lt_cv_dlopen_self_static=yes ;; + x$lt_dlunknown|x*) lt_cv_dlopen_self_static=no ;; + esac + else : + # compilation failed + lt_cv_dlopen_self_static=no + fi +fi +rm -fr conftest* + + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $lt_cv_dlopen_self_static" >&5 +$as_echo "$lt_cv_dlopen_self_static" >&6; } + fi + + CPPFLAGS=$save_CPPFLAGS + LDFLAGS=$save_LDFLAGS + LIBS=$save_LIBS + ;; + esac + + case $lt_cv_dlopen_self in + yes|no) enable_dlopen_self=$lt_cv_dlopen_self ;; + *) enable_dlopen_self=unknown ;; + esac + + case $lt_cv_dlopen_self_static in + yes|no) enable_dlopen_self_static=$lt_cv_dlopen_self_static ;; + *) enable_dlopen_self_static=unknown ;; + esac +fi + + + + + + + + + + + + + + + + + +striplib= +old_striplib= +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether stripping libraries is possible" >&5 +$as_echo_n "checking whether stripping libraries is possible... " >&6; } +if test -n "$STRIP" && $STRIP -V 2>&1 | $GREP "GNU strip" >/dev/null; then + test -z "$old_striplib" && old_striplib="$STRIP --strip-debug" + test -z "$striplib" && striplib="$STRIP --strip-unneeded" + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } +else +# FIXME - insert some real tests, host_os isn't really good enough + case $host_os in + darwin*) + if test -n "$STRIP"; then + striplib="$STRIP -x" + old_striplib="$STRIP -S" + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } + else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + fi + ;; + *) + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + ;; + esac +fi + + + + + + + + + + + + + # Report what library types will actually be built + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if libtool supports shared libraries" >&5 +$as_echo_n "checking if libtool supports shared libraries... " >&6; } + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $can_build_shared" >&5 +$as_echo "$can_build_shared" >&6; } + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether to build shared libraries" >&5 +$as_echo_n "checking whether to build shared libraries... " >&6; } + test no = "$can_build_shared" && enable_shared=no + + # On AIX, shared libraries and static libraries use the same namespace, and + # are all built from PIC. + case $host_os in + aix3*) + test yes = "$enable_shared" && enable_static=no + if test -n "$RANLIB"; then + archive_cmds="$archive_cmds~\$RANLIB \$lib" + postinstall_cmds='$RANLIB $lib' + fi + ;; + + aix[4-9]*) + if test ia64 != "$host_cpu"; then + case $enable_shared,$with_aix_soname,$aix_use_runtimelinking in + yes,aix,yes) ;; # shared object as lib.so file only + yes,svr4,*) ;; # shared object as lib.so archive member only + yes,*) enable_static=no ;; # shared object in lib.a archive as well + esac + fi + ;; + esac + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $enable_shared" >&5 +$as_echo "$enable_shared" >&6; } + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether to build static libraries" >&5 +$as_echo_n "checking whether to build static libraries... " >&6; } + # Make sure either enable_shared or enable_static is yes. 
+ test yes = "$enable_shared" || enable_static=yes + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $enable_static" >&5 +$as_echo "$enable_static" >&6; } + + + + +fi +ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + +CC=$lt_save_CC + + + + + + + + + + + + + + + + ac_config_commands="$ac_config_commands libtool" + + + + +# Only expand once: + + + + + + + + + # $is_release = (.git directory does not exist) + if test -d .git; then : + ax_is_release=no +else + ax_is_release=yes +fi + + + + + # C support is enabled by default. + + + # Only enable C++ support if AC_PROG_CXX is called. The redefinition of + # AC_PROG_CXX is so that a fatal error is emitted if this macro is called + # before AC_PROG_CXX, which would otherwise cause no C++ warnings to be + # checked. + + + + + # Default value for IS-RELEASE is $ax_is_release + ax_compiler_flags_is_release=$ax_is_release + + # Check whether --enable-compile-warnings was given. +if test "${enable_compile_warnings+set}" = set; then : + enableval=$enable_compile_warnings; +else + if test "$ax_compiler_flags_is_release" = "yes"; then : + enable_compile_warnings="yes" +else + enable_compile_warnings="error" +fi +fi + + # Check whether --enable-Werror was given. +if test "${enable_Werror+set}" = set; then : + enableval=$enable_Werror; +else + enable_Werror=maybe +fi + + + # Return the user's chosen warning level + if test "$enable_Werror" = "no" -a \ + "$enable_compile_warnings" = "error"; then : + + enable_compile_warnings="yes" + +fi + + ax_enable_compile_warnings=$enable_compile_warnings + + + + + + + + + + + # Variable names + + + ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + + # Always pass -Werror=unknown-warning-option to get Clang to fail on bad + # flags, otherwise they are always appended to the warn_cflags variable, and + # Clang warns on them for every compilation unit. + # If this is passed to GCC, it will explode, so the flag must be enabled + # conditionally. + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts -Werror=unknown-warning-option" >&5 +$as_echo_n "checking whether C compiler accepts -Werror=unknown-warning-option... " >&6; } +if ${ax_cv_check_cflags___Werror_unknown_warning_option+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$CFLAGS + CFLAGS="$CFLAGS -Werror=unknown-warning-option" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ax_cv_check_cflags___Werror_unknown_warning_option=yes +else + ax_cv_check_cflags___Werror_unknown_warning_option=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + CFLAGS=$ax_check_save_flags +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ax_cv_check_cflags___Werror_unknown_warning_option" >&5 +$as_echo "$ax_cv_check_cflags___Werror_unknown_warning_option" >&6; } +if test "x$ax_cv_check_cflags___Werror_unknown_warning_option" = xyes; then : + + ax_compiler_flags_test="-Werror=unknown-warning-option" + +else + + ax_compiler_flags_test="" + +fi + + + # Base flags + + + + +for flag in -fno-strict-aliasing ; do + as_CACHEVAR=`$as_echo "ax_cv_check_cflags_$ax_compiler_flags_test_$flag" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 +$as_echo_n "checking whether C compiler accepts $flag... " >&6; } +if eval \${$as_CACHEVAR+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$CFLAGS + CFLAGS="$CFLAGS $ax_compiler_flags_test $flag" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + eval "$as_CACHEVAR=yes" +else + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + CFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes"; then : + +if ${WARN_CFLAGS+:} false; then : + + case " $WARN_CFLAGS " in #( + *" $flag "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS already contains \$flag"; } >&5 + (: WARN_CFLAGS already contains $flag) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_CFLAGS " $flag" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_CFLAGS=$flag + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + +else + : +fi + +done + + + if test "$ax_enable_compile_warnings" != "no"; then : + + # "yes" flags + + + + +for flag in -Wall -Wextra -Wundef -Wnested-externs -Wwrite-strings -Wpointer-arith -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wredundant-decls -Wno-unused-parameter -Wno-missing-field-initializers -Wdeclaration-after-statement -Wformat=2 -Wold-style-definition -Wcast-align -Wformat-nonliteral -Wformat-security -Wsign-compare -Wstrict-aliasing -Wshadow -Winline -Wpacked -Wmissing-format-attribute -Wmissing-noreturn -Winit-self -Wredundant-decls -Wmissing-include-dirs -Wunused-but-set-variable -Warray-bounds -Wimplicit-function-declaration -Wreturn-type -Wswitch-enum -Wswitch-default ; do + as_CACHEVAR=`$as_echo "ax_cv_check_cflags_$ax_compiler_flags_test_$flag" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 +$as_echo_n "checking whether C compiler accepts $flag... 
" >&6; } +if eval \${$as_CACHEVAR+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$CFLAGS + CFLAGS="$CFLAGS $ax_compiler_flags_test $flag" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + eval "$as_CACHEVAR=yes" +else + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + CFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes"; then : + +if ${WARN_CFLAGS+:} false; then : + + case " $WARN_CFLAGS " in #( + *" $flag "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS already contains \$flag"; } >&5 + (: WARN_CFLAGS already contains $flag) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_CFLAGS " $flag" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_CFLAGS=$flag + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + +else + : +fi + +done + + +fi + if test "$ax_enable_compile_warnings" = "error"; then : + + # "error" flags; -Werror has to be appended unconditionally because + # it's not possible to test for + # + # suggest-attribute=format is disabled because it gives too many false + # positives + +if ${WARN_CFLAGS+:} false; then : + + case " $WARN_CFLAGS " in #( + *" -Werror "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS already contains -Werror"; } >&5 + (: WARN_CFLAGS already contains -Werror) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_CFLAGS " -Werror" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_CFLAGS=-Werror + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + + + + + + +for flag in -Wno-suggest-attribute=format ; do + as_CACHEVAR=`$as_echo "ax_cv_check_cflags_$ax_compiler_flags_test_$flag" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 +$as_echo_n "checking whether C compiler accepts $flag... " >&6; } +if eval \${$as_CACHEVAR+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$CFLAGS + CFLAGS="$CFLAGS $ax_compiler_flags_test $flag" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + eval "$as_CACHEVAR=yes" +else + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + CFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes"; then : + +if ${WARN_CFLAGS+:} false; then : + + case " $WARN_CFLAGS " in #( + *" $flag "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS already contains \$flag"; } >&5 + (: WARN_CFLAGS already contains $flag) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_CFLAGS " $flag" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_CFLAGS=$flag + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + +else + : +fi + +done + + +fi + + # In the flags below, when disabling specific flags, always add *both* + # -Wno-foo and -Wno-error=foo. This fixes the situation where (for example) + # we enable -Werror, disable a flag, and a build bot passes CFLAGS=-Wall, + # which effectively turns that flag back on again as an error. + for flag in $WARN_CFLAGS; do + case $flag in #( + -Wno-*=*) : + ;; #( + -Wno-*) : + + + + + +for flag in -Wno-error=$($as_echo $flag | $SED 's/^-Wno-//'); do + as_CACHEVAR=`$as_echo "ax_cv_check_cflags_$ax_compiler_flags_test_$flag" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 +$as_echo_n "checking whether C compiler accepts $flag... " >&6; } +if eval \${$as_CACHEVAR+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$CFLAGS + CFLAGS="$CFLAGS $ax_compiler_flags_test $flag" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + eval "$as_CACHEVAR=yes" +else + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + CFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes"; then : + +if ${WARN_CFLAGS+:} false; then : + + case " $WARN_CFLAGS " in #( + *" $flag "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS already contains \$flag"; } >&5 + (: WARN_CFLAGS already contains $flag) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_CFLAGS " $flag" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_CFLAGS=$flag + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_CFLAGS=\"\$WARN_CFLAGS\""; } >&5 + (: WARN_CFLAGS="$WARN_CFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5 + test $ac_status = 0; } + +fi + +else + : +fi + +done + + ;; #( + *) : + ;; +esac + done + + ac_ext=c +ac_cpp='$CPP $CPPFLAGS' +ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5' +ac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5' +ac_compiler_gnu=$ac_cv_c_compiler_gnu + + + # Substitute the variables + + + + + + + + + + + + # Variable names + + + # Always pass -Werror=unknown-warning-option to get Clang to fail on bad + # flags, otherwise they are always appended to the warn_ldflags variable, + # and Clang warns on them for every compilation unit. + # If this is passed to GCC, it will explode, so the flag must be enabled + # conditionally. + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts -Werror=unknown-warning-option" >&5 +$as_echo_n "checking whether C compiler accepts -Werror=unknown-warning-option... " >&6; } +if ${ax_cv_check_cflags___Werror_unknown_warning_option+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$CFLAGS + CFLAGS="$CFLAGS -Werror=unknown-warning-option" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ax_cv_check_cflags___Werror_unknown_warning_option=yes +else + ax_cv_check_cflags___Werror_unknown_warning_option=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + CFLAGS=$ax_check_save_flags +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ax_cv_check_cflags___Werror_unknown_warning_option" >&5 +$as_echo "$ax_cv_check_cflags___Werror_unknown_warning_option" >&6; } +if test "x$ax_cv_check_cflags___Werror_unknown_warning_option" = xyes; then : + + ax_compiler_flags_test="-Werror=unknown-warning-option" + +else + + ax_compiler_flags_test="" + +fi + + + # Base flags + + + + +for flag in -Wl,--no-as-needed ; do + as_CACHEVAR=`$as_echo "ax_cv_check_ldflags_$ax_compiler_flags_test_$flag" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether the linker accepts $flag" >&5 +$as_echo_n "checking whether the linker accepts $flag... " >&6; } +if eval \${$as_CACHEVAR+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$LDFLAGS + LDFLAGS="$LDFLAGS $ax_compiler_flags_test $flag" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + eval "$as_CACHEVAR=yes" +else + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + LDFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes"; then : + +if ${WARN_LDFLAGS+:} false; then : + + case " $WARN_LDFLAGS " in #( + *" $flag "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS already contains \$flag"; } >&5 + (: WARN_LDFLAGS already contains $flag) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_LDFLAGS " $flag" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS=\"\$WARN_LDFLAGS\""; } >&5 + (: WARN_LDFLAGS="$WARN_LDFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_LDFLAGS=$flag + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS=\"\$WARN_LDFLAGS\""; } >&5 + (: WARN_LDFLAGS="$WARN_LDFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + +else + : +fi + +done + + + if test "$ax_enable_compile_warnings" != "no"; then : + + # "yes" flags + + + + +for flag in ; do + as_CACHEVAR=`$as_echo "ax_cv_check_ldflags_$ax_compiler_flags_test_$flag" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether the linker accepts $flag" >&5 +$as_echo_n "checking whether the linker accepts $flag... " >&6; } +if eval \${$as_CACHEVAR+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$LDFLAGS + LDFLAGS="$LDFLAGS $ax_compiler_flags_test $flag" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + eval "$as_CACHEVAR=yes" +else + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + LDFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes"; then : + +if ${WARN_LDFLAGS+:} false; then : + + case " $WARN_LDFLAGS " in #( + *" $flag "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS already contains \$flag"; } >&5 + (: WARN_LDFLAGS already contains $flag) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_LDFLAGS " $flag" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS=\"\$WARN_LDFLAGS\""; } >&5 + (: WARN_LDFLAGS="$WARN_LDFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_LDFLAGS=$flag + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS=\"\$WARN_LDFLAGS\""; } >&5 + (: WARN_LDFLAGS="$WARN_LDFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + +else + : +fi + +done + + +fi + if test "$ax_enable_compile_warnings" = "error"; then : + + # "error" flags; -Werror has to be appended unconditionally because + # it's not possible to test for + # + # suggest-attribute=format is disabled because it gives too many false + # positives + + + + +for flag in -Wl,--fatal-warnings ; do + as_CACHEVAR=`$as_echo "ax_cv_check_ldflags_$ax_compiler_flags_test_$flag" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether the linker accepts $flag" >&5 +$as_echo_n "checking whether the linker accepts $flag... " >&6; } +if eval \${$as_CACHEVAR+:} false; then : + $as_echo_n "(cached) " >&6 +else + + ax_check_save_flags=$LDFLAGS + LDFLAGS="$LDFLAGS $ax_compiler_flags_test $flag" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/ + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + eval "$as_CACHEVAR=yes" +else + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + LDFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes"; then : + +if ${WARN_LDFLAGS+:} false; then : + + case " $WARN_LDFLAGS " in #( + *" $flag "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS already contains \$flag"; } >&5 + (: WARN_LDFLAGS already contains $flag) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_LDFLAGS " $flag" + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS=\"\$WARN_LDFLAGS\""; } >&5 + (: WARN_LDFLAGS="$WARN_LDFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_LDFLAGS=$flag + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_LDFLAGS=\"\$WARN_LDFLAGS\""; } >&5 + (: WARN_LDFLAGS="$WARN_LDFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + +else + : +fi + +done + + +fi + + # Substitute the variables + + + + + + + # Variable names + + + # Base flags + +if ${WARN_SCANNERFLAGS+:} false; then : + + case " $WARN_SCANNERFLAGS " in #( + *" "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS already contains "; } >&5 + (: WARN_SCANNERFLAGS already contains ) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_SCANNERFLAGS " " + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS=\"\$WARN_SCANNERFLAGS\""; } >&5 + (: WARN_SCANNERFLAGS="$WARN_SCANNERFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_SCANNERFLAGS= + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS=\"\$WARN_SCANNERFLAGS\""; } >&5 + (: WARN_SCANNERFLAGS="$WARN_SCANNERFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + + + if test "$ax_enable_compile_warnings" != "no"; then : + + # "yes" flags + +if ${WARN_SCANNERFLAGS+:} false; then : + + case " $WARN_SCANNERFLAGS " in #( + *" --warn-all "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS already contains --warn-all "; } >&5 + (: WARN_SCANNERFLAGS already contains --warn-all ) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_SCANNERFLAGS " --warn-all " + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS=\"\$WARN_SCANNERFLAGS\""; } >&5 + (: WARN_SCANNERFLAGS="$WARN_SCANNERFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_SCANNERFLAGS= --warn-all + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS=\"\$WARN_SCANNERFLAGS\""; } >&5 + (: WARN_SCANNERFLAGS="$WARN_SCANNERFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5 + test $ac_status = 0; } + +fi + + +fi + if test "$ax_enable_compile_warnings" = "error"; then : + + # "error" flags + +if ${WARN_SCANNERFLAGS+:} false; then : + + case " $WARN_SCANNERFLAGS " in #( + *" --warn-error "*) : + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS already contains --warn-error "; } >&5 + (: WARN_SCANNERFLAGS already contains --warn-error ) 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append WARN_SCANNERFLAGS " --warn-error " + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS=\"\$WARN_SCANNERFLAGS\""; } >&5 + (: WARN_SCANNERFLAGS="$WARN_SCANNERFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else + + WARN_SCANNERFLAGS= --warn-error + { { $as_echo "$as_me:${as_lineno-$LINENO}: : WARN_SCANNERFLAGS=\"\$WARN_SCANNERFLAGS\""; } >&5 + (: WARN_SCANNERFLAGS="$WARN_SCANNERFLAGS") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + + +fi + + # Substitute the variables + + + + + + + + +# Checks for libraries. +save_LIBS=$LIBS +LIBS= +as_ac_Search=`$as_echo "ac_cv_search_clock_getm4_defn(2.69)" | $as_tr_sh` +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for library containing clock_getm4_defn(2.69)" >&5 +$as_echo_n "checking for library containing clock_getm4_defn(2.69)... " >&6; } +if eval \${$as_ac_Search+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_func_search_save_LIBS=$LIBS +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +#ifdef __cplusplus +extern "C" +#endif +char clock_getm4_defn(2.69) (); +int +main () +{ +return clock_getm4_defn(2.69) (); + ; + return 0; +} +_ACEOF +for ac_lib in '' 2.68time; do + if test -z "$ac_lib"; then + ac_res="none required" + else + ac_res=-l$ac_lib + LIBS="-l$ac_lib $ac_func_search_save_LIBS" + fi + if ac_fn_c_try_link "$LINENO"; then : + eval "$as_ac_Search=\$ac_res" +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext + if eval \${$as_ac_Search+:} false; then : + break +fi +done +if eval \${$as_ac_Search+:} false; then : + +else + eval "$as_ac_Search=no" +fi +rm conftest.$ac_ext +LIBS=$ac_func_search_save_LIBS +fi +eval ac_res=\$$as_ac_Search + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +$as_echo "$ac_res" >&6; } +eval ac_res=\$$as_ac_Search +if test "$ac_res" != no; then : + test "$ac_res" = "none required" || LIBS="$ac_res $LIBS" + rt +fi + +LIBS=$save_LIBS + +# Checks for header files. +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for ANSI C header files" >&5 +$as_echo_n "checking for ANSI C header files... " >&6; } +if ${ac_cv_header_stdc+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include +#include +#include +#include + +int +main () +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_header_stdc=yes +else + ac_cv_header_stdc=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + +if test $ac_cv_header_stdc = yes; then + # SunOS 4.x string.h does not declare mem*, contrary to ANSI. + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/ +#include + +_ACEOF +if (eval "$ac_cpp conftest.$ac_ext") 2>&5 | + $EGREP "memchr" >/dev/null 2>&1; then : + +else + ac_cv_header_stdc=no +fi +rm -f conftest* + +fi + +if test $ac_cv_header_stdc = yes; then + # ISC 2.0.2 stdlib.h does not declare free, contrary to ANSI. + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include + +_ACEOF +if (eval "$ac_cpp conftest.$ac_ext") 2>&5 | + $EGREP "free" >/dev/null 2>&1; then : + +else + ac_cv_header_stdc=no +fi +rm -f conftest* + +fi + +if test $ac_cv_header_stdc = yes; then + # /bin/cc in Irix-4.0.5 gets non-ANSI ctype macros unless using -ansi. + if test "$cross_compiling" = yes; then : + : +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +#include +#include +#if ((' ' & 0x0FF) == 0x020) +# define ISLOWER(c) ('a' <= (c) && (c) <= 'z') +# define TOUPPER(c) (ISLOWER(c) ? 'A' + ((c) - 'a') : (c)) +#else +# define ISLOWER(c) \ + (('a' <= (c) && (c) <= 'i') \ + || ('j' <= (c) && (c) <= 'r') \ + || ('s' <= (c) && (c) <= 'z')) +# define TOUPPER(c) (ISLOWER(c) ? ((c) | 0x40) : (c)) +#endif + +#define XOR(e, f) (((e) && !(f)) || (!(e) && (f))) +int +main () +{ + int i; + for (i = 0; i < 256; i++) + if (XOR (islower (i), ISLOWER (i)) + || toupper (i) != TOUPPER (i)) + return 2; + return 0; +} +_ACEOF +if ac_fn_c_try_run "$LINENO"; then : + +else + ac_cv_header_stdc=no +fi +rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ + conftest.$ac_objext conftest.beam conftest.$ac_ext +fi + +fi +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_header_stdc" >&5 +$as_echo "$ac_cv_header_stdc" >&6; } +if test $ac_cv_header_stdc = yes; then + +$as_echo "#define STDC_HEADERS 1" >>confdefs.h + +fi + +#AC_CHECK_HEADERS([]) + +# Checks for typedefs, structures, and compiler characteristics. +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for an ANSI C-conforming const" >&5 +$as_echo_n "checking for an ANSI C-conforming const... " >&6; } +if ${ac_cv_c_const+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main () +{ + +#ifndef __cplusplus + /* Ultrix mips cc rejects this sort of thing. */ + typedef int charset[2]; + const charset cs = { 0, 0 }; + /* SunOS 4.1.1 cc rejects this. */ + char const *const *pcpcc; + char **ppc; + /* NEC SVR4.0.2 mips cc rejects this. */ + struct point {int x, y;}; + static struct point const zero = {0,0}; + /* AIX XL C 1.02.0.0 rejects this. + It does not let you subtract one const X* pointer from another in + an arm of an if-expression whose if-part is not a constant + expression */ + const char *g = "string"; + pcpcc = &g + (g ? g-g : 0); + /* HPUX 7.0 cc rejects these. */ + ++pcpcc; + ppc = (char**) pcpcc; + pcpcc = (char const *const *) ppc; + { /* SCO 3.2v4 cc rejects this sort of thing. */ + char tx; + char *t = &tx; + char const *s = 0 ? (char *) 0 : (char const *) 0; + + *t++ = 0; + if (s) return 0; + } + { /* Someone thinks the Sun supposedly-ANSI compiler will reject this. */ + int x[] = {25, 17}; + const int *foo = &x[0]; + ++foo; + } + { /* Sun SC1.0 ANSI compiler rejects this -- but not the above. */ + typedef const int *iptr; + iptr p = 0; + ++p; + } + { /* AIX XL C 1.02.0.0 rejects this sort of thing, saying + "k.c", line 2.27: 1506-025 (S) Operand must be a modifiable lvalue. 
*/ + struct s { int j; const int *ap[3]; } bx; + struct s *b = &bx; b->j = 5; + } + { /* ULTRIX-32 V3.1 (Rev 9) vcc rejects this */ + const int foo = 10; + if (!foo) return 0; + } + return !cs[0] && !zero.x; +#endif + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_c_const=yes +else + ac_cv_c_const=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_const" >&5 +$as_echo "$ac_cv_c_const" >&6; } +if test $ac_cv_c_const = no; then + +$as_echo "#define const /**/" >>confdefs.h + +fi + +ac_fn_c_check_type "$LINENO" "size_t" "ac_cv_type_size_t" "$ac_includes_default" +if test "x$ac_cv_type_size_t" = xyes; then : + +else + +cat >>confdefs.h <<_ACEOF +#define size_t unsigned int +_ACEOF + +fi + +#AC_HEADER_TIME +#AC_STRUCT_TM + +# Checks for library functions. +for ac_header in sys/select.h sys/socket.h +do : + as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh` +ac_fn_c_check_header_mongrel "$LINENO" "$ac_header" "$as_ac_Header" "$ac_includes_default" +if eval test \"x\$"$as_ac_Header"\" = x"yes"; then : + cat >>confdefs.h <<_ACEOF +#define `$as_echo "HAVE_$ac_header" | $as_tr_cpp` 1 +_ACEOF + +fi + +done + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking types of arguments for select" >&5 +$as_echo_n "checking types of arguments for select... " >&6; } +if ${ac_cv_func_select_args+:} false; then : + $as_echo_n "(cached) " >&6 +else + for ac_arg234 in 'fd_set *' 'int *' 'void *'; do + for ac_arg1 in 'int' 'size_t' 'unsigned long int' 'unsigned int'; do + for ac_arg5 in 'struct timeval *' 'const struct timeval *'; do + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ +$ac_includes_default +#ifdef HAVE_SYS_SELECT_H +# include +#endif +#ifdef HAVE_SYS_SOCKET_H +# include +#endif + +int +main () +{ +extern int select ($ac_arg1, + $ac_arg234, $ac_arg234, $ac_arg234, + $ac_arg5); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO"; then : + ac_cv_func_select_args="$ac_arg1,$ac_arg234,$ac_arg5"; break 3 +fi +rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext + done + done +done +# Provide a safe default value. +: "${ac_cv_func_select_args=int,int *,struct timeval *}" + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_func_select_args" >&5 +$as_echo "$ac_cv_func_select_args" >&6; } +ac_save_IFS=$IFS; IFS=',' +set dummy `echo "$ac_cv_func_select_args" | sed 's/\*/\*/g'` +IFS=$ac_save_IFS +shift + +cat >>confdefs.h <<_ACEOF +#define SELECT_TYPE_ARG1 $1 +_ACEOF + + +cat >>confdefs.h <<_ACEOF +#define SELECT_TYPE_ARG234 ($2) +_ACEOF + + +cat >>confdefs.h <<_ACEOF +#define SELECT_TYPE_ARG5 ($3) +_ACEOF + +rm -f conftest* + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking return type of signal handlers" >&5 +$as_echo_n "checking return type of signal handlers... " >&6; } +if ${ac_cv_type_signal+:} false; then : + $as_echo_n "(cached) " >&6 +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. 
*/
+#include <sys/types.h>
+#include <signal.h>
+
+int
+main ()
+{
+return *(signal (0, 0)) (0) == 1;
+  ;
+  return 0;
+}
+_ACEOF
+if ac_fn_c_try_compile "$LINENO"; then :
+  ac_cv_type_signal=int
+else
+  ac_cv_type_signal=void
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_type_signal" >&5
+$as_echo "$ac_cv_type_signal" >&6; }
+
+cat >>confdefs.h <<_ACEOF
+#define RETSIGTYPE $ac_cv_type_signal
+_ACEOF
+
+
+ac_fn_c_check_decl "$LINENO" "strerror_r" "ac_cv_have_decl_strerror_r" "$ac_includes_default"
+if test "x$ac_cv_have_decl_strerror_r" = xyes; then :
+  ac_have_decl=1
+else
+  ac_have_decl=0
+fi
+
+cat >>confdefs.h <<_ACEOF
+#define HAVE_DECL_STRERROR_R $ac_have_decl
+_ACEOF
+
+for ac_func in strerror_r
+do :
+  ac_fn_c_check_func "$LINENO" "strerror_r" "ac_cv_func_strerror_r"
+if test "x$ac_cv_func_strerror_r" = xyes; then :
+  cat >>confdefs.h <<_ACEOF
+#define HAVE_STRERROR_R 1
+_ACEOF
+
+fi
+done
+
+{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether strerror_r returns char *" >&5
+$as_echo_n "checking whether strerror_r returns char *... " >&6; }
+if ${ac_cv_func_strerror_r_char_p+:} false; then :
+  $as_echo_n "(cached) " >&6
+else
+
+    ac_cv_func_strerror_r_char_p=no
+    if test $ac_cv_have_decl_strerror_r = yes; then
+      cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h.  */
+$ac_includes_default
+int
+main ()
+{
+
+	  char buf[100];
+	  char x = *strerror_r (0, buf, sizeof buf);
+	  char *p = strerror_r (0, buf, sizeof buf);
+	  return !p || x;
+
+  ;
+  return 0;
+}
+_ACEOF
+if ac_fn_c_try_compile "$LINENO"; then :
+  ac_cv_func_strerror_r_char_p=yes
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+    else
+      # strerror_r is not declared.  Choose between
+      # systems that have relatively inaccessible declarations for the
+      # function.  BeOS and DEC UNIX 4.0 fall in this category, but the
+      # former has a strerror_r that returns char*, while the latter
+      # has a strerror_r that returns `int'.
+      # This test should segfault on the DEC system.
+      if test "$cross_compiling" = yes; then :
+  :
+else
+  cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h.  */
+$ac_includes_default
+	extern char *strerror_r ();
+int
+main ()
+{
+char buf[100];
+	  char x = *strerror_r (0, buf, sizeof buf);
+	  return ! isalpha (x);
+  ;
+  return 0;
+}
+_ACEOF
+if ac_fn_c_try_run "$LINENO"; then :
+  ac_cv_func_strerror_r_char_p=yes
+fi
+rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \
+  conftest.$ac_objext conftest.beam conftest.$ac_ext
+fi
+
+    fi
+
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_func_strerror_r_char_p" >&5
+$as_echo "$ac_cv_func_strerror_r_char_p" >&6; }
+if test $ac_cv_func_strerror_r_char_p = yes; then
+
+$as_echo "#define STRERROR_R_CHAR_P 1" >>confdefs.h
+
+fi
+
+for ac_func in strdup strndup strtok_r
+do :
+  as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh`
+ac_fn_c_check_func "$LINENO" "$ac_func" "$as_ac_var"
+if eval test \"x\$"$as_ac_var"\" = x"yes"; then :
+  cat >>confdefs.h <<_ACEOF
+#define `$as_echo "HAVE_$ac_func" | $as_tr_cpp` 1
+_ACEOF
+
+fi
+done
+
+
+LIBLOGNORM_CFLAGS="-I\$(top_srcdir)/src"
+LIBLOGNORM_LIBS="\$(top_builddir)/src/liblognorm.la"
+
+
+
+# modules we require
+
+
+
+
+
+
+
+if test "x$ac_cv_env_PKG_CONFIG_set" != "xset"; then
+	if test -n "$ac_tool_prefix"; then
+  # Extract the first word of "${ac_tool_prefix}pkg-config", so it can be a program name with args.
+set dummy ${ac_tool_prefix}pkg-config; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_PKG_CONFIG+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $PKG_CONFIG in + [\\/]* | ?:[\\/]*) + ac_cv_path_PKG_CONFIG="$PKG_CONFIG" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_PKG_CONFIG="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + + ;; +esac +fi +PKG_CONFIG=$ac_cv_path_PKG_CONFIG +if test -n "$PKG_CONFIG"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $PKG_CONFIG" >&5 +$as_echo "$PKG_CONFIG" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + +fi +if test -z "$ac_cv_path_PKG_CONFIG"; then + ac_pt_PKG_CONFIG=$PKG_CONFIG + # Extract the first word of "pkg-config", so it can be a program name with args. +set dummy pkg-config; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_ac_pt_PKG_CONFIG+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $ac_pt_PKG_CONFIG in + [\\/]* | ?:[\\/]*) + ac_cv_path_ac_pt_PKG_CONFIG="$ac_pt_PKG_CONFIG" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_ac_pt_PKG_CONFIG="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + + ;; +esac +fi +ac_pt_PKG_CONFIG=$ac_cv_path_ac_pt_PKG_CONFIG +if test -n "$ac_pt_PKG_CONFIG"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_pt_PKG_CONFIG" >&5 +$as_echo "$ac_pt_PKG_CONFIG" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_pt_PKG_CONFIG" = x; then + PKG_CONFIG="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + PKG_CONFIG=$ac_pt_PKG_CONFIG + fi +else + PKG_CONFIG="$ac_cv_path_PKG_CONFIG" +fi + +fi +if test -n "$PKG_CONFIG"; then + _pkg_min_version=0.9.0 + { $as_echo "$as_me:${as_lineno-$LINENO}: checking pkg-config is at least version $_pkg_min_version" >&5 +$as_echo_n "checking pkg-config is at least version $_pkg_min_version... " >&6; } + if $PKG_CONFIG --atleast-pkgconfig-version $_pkg_min_version; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } + else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + PKG_CONFIG="" + fi +fi + +pkg_failed=no +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for LIBESTR" >&5 +$as_echo_n "checking for LIBESTR... 
" >&6; } + +if test -n "$LIBESTR_CFLAGS"; then + pkg_cv_LIBESTR_CFLAGS="$LIBESTR_CFLAGS" + elif test -n "$PKG_CONFIG"; then + if test -n "$PKG_CONFIG" && \ + { { $as_echo "$as_me:${as_lineno-$LINENO}: \$PKG_CONFIG --exists --print-errors \"libestr >= 0.0.0\""; } >&5 + ($PKG_CONFIG --exists --print-errors "libestr >= 0.0.0") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + pkg_cv_LIBESTR_CFLAGS=`$PKG_CONFIG --cflags "libestr >= 0.0.0" 2>/dev/null` + test "x$?" != "x0" && pkg_failed=yes +else + pkg_failed=yes +fi + else + pkg_failed=untried +fi +if test -n "$LIBESTR_LIBS"; then + pkg_cv_LIBESTR_LIBS="$LIBESTR_LIBS" + elif test -n "$PKG_CONFIG"; then + if test -n "$PKG_CONFIG" && \ + { { $as_echo "$as_me:${as_lineno-$LINENO}: \$PKG_CONFIG --exists --print-errors \"libestr >= 0.0.0\""; } >&5 + ($PKG_CONFIG --exists --print-errors "libestr >= 0.0.0") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + pkg_cv_LIBESTR_LIBS=`$PKG_CONFIG --libs "libestr >= 0.0.0" 2>/dev/null` + test "x$?" != "x0" && pkg_failed=yes +else + pkg_failed=yes +fi + else + pkg_failed=untried +fi + + + +if test $pkg_failed = yes; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + +if $PKG_CONFIG --atleast-pkgconfig-version 0.20; then + _pkg_short_errors_supported=yes +else + _pkg_short_errors_supported=no +fi + if test $_pkg_short_errors_supported = yes; then + LIBESTR_PKG_ERRORS=`$PKG_CONFIG --short-errors --print-errors --cflags --libs "libestr >= 0.0.0" 2>&1` + else + LIBESTR_PKG_ERRORS=`$PKG_CONFIG --print-errors --cflags --libs "libestr >= 0.0.0" 2>&1` + fi + # Put the nasty error message in config.log where it belongs + echo "$LIBESTR_PKG_ERRORS" >&5 + + as_fn_error $? "Package requirements (libestr >= 0.0.0) were not met: + +$LIBESTR_PKG_ERRORS + +Consider adjusting the PKG_CONFIG_PATH environment variable if you +installed software in a non-standard prefix. + +Alternatively, you may set the environment variables LIBESTR_CFLAGS +and LIBESTR_LIBS to avoid the need to call pkg-config. +See the pkg-config man page for more details." "$LINENO" 5 +elif test $pkg_failed = untried; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "The pkg-config script could not be found or is too old. Make sure it +is in your PATH or set the PKG_CONFIG environment variable to the full +path to pkg-config. + +Alternatively, you may set the environment variables LIBESTR_CFLAGS +and LIBESTR_LIBS to avoid the need to call pkg-config. +See the pkg-config man page for more details. + +To get pkg-config, see . +See \`config.log' for more details" "$LINENO" 5; } +else + LIBESTR_CFLAGS=$pkg_cv_LIBESTR_CFLAGS + LIBESTR_LIBS=$pkg_cv_LIBESTR_LIBS + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } + +fi + +pkg_failed=no +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for JSON_C" >&5 +$as_echo_n "checking for JSON_C... " >&6; } + +if test -n "$JSON_C_CFLAGS"; then + pkg_cv_JSON_C_CFLAGS="$JSON_C_CFLAGS" + elif test -n "$PKG_CONFIG"; then + if test -n "$PKG_CONFIG" && \ + { { $as_echo "$as_me:${as_lineno-$LINENO}: \$PKG_CONFIG --exists --print-errors \"libfastjson\""; } >&5 + ($PKG_CONFIG --exists --print-errors "libfastjson") 2>&5 + ac_status=$? 
+ $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + pkg_cv_JSON_C_CFLAGS=`$PKG_CONFIG --cflags "libfastjson" 2>/dev/null` + test "x$?" != "x0" && pkg_failed=yes +else + pkg_failed=yes +fi + else + pkg_failed=untried +fi +if test -n "$JSON_C_LIBS"; then + pkg_cv_JSON_C_LIBS="$JSON_C_LIBS" + elif test -n "$PKG_CONFIG"; then + if test -n "$PKG_CONFIG" && \ + { { $as_echo "$as_me:${as_lineno-$LINENO}: \$PKG_CONFIG --exists --print-errors \"libfastjson\""; } >&5 + ($PKG_CONFIG --exists --print-errors "libfastjson") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + pkg_cv_JSON_C_LIBS=`$PKG_CONFIG --libs "libfastjson" 2>/dev/null` + test "x$?" != "x0" && pkg_failed=yes +else + pkg_failed=yes +fi + else + pkg_failed=untried +fi + + + +if test $pkg_failed = yes; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + +if $PKG_CONFIG --atleast-pkgconfig-version 0.20; then + _pkg_short_errors_supported=yes +else + _pkg_short_errors_supported=no +fi + if test $_pkg_short_errors_supported = yes; then + JSON_C_PKG_ERRORS=`$PKG_CONFIG --short-errors --print-errors --cflags --libs "libfastjson" 2>&1` + else + JSON_C_PKG_ERRORS=`$PKG_CONFIG --print-errors --cflags --libs "libfastjson" 2>&1` + fi + # Put the nasty error message in config.log where it belongs + echo "$JSON_C_PKG_ERRORS" >&5 + + as_fn_error $? "Package requirements (libfastjson) were not met: + +$JSON_C_PKG_ERRORS + +Consider adjusting the PKG_CONFIG_PATH environment variable if you +installed software in a non-standard prefix. + +Alternatively, you may set the environment variables JSON_C_CFLAGS +and JSON_C_LIBS to avoid the need to call pkg-config. +See the pkg-config man page for more details." "$LINENO" 5 +elif test $pkg_failed = untried; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "The pkg-config script could not be found or is too old. Make sure it +is in your PATH or set the PKG_CONFIG environment variable to the full +path to pkg-config. + +Alternatively, you may set the environment variables JSON_C_CFLAGS +and JSON_C_LIBS to avoid the need to call pkg-config. +See the pkg-config man page for more details. + +To get pkg-config, see . +See \`config.log' for more details" "$LINENO" 5; } +else + JSON_C_CFLAGS=$pkg_cv_JSON_C_CFLAGS + JSON_C_LIBS=$pkg_cv_JSON_C_LIBS + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } + +fi +# add libestr flags to pkgconfig file for static linking +pkg_config_libs_private=$LIBESTR_LIBS + + +# Regular expressions +# Check whether --enable-regexp was given. +if test "${enable_regexp+set}" = set; then : + enableval=$enable_regexp; case "${enableval}" in + yes) enable_regexp="yes" ;; + no) enable_regexp="no" ;; + *) as_fn_error $? "bad value ${enableval} for --enable-regexp" "$LINENO" 5 ;; + esac +else + enable_regexp="no" + +fi + + if test x$enable_regexp = xyes; then + ENABLE_REGEXP_TRUE= + ENABLE_REGEXP_FALSE='#' +else + ENABLE_REGEXP_TRUE='#' + ENABLE_REGEXP_FALSE= +fi + +if test "$enable_regexp" = "yes"; then + +pkg_failed=no +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for PCRE" >&5 +$as_echo_n "checking for PCRE... 
" >&6; } + +if test -n "$PCRE_CFLAGS"; then + pkg_cv_PCRE_CFLAGS="$PCRE_CFLAGS" + elif test -n "$PKG_CONFIG"; then + if test -n "$PKG_CONFIG" && \ + { { $as_echo "$as_me:${as_lineno-$LINENO}: \$PKG_CONFIG --exists --print-errors \"libpcre\""; } >&5 + ($PKG_CONFIG --exists --print-errors "libpcre") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + pkg_cv_PCRE_CFLAGS=`$PKG_CONFIG --cflags "libpcre" 2>/dev/null` + test "x$?" != "x0" && pkg_failed=yes +else + pkg_failed=yes +fi + else + pkg_failed=untried +fi +if test -n "$PCRE_LIBS"; then + pkg_cv_PCRE_LIBS="$PCRE_LIBS" + elif test -n "$PKG_CONFIG"; then + if test -n "$PKG_CONFIG" && \ + { { $as_echo "$as_me:${as_lineno-$LINENO}: \$PKG_CONFIG --exists --print-errors \"libpcre\""; } >&5 + ($PKG_CONFIG --exists --print-errors "libpcre") 2>&5 + ac_status=$? + $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; }; then + pkg_cv_PCRE_LIBS=`$PKG_CONFIG --libs "libpcre" 2>/dev/null` + test "x$?" != "x0" && pkg_failed=yes +else + pkg_failed=yes +fi + else + pkg_failed=untried +fi + + + +if test $pkg_failed = yes; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + +if $PKG_CONFIG --atleast-pkgconfig-version 0.20; then + _pkg_short_errors_supported=yes +else + _pkg_short_errors_supported=no +fi + if test $_pkg_short_errors_supported = yes; then + PCRE_PKG_ERRORS=`$PKG_CONFIG --short-errors --print-errors --cflags --libs "libpcre" 2>&1` + else + PCRE_PKG_ERRORS=`$PKG_CONFIG --print-errors --cflags --libs "libpcre" 2>&1` + fi + # Put the nasty error message in config.log where it belongs + echo "$PCRE_PKG_ERRORS" >&5 + + as_fn_error $? "Package requirements (libpcre) were not met: + +$PCRE_PKG_ERRORS + +Consider adjusting the PKG_CONFIG_PATH environment variable if you +installed software in a non-standard prefix. + +Alternatively, you may set the environment variables PCRE_CFLAGS +and PCRE_LIBS to avoid the need to call pkg-config. +See the pkg-config man page for more details." "$LINENO" 5 +elif test $pkg_failed = untried; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } + { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "The pkg-config script could not be found or is too old. Make sure it +is in your PATH or set the PKG_CONFIG environment variable to the full +path to pkg-config. + +Alternatively, you may set the environment variables PCRE_CFLAGS +and PCRE_LIBS to avoid the need to call pkg-config. +See the pkg-config man page for more details. + +To get pkg-config, see . +See \`config.log' for more details" "$LINENO" 5; } +else + PCRE_CFLAGS=$pkg_cv_PCRE_CFLAGS + PCRE_LIBS=$pkg_cv_PCRE_LIBS + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } + +fi + +$as_echo "#define FEATURE_REGEXP 1" >>confdefs.h + + FEATURE_REGEXP=1 +else + FEATURE_REGEXP=0 +fi + + +# debug mode settings +# Check whether --enable-debug was given. +if test "${enable_debug+set}" = set; then : + enableval=$enable_debug; case "${enableval}" in + yes) enable_debug="yes" ;; + no) enable_debug="no" ;; + *) as_fn_error $? 
"bad value ${enableval} for --enable-debug" "$LINENO" 5 ;; + esac +else + enable_debug="no" + +fi + +if test "$enable_debug" = "yes"; then + +$as_echo "#define DEBUG 1" >>confdefs.h + +fi +if test "$enable_debug" = "no"; then + +$as_echo "#define NDEBUG 1" >>confdefs.h + +fi + +# advanced statistics +# Check whether --enable-advanced-stats was given. +if test "${enable_advanced_stats+set}" = set; then : + enableval=$enable_advanced_stats; case "${enableval}" in + yes) enable_advstats="yes" ;; + no) enable_advstats="no" ;; + *) as_fn_error $? "bad value ${enableval} for --enable-advstats" "$LINENO" 5 ;; + esac +else + enable_advstats="no" + +fi + +if test "$enable_advstats" = "yes"; then + +$as_echo "#define ADVANCED_STATS 1" >>confdefs.h + +fi + +# docs (html) build settings +# Check whether --enable-docs was given. +if test "${enable_docs+set}" = set; then : + enableval=$enable_docs; case "${enableval}" in + yes) enable_docs="yes" ;; + no) enable_docs="no" ;; + *) as_fn_error $? "bad value ${enableval} for --enable-docs" "$LINENO" 5 ;; + esac +else + enable_docs="no" + +fi + +if test "$enable_docs" = "yes"; then + for ac_prog in sphinx-build sphinx-build3 sphinx-build2 +do + # Extract the first word of "$ac_prog", so it can be a program name with args. +set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_prog_SPHINXBUILD+:} false; then : + $as_echo_n "(cached) " >&6 +else + if test -n "$SPHINXBUILD"; then + ac_cv_prog_SPHINXBUILD="$SPHINXBUILD" # Let the user override the test. +else +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_prog_SPHINXBUILD="$ac_prog" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + +fi +fi +SPHINXBUILD=$ac_cv_prog_SPHINXBUILD +if test -n "$SPHINXBUILD"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $SPHINXBUILD" >&5 +$as_echo "$SPHINXBUILD" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$SPHINXBUILD" && break +done +test -n "$SPHINXBUILD" || SPHINXBUILD="no" + + if test "$SPHINXBUILD" = "no"; then + as_fn_error $? "sphinx-build is required to build documentation, install it or try --disable-docs" "$LINENO" 5 + fi +fi + if test "$enable_docs" = "yes"; then + ENABLE_DOCS_TRUE= + ENABLE_DOCS_FALSE='#' +else + ENABLE_DOCS_TRUE='#' + ENABLE_DOCS_FALSE= +fi + + +# Check whether --enable-testbench was given. +if test "${enable_testbench+set}" = set; then : + enableval=$enable_testbench; case "${enableval}" in + yes) enable_testbench="yes" ;; + no) enable_testbench="no" ;; + *) as_fn_error $? "bad value ${enableval} for --enable-testbench" "$LINENO" 5 ;; + esac +else + enable_testbench=yes + +fi + + if test x$enable_testbench = xyes; then + ENABLE_TESTBENCH_TRUE= + ENABLE_TESTBENCH_FALSE='#' +else + ENABLE_TESTBENCH_TRUE='#' + ENABLE_TESTBENCH_FALSE= +fi + + +# Check whether --enable-valgrind was given. +if test "${enable_valgrind+set}" = set; then : + enableval=$enable_valgrind; case "${enableval}" in + yes) enable_valgrind="yes" ;; + no) enable_valgrind=="no" ;; + *) as_fn_error $? 
"bad value ${enableval} for --enable-valgrind" "$LINENO" 5 ;; + esac +else + enable_valgrind=no + +fi + + if test x$enable_valgrind = xyes; then + ENABLE_VALGRIND_TRUE= + ENABLE_VALGRIND_FALSE='#' +else + ENABLE_VALGRIND_TRUE='#' + ENABLE_VALGRIND_FALSE= +fi + +VALGRIND="$enable_valgrind" + + +# Check whether --enable-tools was given. +if test "${enable_tools+set}" = set; then : + enableval=$enable_tools; case "${enableval}" in + yes) enable_tools="yes" ;; + no) enable_tools="no" ;; + *) as_fn_error $? "bad value ${enableval} for --enable-tools" "$LINENO" 5 ;; + esac +else + enable_tools=yes + +fi + + if test x$enable_tools = xyes; then + ENABLE_TOOLS_TRUE= + ENABLE_TOOLS_FALSE='#' +else + ENABLE_TOOLS_TRUE='#' + ENABLE_TOOLS_FALSE= +fi + + +ac_config_files="$ac_config_files Makefile lognorm.pc compat/Makefile doc/Makefile src/Makefile src/lognorm-features.h tools/Makefile tests/Makefile tests/options.sh" + +cat >confcache <<\_ACEOF +# This file is a shell script that caches the results of configure +# tests run on this system so they can be shared between configure +# scripts and configure runs, see configure's option --config-cache. +# It is not useful on other systems. If it contains results you don't +# want to keep, you may remove or edit it. +# +# config.status only pays attention to the cache file if you give it +# the --recheck option to rerun configure. +# +# `ac_cv_env_foo' variables (set or unset) will be overridden when +# loading this file, other *unset* `ac_cv_foo' will be assigned the +# following values. + +_ACEOF + +# The following way of writing the cache mishandles newlines in values, +# but we know of no workaround that is simple, portable, and efficient. +# So, we kill variables containing newlines. +# Ultrix sh set writes to stderr and can't be redirected directly, +# and sets the high bit in the cache file unless we assign to the vars. +( + for ac_var in `(set) 2>&1 | sed -n 's/^\([a-zA-Z_][a-zA-Z0-9_]*\)=.*/\1/p'`; do + eval ac_val=\$$ac_var + case $ac_val in #( + *${as_nl}*) + case $ac_var in #( + *_cv_*) { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: cache variable $ac_var contains a newline" >&5 +$as_echo "$as_me: WARNING: cache variable $ac_var contains a newline" >&2;} ;; + esac + case $ac_var in #( + _ | IFS | as_nl) ;; #( + BASH_ARGV | BASH_SOURCE) eval $ac_var= ;; #( + *) { eval $ac_var=; unset $ac_var;} ;; + esac ;; + esac + done + + (set) 2>&1 | + case $as_nl`(ac_space=' '; set) 2>&1` in #( + *${as_nl}ac_space=\ *) + # `set' does not quote correctly, so add quotes: double-quote + # substitution turns \\\\ into \\, and sed turns \\ into \. + sed -n \ + "s/'/'\\\\''/g; + s/^\\([_$as_cr_alnum]*_cv_[_$as_cr_alnum]*\\)=\\(.*\\)/\\1='\\2'/p" + ;; #( + *) + # `set' quotes correctly as required by POSIX, so do not add quotes. + sed -n "/^[_$as_cr_alnum]*_cv_[_$as_cr_alnum]*=/p" + ;; + esac | + sort +) | + sed ' + /^ac_cv_env_/b end + t clear + :clear + s/^\([^=]*\)=\(.*[{}].*\)$/test "${\1+set}" = set || &/ + t end + s/^\([^=]*\)=\(.*\)$/\1=${\1=\2}/ + :end' >>confcache +if diff "$cache_file" confcache >/dev/null 2>&1; then :; else + if test -w "$cache_file"; then + if test "x$cache_file" != "x/dev/null"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: updating cache $cache_file" >&5 +$as_echo "$as_me: updating cache $cache_file" >&6;} + if test ! 
-f "$cache_file" || test -h "$cache_file"; then + cat confcache >"$cache_file" + else + case $cache_file in #( + */* | ?:*) + mv -f confcache "$cache_file"$$ && + mv -f "$cache_file"$$ "$cache_file" ;; #( + *) + mv -f confcache "$cache_file" ;; + esac + fi + fi + else + { $as_echo "$as_me:${as_lineno-$LINENO}: not updating unwritable cache $cache_file" >&5 +$as_echo "$as_me: not updating unwritable cache $cache_file" >&6;} + fi +fi +rm -f confcache + +test "x$prefix" = xNONE && prefix=$ac_default_prefix +# Let make expand exec_prefix. +test "x$exec_prefix" = xNONE && exec_prefix='${prefix}' + +DEFS=-DHAVE_CONFIG_H + +ac_libobjs= +ac_ltlibobjs= +U= +for ac_i in : $LIBOBJS; do test "x$ac_i" = x: && continue + # 1. Remove the extension, and $U if already installed. + ac_script='s/\$U\././;s/\.o$//;s/\.obj$//' + ac_i=`$as_echo "$ac_i" | sed "$ac_script"` + # 2. Prepend LIBOBJDIR. When used with automake>=1.10 LIBOBJDIR + # will be set to the directory where LIBOBJS objects are built. + as_fn_append ac_libobjs " \${LIBOBJDIR}$ac_i\$U.$ac_objext" + as_fn_append ac_ltlibobjs " \${LIBOBJDIR}$ac_i"'$U.lo' +done +LIBOBJS=$ac_libobjs + +LTLIBOBJS=$ac_ltlibobjs + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking that generated files are newer than configure" >&5 +$as_echo_n "checking that generated files are newer than configure... " >&6; } + if test -n "$am_sleep_pid"; then + # Hide warnings about reused PIDs. + wait $am_sleep_pid 2>/dev/null + fi + { $as_echo "$as_me:${as_lineno-$LINENO}: result: done" >&5 +$as_echo "done" >&6; } + if test -n "$EXEEXT"; then + am__EXEEXT_TRUE= + am__EXEEXT_FALSE='#' +else + am__EXEEXT_TRUE='#' + am__EXEEXT_FALSE= +fi + +if test -z "${AMDEP_TRUE}" && test -z "${AMDEP_FALSE}"; then + as_fn_error $? "conditional \"AMDEP\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi +if test -z "${am__fastdepCC_TRUE}" && test -z "${am__fastdepCC_FALSE}"; then + as_fn_error $? "conditional \"am__fastdepCC\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi +if test -z "${am__fastdepCC_TRUE}" && test -z "${am__fastdepCC_FALSE}"; then + as_fn_error $? "conditional \"am__fastdepCC\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi +if test -z "${ENABLE_REGEXP_TRUE}" && test -z "${ENABLE_REGEXP_FALSE}"; then + as_fn_error $? "conditional \"ENABLE_REGEXP\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi +if test -z "${ENABLE_DOCS_TRUE}" && test -z "${ENABLE_DOCS_FALSE}"; then + as_fn_error $? "conditional \"ENABLE_DOCS\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi +if test -z "${ENABLE_TESTBENCH_TRUE}" && test -z "${ENABLE_TESTBENCH_FALSE}"; then + as_fn_error $? "conditional \"ENABLE_TESTBENCH\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi +if test -z "${ENABLE_VALGRIND_TRUE}" && test -z "${ENABLE_VALGRIND_FALSE}"; then + as_fn_error $? "conditional \"ENABLE_VALGRIND\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi +if test -z "${ENABLE_TOOLS_TRUE}" && test -z "${ENABLE_TOOLS_FALSE}"; then + as_fn_error $? "conditional \"ENABLE_TOOLS\" was never defined. +Usually this means the macro was only invoked conditionally." 
"$LINENO" 5 +fi + +: "${CONFIG_STATUS=./config.status}" +ac_write_fail=0 +ac_clean_files_save=$ac_clean_files +ac_clean_files="$ac_clean_files $CONFIG_STATUS" +{ $as_echo "$as_me:${as_lineno-$LINENO}: creating $CONFIG_STATUS" >&5 +$as_echo "$as_me: creating $CONFIG_STATUS" >&6;} +as_write_fail=0 +cat >$CONFIG_STATUS <<_ASEOF || as_write_fail=1 +#! $SHELL +# Generated by $as_me. +# Run this file to recreate the current configuration. +# Compiler output produced by configure, useful for debugging +# configure, is in config.log if it exists. + +debug=false +ac_cs_recheck=false +ac_cs_silent=false + +SHELL=\${CONFIG_SHELL-$SHELL} +export SHELL +_ASEOF +cat >>$CONFIG_STATUS <<\_ASEOF || as_write_fail=1 +## -------------------- ## +## M4sh Initialization. ## +## -------------------- ## + +# Be more Bourne compatible +DUALCASE=1; export DUALCASE # for MKS sh +if test -n "${ZSH_VERSION+set}" && (emulate sh) >/dev/null 2>&1; then : + emulate sh + NULLCMD=: + # Pre-4.2 versions of Zsh do word splitting on ${1+"$@"}, which + # is contrary to our usage. Disable this feature. + alias -g '${1+"$@"}'='"$@"' + setopt NO_GLOB_SUBST +else + case `(set -o) 2>/dev/null` in #( + *posix*) : + set -o posix ;; #( + *) : + ;; +esac +fi + + +as_nl=' +' +export as_nl +# Printing a long string crashes Solaris 7 /usr/bin/printf. +as_echo='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\' +as_echo=$as_echo$as_echo$as_echo$as_echo$as_echo +as_echo=$as_echo$as_echo$as_echo$as_echo$as_echo$as_echo +# Prefer a ksh shell builtin over an external printf program on Solaris, +# but without wasting forks for bash or zsh. +if test -z "$BASH_VERSION$ZSH_VERSION" \ + && (test "X`print -r -- $as_echo`" = "X$as_echo") 2>/dev/null; then + as_echo='print -r --' + as_echo_n='print -rn --' +elif (test "X`printf %s $as_echo`" = "X$as_echo") 2>/dev/null; then + as_echo='printf %s\n' + as_echo_n='printf %s' +else + if test "X`(/usr/ucb/echo -n -n $as_echo) 2>/dev/null`" = "X-n $as_echo"; then + as_echo_body='eval /usr/ucb/echo -n "$1$as_nl"' + as_echo_n='/usr/ucb/echo -n' + else + as_echo_body='eval expr "X$1" : "X\\(.*\\)"' + as_echo_n_body='eval + arg=$1; + case $arg in #( + *"$as_nl"*) + expr "X$arg" : "X\\(.*\\)$as_nl"; + arg=`expr "X$arg" : ".*$as_nl\\(.*\\)"`;; + esac; + expr "X$arg" : "X\\(.*\\)" | tr -d "$as_nl" + ' + export as_echo_n_body + as_echo_n='sh -c $as_echo_n_body as_echo' + fi + export as_echo_body + as_echo='sh -c $as_echo_body as_echo' +fi + +# The user is always right. +if test "${PATH_SEPARATOR+set}" != set; then + PATH_SEPARATOR=: + (PATH='/bin;/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 && { + (PATH='/bin:/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 || + PATH_SEPARATOR=';' + } +fi + + +# IFS +# We need space, tab and new line, in precisely that order. Quoting is +# there to prevent editors from complaining about space-tab. +# (If _AS_PATH_WALK were called with IFS unset, it would disable word +# splitting by setting IFS to empty value.) +IFS=" "" $as_nl" + +# Find who we are. Look in the path if we contain no directory separator. +as_myself= +case $0 in #(( + *[\\/]* ) as_myself=$0 ;; + *) as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + test -r "$as_dir/$0" && as_myself=$as_dir/$0 && break + done +IFS=$as_save_IFS + + ;; +esac +# We did not find ourselves, most probably we were run as `sh COMMAND' +# in which case we are not to be found in the path. 
+if test "x$as_myself" = x; then + as_myself=$0 +fi +if test ! -f "$as_myself"; then + $as_echo "$as_myself: error: cannot find myself; rerun with an absolute file name" >&2 + exit 1 +fi + +# Unset variables that we do not need and which cause bugs (e.g. in +# pre-3.0 UWIN ksh). But do not cause bugs in bash 2.01; the "|| exit 1" +# suppresses any "Segmentation fault" message there. '((' could +# trigger a bug in pdksh 5.2.14. +for as_var in BASH_ENV ENV MAIL MAILPATH +do eval test x\${$as_var+set} = xset \ + && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || : +done +PS1='$ ' +PS2='> ' +PS4='+ ' + +# NLS nuisances. +LC_ALL=C +export LC_ALL +LANGUAGE=C +export LANGUAGE + +# CDPATH. +(unset CDPATH) >/dev/null 2>&1 && unset CDPATH + + +# as_fn_error STATUS ERROR [LINENO LOG_FD] +# ---------------------------------------- +# Output "`basename $0`: error: ERROR" to stderr. If LINENO and LOG_FD are +# provided, also output the error to LOG_FD, referencing LINENO. Then exit the +# script with STATUS, using 1 if that was 0. +as_fn_error () +{ + as_status=$1; test $as_status -eq 0 && as_status=1 + if test "$4"; then + as_lineno=${as_lineno-"$3"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + $as_echo "$as_me:${as_lineno-$LINENO}: error: $2" >&$4 + fi + $as_echo "$as_me: error: $2" >&2 + as_fn_exit $as_status +} # as_fn_error + + +# as_fn_set_status STATUS +# ----------------------- +# Set $? to STATUS, without forking. +as_fn_set_status () +{ + return $1 +} # as_fn_set_status + +# as_fn_exit STATUS +# ----------------- +# Exit the shell with STATUS, even in a "trap 0" or "set -e" context. +as_fn_exit () +{ + set +e + as_fn_set_status $1 + exit $1 +} # as_fn_exit + +# as_fn_unset VAR +# --------------- +# Portably unset VAR. +as_fn_unset () +{ + { eval $1=; unset $1;} +} +as_unset=as_fn_unset +# as_fn_append VAR VALUE +# ---------------------- +# Append the text in VALUE to the end of the definition contained in VAR. Take +# advantage of any shell optimizations that allow amortized linear growth over +# repeated appends, instead of the typical quadratic growth present in naive +# implementations. +if (eval "as_var=1; as_var+=2; test x\$as_var = x12") 2>/dev/null; then : + eval 'as_fn_append () + { + eval $1+=\$2 + }' +else + as_fn_append () + { + eval $1=\$$1\$2 + } +fi # as_fn_append + +# as_fn_arith ARG... +# ------------------ +# Perform arithmetic evaluation on the ARGs, and store the result in the +# global $as_val. Take advantage of shells that can avoid forks. The arguments +# must be portable across $(()) and expr. +if (eval "test \$(( 1 + 1 )) = 2") 2>/dev/null; then : + eval 'as_fn_arith () + { + as_val=$(( $* )) + }' +else + as_fn_arith () + { + as_val=`expr "$@" || test $? -eq 1` + } +fi # as_fn_arith + + +if expr a : '\(a\)' >/dev/null 2>&1 && + test "X`expr 00001 : '.*\(...\)'`" = X001; then + as_expr=expr +else + as_expr=false +fi + +if (basename -- /) >/dev/null 2>&1 && test "X`basename -- / 2>&1`" = "X/"; then + as_basename=basename +else + as_basename=false +fi + +if (as_dir=`dirname -- /` && test "X$as_dir" = X/) >/dev/null 2>&1; then + as_dirname=dirname +else + as_dirname=false +fi + +as_me=`$as_basename -- "$0" || +$as_expr X/"$0" : '.*/\([^/][^/]*\)/*$' \| \ + X"$0" : 'X\(//\)$' \| \ + X"$0" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X/"$0" | + sed '/^.*\/\([^/][^/]*\)\/*$/{ + s//\1/ + q + } + /^X\/\(\/\/\)$/{ + s//\1/ + q + } + /^X\/\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + +# Avoid depending upon Character Ranges. 
+as_cr_letters='abcdefghijklmnopqrstuvwxyz' +as_cr_LETTERS='ABCDEFGHIJKLMNOPQRSTUVWXYZ' +as_cr_Letters=$as_cr_letters$as_cr_LETTERS +as_cr_digits='0123456789' +as_cr_alnum=$as_cr_Letters$as_cr_digits + +ECHO_C= ECHO_N= ECHO_T= +case `echo -n x` in #((((( +-n*) + case `echo 'xy\c'` in + *c*) ECHO_T=' ';; # ECHO_T is single tab character. + xy) ECHO_C='\c';; + *) echo `echo ksh88 bug on AIX 6.1` > /dev/null + ECHO_T=' ';; + esac;; +*) + ECHO_N='-n';; +esac + +rm -f conf$$ conf$$.exe conf$$.file +if test -d conf$$.dir; then + rm -f conf$$.dir/conf$$.file +else + rm -f conf$$.dir + mkdir conf$$.dir 2>/dev/null +fi +if (echo >conf$$.file) 2>/dev/null; then + if ln -s conf$$.file conf$$ 2>/dev/null; then + as_ln_s='ln -s' + # ... but there are two gotchas: + # 1) On MSYS, both `ln -s file dir' and `ln file dir' fail. + # 2) DJGPP < 2.04 has no symlinks; `ln -s' creates a wrapper executable. + # In both cases, we have to default to `cp -pR'. + ln -s conf$$.file conf$$.dir 2>/dev/null && test ! -f conf$$.exe || + as_ln_s='cp -pR' + elif ln conf$$.file conf$$ 2>/dev/null; then + as_ln_s=ln + else + as_ln_s='cp -pR' + fi +else + as_ln_s='cp -pR' +fi +rm -f conf$$ conf$$.exe conf$$.dir/conf$$.file conf$$.file +rmdir conf$$.dir 2>/dev/null + + +# as_fn_mkdir_p +# ------------- +# Create "$as_dir" as a directory, including parents if necessary. +as_fn_mkdir_p () +{ + + case $as_dir in #( + -*) as_dir=./$as_dir;; + esac + test -d "$as_dir" || eval $as_mkdir_p || { + as_dirs= + while :; do + case $as_dir in #( + *\'*) as_qdir=`$as_echo "$as_dir" | sed "s/'/'\\\\\\\\''/g"`;; #'( + *) as_qdir=$as_dir;; + esac + as_dirs="'$as_qdir' $as_dirs" + as_dir=`$as_dirname -- "$as_dir" || +$as_expr X"$as_dir" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \ + X"$as_dir" : 'X\(//\)[^/]' \| \ + X"$as_dir" : 'X\(//\)$' \| \ + X"$as_dir" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X"$as_dir" | + sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{ + s//\1/ + q + } + /^X\(\/\/\)[^/].*/{ + s//\1/ + q + } + /^X\(\/\/\)$/{ + s//\1/ + q + } + /^X\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + test -d "$as_dir" && break + done + test -z "$as_dirs" || eval "mkdir $as_dirs" + } || test -d "$as_dir" || as_fn_error $? "cannot create directory $as_dir" + + +} # as_fn_mkdir_p +if mkdir -p . 2>/dev/null; then + as_mkdir_p='mkdir -p "$as_dir"' +else + test -d ./-p && rmdir ./-p + as_mkdir_p=false +fi + + +# as_fn_executable_p FILE +# ----------------------- +# Test if FILE is an executable regular file. +as_fn_executable_p () +{ + test -f "$1" && test -x "$1" +} # as_fn_executable_p +as_test_x='test -x' +as_executable_p=as_fn_executable_p + +# Sed expression to map a string onto a valid CPP name. +as_tr_cpp="eval sed 'y%*$as_cr_letters%P$as_cr_LETTERS%;s%[^_$as_cr_alnum]%_%g'" + +# Sed expression to map a string onto a valid variable name. +as_tr_sh="eval sed 'y%*+%pp%;s%[^_$as_cr_alnum]%_%g'" + + +exec 6>&1 +## ----------------------------------- ## +## Main body of $CONFIG_STATUS script. ## +## ----------------------------------- ## +_ASEOF +test $as_write_fail = 0 && chmod +x $CONFIG_STATUS || ac_write_fail=1 + +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +# Save the log message, to keep $0 and so on meaningful, and to +# report actual input values of CONFIG_FILES etc. instead of their +# values after options handling. +ac_log=" +This file was extended by liblognorm $as_me 2.0.5, which was +generated by GNU Autoconf 2.69. 
Invocation command line was + + CONFIG_FILES = $CONFIG_FILES + CONFIG_HEADERS = $CONFIG_HEADERS + CONFIG_LINKS = $CONFIG_LINKS + CONFIG_COMMANDS = $CONFIG_COMMANDS + $ $0 $@ + +on `(hostname || uname -n) 2>/dev/null | sed 1q` +" + +_ACEOF + +case $ac_config_files in *" +"*) set x $ac_config_files; shift; ac_config_files=$*;; +esac + +case $ac_config_headers in *" +"*) set x $ac_config_headers; shift; ac_config_headers=$*;; +esac + + +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 +# Files that config.status was made for. +config_files="$ac_config_files" +config_headers="$ac_config_headers" +config_commands="$ac_config_commands" + +_ACEOF + +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +ac_cs_usage="\ +\`$as_me' instantiates files and other configuration actions +from templates according to the current configuration. Unless the files +and actions are specified as TAGs, all are instantiated by default. + +Usage: $0 [OPTION]... [TAG]... + + -h, --help print this help, then exit + -V, --version print version number and configuration settings, then exit + --config print configuration, then exit + -q, --quiet, --silent + do not print progress messages + -d, --debug don't remove temporary files + --recheck update $as_me by reconfiguring in the same conditions + --file=FILE[:TEMPLATE] + instantiate the configuration file FILE + --header=FILE[:TEMPLATE] + instantiate the configuration header FILE + +Configuration files: +$config_files + +Configuration headers: +$config_headers + +Configuration commands: +$config_commands + +Report bugs to ." + +_ACEOF +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 +ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" +ac_cs_version="\\ +liblognorm config.status 2.0.5 +configured by $0, generated by GNU Autoconf 2.69, + with options \\"\$ac_cs_config\\" + +Copyright (C) 2012 Free Software Foundation, Inc. +This config.status script is free software; the Free Software Foundation +gives unlimited permission to copy, distribute and modify it." + +ac_pwd='$ac_pwd' +srcdir='$srcdir' +INSTALL='$INSTALL' +MKDIR_P='$MKDIR_P' +AWK='$AWK' +test -n "\$AWK" || AWK=awk +_ACEOF + +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +# The default lists apply if the user does not specify any file. +ac_need_defaults=: +while test $# != 0 +do + case $1 in + --*=?*) + ac_option=`expr "X$1" : 'X\([^=]*\)='` + ac_optarg=`expr "X$1" : 'X[^=]*=\(.*\)'` + ac_shift=: + ;; + --*=) + ac_option=`expr "X$1" : 'X\([^=]*\)='` + ac_optarg= + ac_shift=: + ;; + *) + ac_option=$1 + ac_optarg=$2 + ac_shift=shift + ;; + esac + + case $ac_option in + # Handling of the options. + -recheck | --recheck | --rechec | --reche | --rech | --rec | --re | --r) + ac_cs_recheck=: ;; + --version | --versio | --versi | --vers | --ver | --ve | --v | -V ) + $as_echo "$ac_cs_version"; exit ;; + --config | --confi | --conf | --con | --co | --c ) + $as_echo "$ac_cs_config"; exit ;; + --debug | --debu | --deb | --de | --d | -d ) + debug=: ;; + --file | --fil | --fi | --f ) + $ac_shift + case $ac_optarg in + *\'*) ac_optarg=`$as_echo "$ac_optarg" | sed "s/'/'\\\\\\\\''/g"` ;; + '') as_fn_error $? 
"missing file argument" ;; + esac + as_fn_append CONFIG_FILES " '$ac_optarg'" + ac_need_defaults=false;; + --header | --heade | --head | --hea ) + $ac_shift + case $ac_optarg in + *\'*) ac_optarg=`$as_echo "$ac_optarg" | sed "s/'/'\\\\\\\\''/g"` ;; + esac + as_fn_append CONFIG_HEADERS " '$ac_optarg'" + ac_need_defaults=false;; + --he | --h) + # Conflict between --help and --header + as_fn_error $? "ambiguous option: \`$1' +Try \`$0 --help' for more information.";; + --help | --hel | -h ) + $as_echo "$ac_cs_usage"; exit ;; + -q | -quiet | --quiet | --quie | --qui | --qu | --q \ + | -silent | --silent | --silen | --sile | --sil | --si | --s) + ac_cs_silent=: ;; + + # This is an error. + -*) as_fn_error $? "unrecognized option: \`$1' +Try \`$0 --help' for more information." ;; + + *) as_fn_append ac_config_targets " $1" + ac_need_defaults=false ;; + + esac + shift +done + +ac_configure_extra_args= + +if $ac_cs_silent; then + exec 6>/dev/null + ac_configure_extra_args="$ac_configure_extra_args --silent" +fi + +_ACEOF +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 +if \$ac_cs_recheck; then + set X $SHELL '$0' $ac_configure_args \$ac_configure_extra_args --no-create --no-recursion + shift + \$as_echo "running CONFIG_SHELL=$SHELL \$*" >&6 + CONFIG_SHELL='$SHELL' + export CONFIG_SHELL + exec "\$@" +fi + +_ACEOF +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +exec 5>>config.log +{ + echo + sed 'h;s/./-/g;s/^.../## /;s/...$/ ##/;p;x;p;x' <<_ASBOX +## Running $as_me. ## +_ASBOX + $as_echo "$ac_log" +} >&5 + +_ACEOF +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 +# +# INIT-COMMANDS +# +AMDEP_TRUE="$AMDEP_TRUE" ac_aux_dir="$ac_aux_dir" + + +# The HP-UX ksh and POSIX shell print the target directory to stdout +# if CDPATH is set. +(unset CDPATH) >/dev/null 2>&1 && unset CDPATH + +sed_quote_subst='$sed_quote_subst' +double_quote_subst='$double_quote_subst' +delay_variable_subst='$delay_variable_subst' +macro_version='`$ECHO "$macro_version" | $SED "$delay_single_quote_subst"`' +macro_revision='`$ECHO "$macro_revision" | $SED "$delay_single_quote_subst"`' +enable_shared='`$ECHO "$enable_shared" | $SED "$delay_single_quote_subst"`' +enable_static='`$ECHO "$enable_static" | $SED "$delay_single_quote_subst"`' +pic_mode='`$ECHO "$pic_mode" | $SED "$delay_single_quote_subst"`' +enable_fast_install='`$ECHO "$enable_fast_install" | $SED "$delay_single_quote_subst"`' +shared_archive_member_spec='`$ECHO "$shared_archive_member_spec" | $SED "$delay_single_quote_subst"`' +SHELL='`$ECHO "$SHELL" | $SED "$delay_single_quote_subst"`' +ECHO='`$ECHO "$ECHO" | $SED "$delay_single_quote_subst"`' +PATH_SEPARATOR='`$ECHO "$PATH_SEPARATOR" | $SED "$delay_single_quote_subst"`' +host_alias='`$ECHO "$host_alias" | $SED "$delay_single_quote_subst"`' +host='`$ECHO "$host" | $SED "$delay_single_quote_subst"`' +host_os='`$ECHO "$host_os" | $SED "$delay_single_quote_subst"`' +build_alias='`$ECHO "$build_alias" | $SED "$delay_single_quote_subst"`' +build='`$ECHO "$build" | $SED "$delay_single_quote_subst"`' +build_os='`$ECHO "$build_os" | $SED "$delay_single_quote_subst"`' +SED='`$ECHO "$SED" | $SED "$delay_single_quote_subst"`' +Xsed='`$ECHO "$Xsed" | $SED "$delay_single_quote_subst"`' +GREP='`$ECHO "$GREP" | $SED "$delay_single_quote_subst"`' +EGREP='`$ECHO "$EGREP" | $SED "$delay_single_quote_subst"`' +FGREP='`$ECHO "$FGREP" | $SED "$delay_single_quote_subst"`' +LD='`$ECHO "$LD" | $SED "$delay_single_quote_subst"`' +NM='`$ECHO "$NM" | $SED "$delay_single_quote_subst"`' +LN_S='`$ECHO "$LN_S" | $SED 
"$delay_single_quote_subst"`' +max_cmd_len='`$ECHO "$max_cmd_len" | $SED "$delay_single_quote_subst"`' +ac_objext='`$ECHO "$ac_objext" | $SED "$delay_single_quote_subst"`' +exeext='`$ECHO "$exeext" | $SED "$delay_single_quote_subst"`' +lt_unset='`$ECHO "$lt_unset" | $SED "$delay_single_quote_subst"`' +lt_SP2NL='`$ECHO "$lt_SP2NL" | $SED "$delay_single_quote_subst"`' +lt_NL2SP='`$ECHO "$lt_NL2SP" | $SED "$delay_single_quote_subst"`' +lt_cv_to_host_file_cmd='`$ECHO "$lt_cv_to_host_file_cmd" | $SED "$delay_single_quote_subst"`' +lt_cv_to_tool_file_cmd='`$ECHO "$lt_cv_to_tool_file_cmd" | $SED "$delay_single_quote_subst"`' +reload_flag='`$ECHO "$reload_flag" | $SED "$delay_single_quote_subst"`' +reload_cmds='`$ECHO "$reload_cmds" | $SED "$delay_single_quote_subst"`' +OBJDUMP='`$ECHO "$OBJDUMP" | $SED "$delay_single_quote_subst"`' +deplibs_check_method='`$ECHO "$deplibs_check_method" | $SED "$delay_single_quote_subst"`' +file_magic_cmd='`$ECHO "$file_magic_cmd" | $SED "$delay_single_quote_subst"`' +file_magic_glob='`$ECHO "$file_magic_glob" | $SED "$delay_single_quote_subst"`' +want_nocaseglob='`$ECHO "$want_nocaseglob" | $SED "$delay_single_quote_subst"`' +DLLTOOL='`$ECHO "$DLLTOOL" | $SED "$delay_single_quote_subst"`' +sharedlib_from_linklib_cmd='`$ECHO "$sharedlib_from_linklib_cmd" | $SED "$delay_single_quote_subst"`' +AR='`$ECHO "$AR" | $SED "$delay_single_quote_subst"`' +AR_FLAGS='`$ECHO "$AR_FLAGS" | $SED "$delay_single_quote_subst"`' +archiver_list_spec='`$ECHO "$archiver_list_spec" | $SED "$delay_single_quote_subst"`' +STRIP='`$ECHO "$STRIP" | $SED "$delay_single_quote_subst"`' +RANLIB='`$ECHO "$RANLIB" | $SED "$delay_single_quote_subst"`' +old_postinstall_cmds='`$ECHO "$old_postinstall_cmds" | $SED "$delay_single_quote_subst"`' +old_postuninstall_cmds='`$ECHO "$old_postuninstall_cmds" | $SED "$delay_single_quote_subst"`' +old_archive_cmds='`$ECHO "$old_archive_cmds" | $SED "$delay_single_quote_subst"`' +lock_old_archive_extraction='`$ECHO "$lock_old_archive_extraction" | $SED "$delay_single_quote_subst"`' +CC='`$ECHO "$CC" | $SED "$delay_single_quote_subst"`' +CFLAGS='`$ECHO "$CFLAGS" | $SED "$delay_single_quote_subst"`' +compiler='`$ECHO "$compiler" | $SED "$delay_single_quote_subst"`' +GCC='`$ECHO "$GCC" | $SED "$delay_single_quote_subst"`' +lt_cv_sys_global_symbol_pipe='`$ECHO "$lt_cv_sys_global_symbol_pipe" | $SED "$delay_single_quote_subst"`' +lt_cv_sys_global_symbol_to_cdecl='`$ECHO "$lt_cv_sys_global_symbol_to_cdecl" | $SED "$delay_single_quote_subst"`' +lt_cv_sys_global_symbol_to_import='`$ECHO "$lt_cv_sys_global_symbol_to_import" | $SED "$delay_single_quote_subst"`' +lt_cv_sys_global_symbol_to_c_name_address='`$ECHO "$lt_cv_sys_global_symbol_to_c_name_address" | $SED "$delay_single_quote_subst"`' +lt_cv_sys_global_symbol_to_c_name_address_lib_prefix='`$ECHO "$lt_cv_sys_global_symbol_to_c_name_address_lib_prefix" | $SED "$delay_single_quote_subst"`' +lt_cv_nm_interface='`$ECHO "$lt_cv_nm_interface" | $SED "$delay_single_quote_subst"`' +nm_file_list_spec='`$ECHO "$nm_file_list_spec" | $SED "$delay_single_quote_subst"`' +lt_sysroot='`$ECHO "$lt_sysroot" | $SED "$delay_single_quote_subst"`' +lt_cv_truncate_bin='`$ECHO "$lt_cv_truncate_bin" | $SED "$delay_single_quote_subst"`' +objdir='`$ECHO "$objdir" | $SED "$delay_single_quote_subst"`' +MAGIC_CMD='`$ECHO "$MAGIC_CMD" | $SED "$delay_single_quote_subst"`' +lt_prog_compiler_no_builtin_flag='`$ECHO "$lt_prog_compiler_no_builtin_flag" | $SED "$delay_single_quote_subst"`' +lt_prog_compiler_pic='`$ECHO "$lt_prog_compiler_pic" | $SED 
"$delay_single_quote_subst"`' +lt_prog_compiler_wl='`$ECHO "$lt_prog_compiler_wl" | $SED "$delay_single_quote_subst"`' +lt_prog_compiler_static='`$ECHO "$lt_prog_compiler_static" | $SED "$delay_single_quote_subst"`' +lt_cv_prog_compiler_c_o='`$ECHO "$lt_cv_prog_compiler_c_o" | $SED "$delay_single_quote_subst"`' +need_locks='`$ECHO "$need_locks" | $SED "$delay_single_quote_subst"`' +MANIFEST_TOOL='`$ECHO "$MANIFEST_TOOL" | $SED "$delay_single_quote_subst"`' +DSYMUTIL='`$ECHO "$DSYMUTIL" | $SED "$delay_single_quote_subst"`' +NMEDIT='`$ECHO "$NMEDIT" | $SED "$delay_single_quote_subst"`' +LIPO='`$ECHO "$LIPO" | $SED "$delay_single_quote_subst"`' +OTOOL='`$ECHO "$OTOOL" | $SED "$delay_single_quote_subst"`' +OTOOL64='`$ECHO "$OTOOL64" | $SED "$delay_single_quote_subst"`' +libext='`$ECHO "$libext" | $SED "$delay_single_quote_subst"`' +shrext_cmds='`$ECHO "$shrext_cmds" | $SED "$delay_single_quote_subst"`' +extract_expsyms_cmds='`$ECHO "$extract_expsyms_cmds" | $SED "$delay_single_quote_subst"`' +archive_cmds_need_lc='`$ECHO "$archive_cmds_need_lc" | $SED "$delay_single_quote_subst"`' +enable_shared_with_static_runtimes='`$ECHO "$enable_shared_with_static_runtimes" | $SED "$delay_single_quote_subst"`' +export_dynamic_flag_spec='`$ECHO "$export_dynamic_flag_spec" | $SED "$delay_single_quote_subst"`' +whole_archive_flag_spec='`$ECHO "$whole_archive_flag_spec" | $SED "$delay_single_quote_subst"`' +compiler_needs_object='`$ECHO "$compiler_needs_object" | $SED "$delay_single_quote_subst"`' +old_archive_from_new_cmds='`$ECHO "$old_archive_from_new_cmds" | $SED "$delay_single_quote_subst"`' +old_archive_from_expsyms_cmds='`$ECHO "$old_archive_from_expsyms_cmds" | $SED "$delay_single_quote_subst"`' +archive_cmds='`$ECHO "$archive_cmds" | $SED "$delay_single_quote_subst"`' +archive_expsym_cmds='`$ECHO "$archive_expsym_cmds" | $SED "$delay_single_quote_subst"`' +module_cmds='`$ECHO "$module_cmds" | $SED "$delay_single_quote_subst"`' +module_expsym_cmds='`$ECHO "$module_expsym_cmds" | $SED "$delay_single_quote_subst"`' +with_gnu_ld='`$ECHO "$with_gnu_ld" | $SED "$delay_single_quote_subst"`' +allow_undefined_flag='`$ECHO "$allow_undefined_flag" | $SED "$delay_single_quote_subst"`' +no_undefined_flag='`$ECHO "$no_undefined_flag" | $SED "$delay_single_quote_subst"`' +hardcode_libdir_flag_spec='`$ECHO "$hardcode_libdir_flag_spec" | $SED "$delay_single_quote_subst"`' +hardcode_libdir_separator='`$ECHO "$hardcode_libdir_separator" | $SED "$delay_single_quote_subst"`' +hardcode_direct='`$ECHO "$hardcode_direct" | $SED "$delay_single_quote_subst"`' +hardcode_direct_absolute='`$ECHO "$hardcode_direct_absolute" | $SED "$delay_single_quote_subst"`' +hardcode_minus_L='`$ECHO "$hardcode_minus_L" | $SED "$delay_single_quote_subst"`' +hardcode_shlibpath_var='`$ECHO "$hardcode_shlibpath_var" | $SED "$delay_single_quote_subst"`' +hardcode_automatic='`$ECHO "$hardcode_automatic" | $SED "$delay_single_quote_subst"`' +inherit_rpath='`$ECHO "$inherit_rpath" | $SED "$delay_single_quote_subst"`' +link_all_deplibs='`$ECHO "$link_all_deplibs" | $SED "$delay_single_quote_subst"`' +always_export_symbols='`$ECHO "$always_export_symbols" | $SED "$delay_single_quote_subst"`' +export_symbols_cmds='`$ECHO "$export_symbols_cmds" | $SED "$delay_single_quote_subst"`' +exclude_expsyms='`$ECHO "$exclude_expsyms" | $SED "$delay_single_quote_subst"`' +include_expsyms='`$ECHO "$include_expsyms" | $SED "$delay_single_quote_subst"`' +prelink_cmds='`$ECHO "$prelink_cmds" | $SED "$delay_single_quote_subst"`' +postlink_cmds='`$ECHO "$postlink_cmds" | 
$SED "$delay_single_quote_subst"`' +file_list_spec='`$ECHO "$file_list_spec" | $SED "$delay_single_quote_subst"`' +variables_saved_for_relink='`$ECHO "$variables_saved_for_relink" | $SED "$delay_single_quote_subst"`' +need_lib_prefix='`$ECHO "$need_lib_prefix" | $SED "$delay_single_quote_subst"`' +need_version='`$ECHO "$need_version" | $SED "$delay_single_quote_subst"`' +version_type='`$ECHO "$version_type" | $SED "$delay_single_quote_subst"`' +runpath_var='`$ECHO "$runpath_var" | $SED "$delay_single_quote_subst"`' +shlibpath_var='`$ECHO "$shlibpath_var" | $SED "$delay_single_quote_subst"`' +shlibpath_overrides_runpath='`$ECHO "$shlibpath_overrides_runpath" | $SED "$delay_single_quote_subst"`' +libname_spec='`$ECHO "$libname_spec" | $SED "$delay_single_quote_subst"`' +library_names_spec='`$ECHO "$library_names_spec" | $SED "$delay_single_quote_subst"`' +soname_spec='`$ECHO "$soname_spec" | $SED "$delay_single_quote_subst"`' +install_override_mode='`$ECHO "$install_override_mode" | $SED "$delay_single_quote_subst"`' +postinstall_cmds='`$ECHO "$postinstall_cmds" | $SED "$delay_single_quote_subst"`' +postuninstall_cmds='`$ECHO "$postuninstall_cmds" | $SED "$delay_single_quote_subst"`' +finish_cmds='`$ECHO "$finish_cmds" | $SED "$delay_single_quote_subst"`' +finish_eval='`$ECHO "$finish_eval" | $SED "$delay_single_quote_subst"`' +hardcode_into_libs='`$ECHO "$hardcode_into_libs" | $SED "$delay_single_quote_subst"`' +sys_lib_search_path_spec='`$ECHO "$sys_lib_search_path_spec" | $SED "$delay_single_quote_subst"`' +configure_time_dlsearch_path='`$ECHO "$configure_time_dlsearch_path" | $SED "$delay_single_quote_subst"`' +configure_time_lt_sys_library_path='`$ECHO "$configure_time_lt_sys_library_path" | $SED "$delay_single_quote_subst"`' +hardcode_action='`$ECHO "$hardcode_action" | $SED "$delay_single_quote_subst"`' +enable_dlopen='`$ECHO "$enable_dlopen" | $SED "$delay_single_quote_subst"`' +enable_dlopen_self='`$ECHO "$enable_dlopen_self" | $SED "$delay_single_quote_subst"`' +enable_dlopen_self_static='`$ECHO "$enable_dlopen_self_static" | $SED "$delay_single_quote_subst"`' +old_striplib='`$ECHO "$old_striplib" | $SED "$delay_single_quote_subst"`' +striplib='`$ECHO "$striplib" | $SED "$delay_single_quote_subst"`' + +LTCC='$LTCC' +LTCFLAGS='$LTCFLAGS' +compiler='$compiler_DEFAULT' + +# A function that is used when there is no print builtin or printf. +func_fallback_echo () +{ + eval 'cat <<_LTECHO_EOF +\$1 +_LTECHO_EOF' +} + +# Quote evaled strings. 
+for var in SHELL \ +ECHO \ +PATH_SEPARATOR \ +SED \ +GREP \ +EGREP \ +FGREP \ +LD \ +NM \ +LN_S \ +lt_SP2NL \ +lt_NL2SP \ +reload_flag \ +OBJDUMP \ +deplibs_check_method \ +file_magic_cmd \ +file_magic_glob \ +want_nocaseglob \ +DLLTOOL \ +sharedlib_from_linklib_cmd \ +AR \ +AR_FLAGS \ +archiver_list_spec \ +STRIP \ +RANLIB \ +CC \ +CFLAGS \ +compiler \ +lt_cv_sys_global_symbol_pipe \ +lt_cv_sys_global_symbol_to_cdecl \ +lt_cv_sys_global_symbol_to_import \ +lt_cv_sys_global_symbol_to_c_name_address \ +lt_cv_sys_global_symbol_to_c_name_address_lib_prefix \ +lt_cv_nm_interface \ +nm_file_list_spec \ +lt_cv_truncate_bin \ +lt_prog_compiler_no_builtin_flag \ +lt_prog_compiler_pic \ +lt_prog_compiler_wl \ +lt_prog_compiler_static \ +lt_cv_prog_compiler_c_o \ +need_locks \ +MANIFEST_TOOL \ +DSYMUTIL \ +NMEDIT \ +LIPO \ +OTOOL \ +OTOOL64 \ +shrext_cmds \ +export_dynamic_flag_spec \ +whole_archive_flag_spec \ +compiler_needs_object \ +with_gnu_ld \ +allow_undefined_flag \ +no_undefined_flag \ +hardcode_libdir_flag_spec \ +hardcode_libdir_separator \ +exclude_expsyms \ +include_expsyms \ +file_list_spec \ +variables_saved_for_relink \ +libname_spec \ +library_names_spec \ +soname_spec \ +install_override_mode \ +finish_eval \ +old_striplib \ +striplib; do + case \`eval \\\\\$ECHO \\\\""\\\\\$\$var"\\\\"\` in + *[\\\\\\\`\\"\\\$]*) + eval "lt_\$var=\\\\\\"\\\`\\\$ECHO \\"\\\$\$var\\" | \\\$SED \\"\\\$sed_quote_subst\\"\\\`\\\\\\"" ## exclude from sc_prohibit_nested_quotes + ;; + *) + eval "lt_\$var=\\\\\\"\\\$\$var\\\\\\"" + ;; + esac +done + +# Double-quote double-evaled strings. +for var in reload_cmds \ +old_postinstall_cmds \ +old_postuninstall_cmds \ +old_archive_cmds \ +extract_expsyms_cmds \ +old_archive_from_new_cmds \ +old_archive_from_expsyms_cmds \ +archive_cmds \ +archive_expsym_cmds \ +module_cmds \ +module_expsym_cmds \ +export_symbols_cmds \ +prelink_cmds \ +postlink_cmds \ +postinstall_cmds \ +postuninstall_cmds \ +finish_cmds \ +sys_lib_search_path_spec \ +configure_time_dlsearch_path \ +configure_time_lt_sys_library_path; do + case \`eval \\\\\$ECHO \\\\""\\\\\$\$var"\\\\"\` in + *[\\\\\\\`\\"\\\$]*) + eval "lt_\$var=\\\\\\"\\\`\\\$ECHO \\"\\\$\$var\\" | \\\$SED -e \\"\\\$double_quote_subst\\" -e \\"\\\$sed_quote_subst\\" -e \\"\\\$delay_variable_subst\\"\\\`\\\\\\"" ## exclude from sc_prohibit_nested_quotes + ;; + *) + eval "lt_\$var=\\\\\\"\\\$\$var\\\\\\"" + ;; + esac +done + +ac_aux_dir='$ac_aux_dir' + +# See if we are running on zsh, and set the options that allow our +# commands through without removal of \ escapes INIT. +if test -n "\${ZSH_VERSION+set}"; then + setopt NO_GLOB_SUBST +fi + + + PACKAGE='$PACKAGE' + VERSION='$VERSION' + RM='$RM' + ofile='$ofile' + + + + +_ACEOF + +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 + +# Handling of arguments. 
+for ac_config_target in $ac_config_targets +do + case $ac_config_target in + "config.h") CONFIG_HEADERS="$CONFIG_HEADERS config.h" ;; + "depfiles") CONFIG_COMMANDS="$CONFIG_COMMANDS depfiles" ;; + "libtool") CONFIG_COMMANDS="$CONFIG_COMMANDS libtool" ;; + "Makefile") CONFIG_FILES="$CONFIG_FILES Makefile" ;; + "lognorm.pc") CONFIG_FILES="$CONFIG_FILES lognorm.pc" ;; + "compat/Makefile") CONFIG_FILES="$CONFIG_FILES compat/Makefile" ;; + "doc/Makefile") CONFIG_FILES="$CONFIG_FILES doc/Makefile" ;; + "src/Makefile") CONFIG_FILES="$CONFIG_FILES src/Makefile" ;; + "src/lognorm-features.h") CONFIG_FILES="$CONFIG_FILES src/lognorm-features.h" ;; + "tools/Makefile") CONFIG_FILES="$CONFIG_FILES tools/Makefile" ;; + "tests/Makefile") CONFIG_FILES="$CONFIG_FILES tests/Makefile" ;; + "tests/options.sh") CONFIG_FILES="$CONFIG_FILES tests/options.sh" ;; + + *) as_fn_error $? "invalid argument: \`$ac_config_target'" "$LINENO" 5;; + esac +done + + +# If the user did not use the arguments to specify the items to instantiate, +# then the envvar interface is used. Set only those that are not. +# We use the long form for the default assignment because of an extremely +# bizarre bug on SunOS 4.1.3. +if $ac_need_defaults; then + test "${CONFIG_FILES+set}" = set || CONFIG_FILES=$config_files + test "${CONFIG_HEADERS+set}" = set || CONFIG_HEADERS=$config_headers + test "${CONFIG_COMMANDS+set}" = set || CONFIG_COMMANDS=$config_commands +fi + +# Have a temporary directory for convenience. Make it in the build tree +# simply because there is no reason against having it here, and in addition, +# creating and moving files from /tmp can sometimes cause problems. +# Hook for its removal unless debugging. +# Note that there is a small window in which the directory will not be cleaned: +# after its creation but before its name has been assigned to `$tmp'. +$debug || +{ + tmp= ac_tmp= + trap 'exit_status=$? + : "${ac_tmp:=$tmp}" + { test ! -d "$ac_tmp" || rm -fr "$ac_tmp"; } && exit $exit_status +' 0 + trap 'as_fn_exit 1' 1 2 13 15 +} +# Create a (secure) tmp directory for tmp files. + +{ + tmp=`(umask 077 && mktemp -d "./confXXXXXX") 2>/dev/null` && + test -d "$tmp" +} || +{ + tmp=./conf$$-$RANDOM + (umask 077 && mkdir "$tmp") +} || as_fn_error $? "cannot create a temporary directory in ." "$LINENO" 5 +ac_tmp=$tmp + +# Set up the scripts for CONFIG_FILES section. +# No need to generate them if there are no CONFIG_FILES. +# This happens for instance with `./config.status config.h'. +if test -n "$CONFIG_FILES"; then + + +ac_cr=`echo X | tr X '\015'` +# On cygwin, bash can eat \r inside `` if the user requested igncr. +# But we know of no other shell where ac_cr would be empty at this +# point, so we can use a bashism as a fallback. +if test "x$ac_cr" = x; then + eval ac_cr=\$\'\\r\' +fi +ac_cs_awk_cr=`$AWK 'BEGIN { print "a\rb" }' /dev/null` +if test "$ac_cs_awk_cr" = "a${ac_cr}b"; then + ac_cs_awk_cr='\\r' +else + ac_cs_awk_cr=$ac_cr +fi + +echo 'BEGIN {' >"$ac_tmp/subs1.awk" && +_ACEOF + + +{ + echo "cat >conf$$subs.awk <<_ACEOF" && + echo "$ac_subst_vars" | sed 's/.*/&!$&$ac_delim/' && + echo "_ACEOF" +} >conf$$subs.sh || + as_fn_error $? "could not make $CONFIG_STATUS" "$LINENO" 5 +ac_delim_num=`echo "$ac_subst_vars" | grep -c '^'` +ac_delim='%!_!# ' +for ac_last_try in false false false false false :; do + . ./conf$$subs.sh || + as_fn_error $? 
"could not make $CONFIG_STATUS" "$LINENO" 5 + + ac_delim_n=`sed -n "s/.*$ac_delim\$/X/p" conf$$subs.awk | grep -c X` + if test $ac_delim_n = $ac_delim_num; then + break + elif $ac_last_try; then + as_fn_error $? "could not make $CONFIG_STATUS" "$LINENO" 5 + else + ac_delim="$ac_delim!$ac_delim _$ac_delim!! " + fi +done +rm -f conf$$subs.sh + +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 +cat >>"\$ac_tmp/subs1.awk" <<\\_ACAWK && +_ACEOF +sed -n ' +h +s/^/S["/; s/!.*/"]=/ +p +g +s/^[^!]*!// +:repl +t repl +s/'"$ac_delim"'$// +t delim +:nl +h +s/\(.\{148\}\)..*/\1/ +t more1 +s/["\\]/\\&/g; s/^/"/; s/$/\\n"\\/ +p +n +b repl +:more1 +s/["\\]/\\&/g; s/^/"/; s/$/"\\/ +p +g +s/.\{148\}// +t nl +:delim +h +s/\(.\{148\}\)..*/\1/ +t more2 +s/["\\]/\\&/g; s/^/"/; s/$/"/ +p +b +:more2 +s/["\\]/\\&/g; s/^/"/; s/$/"\\/ +p +g +s/.\{148\}// +t delim +' >$CONFIG_STATUS || ac_write_fail=1 +rm -f conf$$subs.awk +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 +_ACAWK +cat >>"\$ac_tmp/subs1.awk" <<_ACAWK && + for (key in S) S_is_set[key] = 1 + FS = "" + +} +{ + line = $ 0 + nfields = split(line, field, "@") + substed = 0 + len = length(field[1]) + for (i = 2; i < nfields; i++) { + key = field[i] + keylen = length(key) + if (S_is_set[key]) { + value = S[key] + line = substr(line, 1, len) "" value "" substr(line, len + keylen + 3) + len += length(value) + length(field[++i]) + substed = 1 + } else + len += 1 + keylen + } + + print line +} + +_ACAWK +_ACEOF +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +if sed "s/$ac_cr//" < /dev/null > /dev/null 2>&1; then + sed "s/$ac_cr\$//; s/$ac_cr/$ac_cs_awk_cr/g" +else + cat +fi < "$ac_tmp/subs1.awk" > "$ac_tmp/subs.awk" \ + || as_fn_error $? "could not setup config files machinery" "$LINENO" 5 +_ACEOF + +# VPATH may cause trouble with some makes, so we remove sole $(srcdir), +# ${srcdir} and @srcdir@ entries from VPATH if srcdir is ".", strip leading and +# trailing colons and then remove the whole line if VPATH becomes empty +# (actually we leave an empty line to preserve line numbers). +if test "x$srcdir" = x.; then + ac_vpsub='/^[ ]*VPATH[ ]*=[ ]*/{ +h +s/// +s/^/:/ +s/[ ]*$/:/ +s/:\$(srcdir):/:/g +s/:\${srcdir}:/:/g +s/:@srcdir@:/:/g +s/^:*// +s/:*$// +x +s/\(=[ ]*\).*/\1/ +G +s/\n// +s/^[^=]*=[ ]*$// +}' +fi + +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +fi # test -n "$CONFIG_FILES" + +# Set up the scripts for CONFIG_HEADERS section. +# No need to generate them if there are no CONFIG_HEADERS. +# This happens for instance with `./config.status Makefile'. +if test -n "$CONFIG_HEADERS"; then +cat >"$ac_tmp/defines.awk" <<\_ACAWK || +BEGIN { +_ACEOF + +# Transform confdefs.h into an awk script `defines.awk', embedded as +# here-document in config.status, that substitutes the proper values into +# config.h.in to produce config.h. + +# Create a delimiter string that does not exist in confdefs.h, to ease +# handling of long lines. +ac_delim='%!_!# ' +for ac_last_try in false false :; do + ac_tt=`sed -n "/$ac_delim/p" confdefs.h` + if test -z "$ac_tt"; then + break + elif $ac_last_try; then + as_fn_error $? "could not make $CONFIG_HEADERS" "$LINENO" 5 + else + ac_delim="$ac_delim!$ac_delim _$ac_delim!! " + fi +done + +# For the awk script, D is an array of macro values keyed by name, +# likewise P contains macro parameters if any. Preserve backslash +# newline sequences. 
+ +ac_word_re=[_$as_cr_Letters][_$as_cr_alnum]* +sed -n ' +s/.\{148\}/&'"$ac_delim"'/g +t rset +:rset +s/^[ ]*#[ ]*define[ ][ ]*/ / +t def +d +:def +s/\\$// +t bsnl +s/["\\]/\\&/g +s/^ \('"$ac_word_re"'\)\(([^()]*)\)[ ]*\(.*\)/P["\1"]="\2"\ +D["\1"]=" \3"/p +s/^ \('"$ac_word_re"'\)[ ]*\(.*\)/D["\1"]=" \2"/p +d +:bsnl +s/["\\]/\\&/g +s/^ \('"$ac_word_re"'\)\(([^()]*)\)[ ]*\(.*\)/P["\1"]="\2"\ +D["\1"]=" \3\\\\\\n"\\/p +t cont +s/^ \('"$ac_word_re"'\)[ ]*\(.*\)/D["\1"]=" \2\\\\\\n"\\/p +t cont +d +:cont +n +s/.\{148\}/&'"$ac_delim"'/g +t clear +:clear +s/\\$// +t bsnlc +s/["\\]/\\&/g; s/^/"/; s/$/"/p +d +:bsnlc +s/["\\]/\\&/g; s/^/"/; s/$/\\\\\\n"\\/p +b cont +' >$CONFIG_STATUS || ac_write_fail=1 + +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 + for (key in D) D_is_set[key] = 1 + FS = "" +} +/^[\t ]*#[\t ]*(define|undef)[\t ]+$ac_word_re([\t (]|\$)/ { + line = \$ 0 + split(line, arg, " ") + if (arg[1] == "#") { + defundef = arg[2] + mac1 = arg[3] + } else { + defundef = substr(arg[1], 2) + mac1 = arg[2] + } + split(mac1, mac2, "(") #) + macro = mac2[1] + prefix = substr(line, 1, index(line, defundef) - 1) + if (D_is_set[macro]) { + # Preserve the white space surrounding the "#". + print prefix "define", macro P[macro] D[macro] + next + } else { + # Replace #undef with comments. This is necessary, for example, + # in the case of _POSIX_SOURCE, which is predefined and required + # on some systems where configure will not decide to define it. + if (defundef == "undef") { + print "/*", prefix defundef, macro, "*/" + next + } + } +} +{ print } +_ACAWK +_ACEOF +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 + as_fn_error $? "could not setup config headers machinery" "$LINENO" 5 +fi # test -n "$CONFIG_HEADERS" + + +eval set X " :F $CONFIG_FILES :H $CONFIG_HEADERS :C $CONFIG_COMMANDS" +shift +for ac_tag +do + case $ac_tag in + :[FHLC]) ac_mode=$ac_tag; continue;; + esac + case $ac_mode$ac_tag in + :[FHL]*:*);; + :L* | :C*:*) as_fn_error $? "invalid tag \`$ac_tag'" "$LINENO" 5;; + :[FH]-) ac_tag=-:-;; + :[FH]*) ac_tag=$ac_tag:$ac_tag.in;; + esac + ac_save_IFS=$IFS + IFS=: + set x $ac_tag + IFS=$ac_save_IFS + shift + ac_file=$1 + shift + + case $ac_mode in + :L) ac_source=$1;; + :[FH]) + ac_file_inputs= + for ac_f + do + case $ac_f in + -) ac_f="$ac_tmp/stdin";; + *) # Look for the file first in the build tree, then in the source tree + # (if the path is not absolute). The absolute path cannot be DOS-style, + # because $ac_f cannot contain `:'. + test -f "$ac_f" || + case $ac_f in + [\\/$]*) false;; + *) test -f "$srcdir/$ac_f" && ac_f="$srcdir/$ac_f";; + esac || + as_fn_error 1 "cannot find input file: \`$ac_f'" "$LINENO" 5;; + esac + case $ac_f in *\'*) ac_f=`$as_echo "$ac_f" | sed "s/'/'\\\\\\\\''/g"`;; esac + as_fn_append ac_file_inputs " '$ac_f'" + done + + # Let's still pretend it is `configure' which instantiates (i.e., don't + # use $as_me), people would be surprised to read: + # /* config.h. Generated by config.status. */ + configure_input='Generated from '` + $as_echo "$*" | sed 's|^[^:]*/||;s|:[^:]*/|, |g' + `' by configure.' + if test x"$ac_file" != x-; then + configure_input="$ac_file. $configure_input" + { $as_echo "$as_me:${as_lineno-$LINENO}: creating $ac_file" >&5 +$as_echo "$as_me: creating $ac_file" >&6;} + fi + # Neutralize special characters interpreted by sed in replacement strings. 
+ case $configure_input in #( + *\&* | *\|* | *\\* ) + ac_sed_conf_input=`$as_echo "$configure_input" | + sed 's/[\\\\&|]/\\\\&/g'`;; #( + *) ac_sed_conf_input=$configure_input;; + esac + + case $ac_tag in + *:-:* | *:-) cat >"$ac_tmp/stdin" \ + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 ;; + esac + ;; + esac + + ac_dir=`$as_dirname -- "$ac_file" || +$as_expr X"$ac_file" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \ + X"$ac_file" : 'X\(//\)[^/]' \| \ + X"$ac_file" : 'X\(//\)$' \| \ + X"$ac_file" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X"$ac_file" | + sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{ + s//\1/ + q + } + /^X\(\/\/\)[^/].*/{ + s//\1/ + q + } + /^X\(\/\/\)$/{ + s//\1/ + q + } + /^X\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + as_dir="$ac_dir"; as_fn_mkdir_p + ac_builddir=. + +case "$ac_dir" in +.) ac_dir_suffix= ac_top_builddir_sub=. ac_top_build_prefix= ;; +*) + ac_dir_suffix=/`$as_echo "$ac_dir" | sed 's|^\.[\\/]||'` + # A ".." for each directory in $ac_dir_suffix. + ac_top_builddir_sub=`$as_echo "$ac_dir_suffix" | sed 's|/[^\\/]*|/..|g;s|/||'` + case $ac_top_builddir_sub in + "") ac_top_builddir_sub=. ac_top_build_prefix= ;; + *) ac_top_build_prefix=$ac_top_builddir_sub/ ;; + esac ;; +esac +ac_abs_top_builddir=$ac_pwd +ac_abs_builddir=$ac_pwd$ac_dir_suffix +# for backward compatibility: +ac_top_builddir=$ac_top_build_prefix + +case $srcdir in + .) # We are building in place. + ac_srcdir=. + ac_top_srcdir=$ac_top_builddir_sub + ac_abs_top_srcdir=$ac_pwd ;; + [\\/]* | ?:[\\/]* ) # Absolute name. + ac_srcdir=$srcdir$ac_dir_suffix; + ac_top_srcdir=$srcdir + ac_abs_top_srcdir=$srcdir ;; + *) # Relative name. + ac_srcdir=$ac_top_build_prefix$srcdir$ac_dir_suffix + ac_top_srcdir=$ac_top_build_prefix$srcdir + ac_abs_top_srcdir=$ac_pwd/$srcdir ;; +esac +ac_abs_srcdir=$ac_abs_top_srcdir$ac_dir_suffix + + + case $ac_mode in + :F) + # + # CONFIG_FILE + # + + case $INSTALL in + [\\/$]* | ?:[\\/]* ) ac_INSTALL=$INSTALL ;; + *) ac_INSTALL=$ac_top_build_prefix$INSTALL ;; + esac + ac_MKDIR_P=$MKDIR_P + case $MKDIR_P in + [\\/$]* | ?:[\\/]* ) ;; + */*) ac_MKDIR_P=$ac_top_build_prefix$MKDIR_P ;; + esac +_ACEOF + +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +# If the template does not know about datarootdir, expand it. +# FIXME: This hack should be removed a few years after 2.60. +ac_datarootdir_hack=; ac_datarootdir_seen= +ac_sed_dataroot=' +/datarootdir/ { + p + q +} +/@datadir@/p +/@docdir@/p +/@infodir@/p +/@localedir@/p +/@mandir@/p' +case `eval "sed -n \"\$ac_sed_dataroot\" $ac_file_inputs"` in +*datarootdir*) ac_datarootdir_seen=yes;; +*@datadir@*|*@docdir@*|*@infodir@*|*@localedir@*|*@mandir@*) + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $ac_file_inputs seems to ignore the --datarootdir setting" >&5 +$as_echo "$as_me: WARNING: $ac_file_inputs seems to ignore the --datarootdir setting" >&2;} +_ACEOF +cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 + ac_datarootdir_hack=' + s&@datadir@&$datadir&g + s&@docdir@&$docdir&g + s&@infodir@&$infodir&g + s&@localedir@&$localedir&g + s&@mandir@&$mandir&g + s&\\\${datarootdir}&$datarootdir&g' ;; +esac +_ACEOF + +# Neutralize VPATH when `$srcdir' = `.'. +# Shell code in configure.ac might set extrasub. +# FIXME: do we really want to maintain this feature? 
+cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 +ac_sed_extra="$ac_vpsub +$extrasub +_ACEOF +cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 +:t +/@[a-zA-Z_][a-zA-Z_0-9]*@/!b +s|@configure_input@|$ac_sed_conf_input|;t t +s&@top_builddir@&$ac_top_builddir_sub&;t t +s&@top_build_prefix@&$ac_top_build_prefix&;t t +s&@srcdir@&$ac_srcdir&;t t +s&@abs_srcdir@&$ac_abs_srcdir&;t t +s&@top_srcdir@&$ac_top_srcdir&;t t +s&@abs_top_srcdir@&$ac_abs_top_srcdir&;t t +s&@builddir@&$ac_builddir&;t t +s&@abs_builddir@&$ac_abs_builddir&;t t +s&@abs_top_builddir@&$ac_abs_top_builddir&;t t +s&@INSTALL@&$ac_INSTALL&;t t +s&@MKDIR_P@&$ac_MKDIR_P&;t t +$ac_datarootdir_hack +" +eval sed \"\$ac_sed_extra\" "$ac_file_inputs" | $AWK -f "$ac_tmp/subs.awk" \ + >$ac_tmp/out || as_fn_error $? "could not create $ac_file" "$LINENO" 5 + +test -z "$ac_datarootdir_hack$ac_datarootdir_seen" && + { ac_out=`sed -n '/\${datarootdir}/p' "$ac_tmp/out"`; test -n "$ac_out"; } && + { ac_out=`sed -n '/^[ ]*datarootdir[ ]*:*=/p' \ + "$ac_tmp/out"`; test -z "$ac_out"; } && + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $ac_file contains a reference to the variable \`datarootdir' +which seems to be undefined. Please make sure it is defined" >&5 +$as_echo "$as_me: WARNING: $ac_file contains a reference to the variable \`datarootdir' +which seems to be undefined. Please make sure it is defined" >&2;} + + rm -f "$ac_tmp/stdin" + case $ac_file in + -) cat "$ac_tmp/out" && rm -f "$ac_tmp/out";; + *) rm -f "$ac_file" && mv "$ac_tmp/out" "$ac_file";; + esac \ + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 + ;; + :H) + # + # CONFIG_HEADER + # + if test x"$ac_file" != x-; then + { + $as_echo "/* $configure_input */" \ + && eval '$AWK -f "$ac_tmp/defines.awk"' "$ac_file_inputs" + } >"$ac_tmp/config.h" \ + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 + if diff "$ac_file" "$ac_tmp/config.h" >/dev/null 2>&1; then + { $as_echo "$as_me:${as_lineno-$LINENO}: $ac_file is unchanged" >&5 +$as_echo "$as_me: $ac_file is unchanged" >&6;} + else + rm -f "$ac_file" + mv "$ac_tmp/config.h" "$ac_file" \ + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 + fi + else + $as_echo "/* $configure_input */" \ + && eval '$AWK -f "$ac_tmp/defines.awk"' "$ac_file_inputs" \ + || as_fn_error $? "could not create -" "$LINENO" 5 + fi +# Compute "$ac_file"'s index in $config_headers. +_am_arg="$ac_file" +_am_stamp_count=1 +for _am_header in $config_headers :; do + case $_am_header in + $_am_arg | $_am_arg:* ) + break ;; + * ) + _am_stamp_count=`expr $_am_stamp_count + 1` ;; + esac +done +echo "timestamp for $_am_arg" >`$as_dirname -- "$_am_arg" || +$as_expr X"$_am_arg" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \ + X"$_am_arg" : 'X\(//\)[^/]' \| \ + X"$_am_arg" : 'X\(//\)$' \| \ + X"$_am_arg" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X"$_am_arg" | + sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{ + s//\1/ + q + } + /^X\(\/\/\)[^/].*/{ + s//\1/ + q + } + /^X\(\/\/\)$/{ + s//\1/ + q + } + /^X\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'`/stamp-h$_am_stamp_count + ;; + + :C) { $as_echo "$as_me:${as_lineno-$LINENO}: executing $ac_file commands" >&5 +$as_echo "$as_me: executing $ac_file commands" >&6;} + ;; + esac + + + case $ac_file$ac_mode in + "depfiles":C) test x"$AMDEP_TRUE" != x"" || { + # Older Autoconf quotes --file arguments for eval, but not when files + # are listed without --file. Let's play safe and only enable the eval + # if we detect the quoting. 
+ case $CONFIG_FILES in + *\'*) eval set x "$CONFIG_FILES" ;; + *) set x $CONFIG_FILES ;; + esac + shift + for mf + do + # Strip MF so we end up with the name of the file. + mf=`echo "$mf" | sed -e 's/:.*$//'` + # Check whether this is an Automake generated Makefile or not. + # We used to match only the files named 'Makefile.in', but + # some people rename them; so instead we look at the file content. + # Grep'ing the first line is not enough: some people post-process + # each Makefile.in and add a new line on top of each file to say so. + # Grep'ing the whole file is not good either: AIX grep has a line + # limit of 2048, but all sed's we know have understand at least 4000. + if sed -n 's,^#.*generated by automake.*,X,p' "$mf" | grep X >/dev/null 2>&1; then + dirpart=`$as_dirname -- "$mf" || +$as_expr X"$mf" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \ + X"$mf" : 'X\(//\)[^/]' \| \ + X"$mf" : 'X\(//\)$' \| \ + X"$mf" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X"$mf" | + sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{ + s//\1/ + q + } + /^X\(\/\/\)[^/].*/{ + s//\1/ + q + } + /^X\(\/\/\)$/{ + s//\1/ + q + } + /^X\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + else + continue + fi + # Extract the definition of DEPDIR, am__include, and am__quote + # from the Makefile without running 'make'. + DEPDIR=`sed -n 's/^DEPDIR = //p' < "$mf"` + test -z "$DEPDIR" && continue + am__include=`sed -n 's/^am__include = //p' < "$mf"` + test -z "$am__include" && continue + am__quote=`sed -n 's/^am__quote = //p' < "$mf"` + # Find all dependency output files, they are included files with + # $(DEPDIR) in their names. We invoke sed twice because it is the + # simplest approach to changing $(DEPDIR) to its actual value in the + # expansion. + for file in `sed -n " + s/^$am__include $am__quote\(.*(DEPDIR).*\)$am__quote"'$/\1/p' <"$mf" | \ + sed -e 's/\$(DEPDIR)/'"$DEPDIR"'/g'`; do + # Make sure the directory exists. + test -f "$dirpart/$file" && continue + fdir=`$as_dirname -- "$file" || +$as_expr X"$file" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \ + X"$file" : 'X\(//\)[^/]' \| \ + X"$file" : 'X\(//\)$' \| \ + X"$file" : 'X\(/\)' \| . 2>/dev/null || +$as_echo X"$file" | + sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{ + s//\1/ + q + } + /^X\(\/\/\)[^/].*/{ + s//\1/ + q + } + /^X\(\/\/\)$/{ + s//\1/ + q + } + /^X\(\/\).*/{ + s//\1/ + q + } + s/.*/./; q'` + as_dir=$dirpart/$fdir; as_fn_mkdir_p + # echo "creating $dirpart/$file" + echo '# dummy' > "$dirpart/$file" + done + done +} + ;; + "libtool":C) + + # See if we are running on zsh, and set the options that allow our + # commands through without removal of \ escapes. + if test -n "${ZSH_VERSION+set}"; then + setopt NO_GLOB_SUBST + fi + + cfgfile=${ofile}T + trap "$RM \"$cfgfile\"; exit 1" 1 2 15 + $RM "$cfgfile" + + cat <<_LT_EOF >> "$cfgfile" +#! $SHELL +# Generated automatically by $as_me ($PACKAGE) $VERSION +# Libtool was configured on host `(hostname || uname -n) 2>/dev/null | sed 1q`: +# NOTE: Changes made to this file will be lost: look at ltmain.sh. + +# Provide generalized library-building support services. +# Written by Gordon Matzigkeit, 1996 + +# Copyright (C) 2014 Free Software Foundation, Inc. +# This is free software; see the source for copying conditions. There is NO +# warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
+ +# GNU Libtool is free software; you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 2 of of the License, or +# (at your option) any later version. +# +# As a special exception to the GNU General Public License, if you +# distribute this file as part of a program or library that is built +# using GNU Libtool, you may include this file under the same +# distribution terms that you use for the rest of that program. +# +# GNU Libtool is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + + +# The names of the tagged configurations supported by this script. +available_tags='' + +# Configured defaults for sys_lib_dlsearch_path munging. +: \${LT_SYS_LIBRARY_PATH="$configure_time_lt_sys_library_path"} + +# ### BEGIN LIBTOOL CONFIG + +# Which release of libtool.m4 was used? +macro_version=$macro_version +macro_revision=$macro_revision + +# Whether or not to build shared libraries. +build_libtool_libs=$enable_shared + +# Whether or not to build static libraries. +build_old_libs=$enable_static + +# What type of objects to build. +pic_mode=$pic_mode + +# Whether or not to optimize for fast installation. +fast_install=$enable_fast_install + +# Shared archive member basename,for filename based shared library versioning on AIX. +shared_archive_member_spec=$shared_archive_member_spec + +# Shell to use when invoking shell scripts. +SHELL=$lt_SHELL + +# An echo program that protects backslashes. +ECHO=$lt_ECHO + +# The PATH separator for the build system. +PATH_SEPARATOR=$lt_PATH_SEPARATOR + +# The host system. +host_alias=$host_alias +host=$host +host_os=$host_os + +# The build system. +build_alias=$build_alias +build=$build +build_os=$build_os + +# A sed program that does not truncate output. +SED=$lt_SED + +# Sed that helps us avoid accidentally triggering echo(1) options like -n. +Xsed="\$SED -e 1s/^X//" + +# A grep program that handles long lines. +GREP=$lt_GREP + +# An ERE matcher. +EGREP=$lt_EGREP + +# A literal string matcher. +FGREP=$lt_FGREP + +# A BSD- or MS-compatible name lister. +NM=$lt_NM + +# Whether we need soft or hard links. +LN_S=$lt_LN_S + +# What is the maximum length of a command? +max_cmd_len=$max_cmd_len + +# Object file suffix (normally "o"). +objext=$ac_objext + +# Executable file suffix (normally ""). +exeext=$exeext + +# whether the shell understands "unset". +lt_unset=$lt_unset + +# turn spaces into newlines. +SP2NL=$lt_lt_SP2NL + +# turn newlines into spaces. +NL2SP=$lt_lt_NL2SP + +# convert \$build file names to \$host format. +to_host_file_cmd=$lt_cv_to_host_file_cmd + +# convert \$build files to toolchain format. +to_tool_file_cmd=$lt_cv_to_tool_file_cmd + +# An object symbol dumper. +OBJDUMP=$lt_OBJDUMP + +# Method to check whether dependent libraries are shared objects. +deplibs_check_method=$lt_deplibs_check_method + +# Command to use when deplibs_check_method = "file_magic". +file_magic_cmd=$lt_file_magic_cmd + +# How to find potential files when deplibs_check_method = "file_magic". +file_magic_glob=$lt_file_magic_glob + +# Find potential files using nocaseglob when deplibs_check_method = "file_magic". +want_nocaseglob=$lt_want_nocaseglob + +# DLL creation program. 
+DLLTOOL=$lt_DLLTOOL + +# Command to associate shared and link libraries. +sharedlib_from_linklib_cmd=$lt_sharedlib_from_linklib_cmd + +# The archiver. +AR=$lt_AR + +# Flags to create an archive. +AR_FLAGS=$lt_AR_FLAGS + +# How to feed a file listing to the archiver. +archiver_list_spec=$lt_archiver_list_spec + +# A symbol stripping program. +STRIP=$lt_STRIP + +# Commands used to install an old-style archive. +RANLIB=$lt_RANLIB +old_postinstall_cmds=$lt_old_postinstall_cmds +old_postuninstall_cmds=$lt_old_postuninstall_cmds + +# Whether to use a lock for old archive extraction. +lock_old_archive_extraction=$lock_old_archive_extraction + +# A C compiler. +LTCC=$lt_CC + +# LTCC compiler flags. +LTCFLAGS=$lt_CFLAGS + +# Take the output of nm and produce a listing of raw symbols and C names. +global_symbol_pipe=$lt_lt_cv_sys_global_symbol_pipe + +# Transform the output of nm in a proper C declaration. +global_symbol_to_cdecl=$lt_lt_cv_sys_global_symbol_to_cdecl + +# Transform the output of nm into a list of symbols to manually relocate. +global_symbol_to_import=$lt_lt_cv_sys_global_symbol_to_import + +# Transform the output of nm in a C name address pair. +global_symbol_to_c_name_address=$lt_lt_cv_sys_global_symbol_to_c_name_address + +# Transform the output of nm in a C name address pair when lib prefix is needed. +global_symbol_to_c_name_address_lib_prefix=$lt_lt_cv_sys_global_symbol_to_c_name_address_lib_prefix + +# The name lister interface. +nm_interface=$lt_lt_cv_nm_interface + +# Specify filename containing input files for \$NM. +nm_file_list_spec=$lt_nm_file_list_spec + +# The root where to search for dependent libraries,and where our libraries should be installed. +lt_sysroot=$lt_sysroot + +# Command to truncate a binary pipe. +lt_truncate_bin=$lt_lt_cv_truncate_bin + +# The name of the directory that contains temporary libtool files. +objdir=$objdir + +# Used to examine libraries when file_magic_cmd begins with "file". +MAGIC_CMD=$MAGIC_CMD + +# Must we lock files when doing compilation? +need_locks=$lt_need_locks + +# Manifest tool. +MANIFEST_TOOL=$lt_MANIFEST_TOOL + +# Tool to manipulate archived DWARF debug symbol files on Mac OS X. +DSYMUTIL=$lt_DSYMUTIL + +# Tool to change global to local symbols on Mac OS X. +NMEDIT=$lt_NMEDIT + +# Tool to manipulate fat objects and archives on Mac OS X. +LIPO=$lt_LIPO + +# ldd/readelf like tool for Mach-O binaries on Mac OS X. +OTOOL=$lt_OTOOL + +# ldd/readelf like tool for 64 bit Mach-O binaries on Mac OS X 10.4. +OTOOL64=$lt_OTOOL64 + +# Old archive suffix (normally "a"). +libext=$libext + +# Shared library suffix (normally ".so"). +shrext_cmds=$lt_shrext_cmds + +# The commands to extract the exported symbol list from a shared archive. +extract_expsyms_cmds=$lt_extract_expsyms_cmds + +# Variables whose values should be saved in libtool wrapper scripts and +# restored at link time. +variables_saved_for_relink=$lt_variables_saved_for_relink + +# Do we need the "lib" prefix for modules? +need_lib_prefix=$need_lib_prefix + +# Do we need a version for libraries? +need_version=$need_version + +# Library versioning type. +version_type=$version_type + +# Shared library runtime path variable. +runpath_var=$runpath_var + +# Shared library path variable. +shlibpath_var=$shlibpath_var + +# Is shlibpath searched before the hard-coded library search path? +shlibpath_overrides_runpath=$shlibpath_overrides_runpath + +# Format of library name prefix. +libname_spec=$lt_libname_spec + +# List of archive names. First name is the real one, the rest are links. 
+# The last name is the one that the linker finds with -lNAME +library_names_spec=$lt_library_names_spec + +# The coded name of the library, if different from the real name. +soname_spec=$lt_soname_spec + +# Permission mode override for installation of shared libraries. +install_override_mode=$lt_install_override_mode + +# Command to use after installation of a shared archive. +postinstall_cmds=$lt_postinstall_cmds + +# Command to use after uninstallation of a shared archive. +postuninstall_cmds=$lt_postuninstall_cmds + +# Commands used to finish a libtool library installation in a directory. +finish_cmds=$lt_finish_cmds + +# As "finish_cmds", except a single script fragment to be evaled but +# not shown. +finish_eval=$lt_finish_eval + +# Whether we should hardcode library paths into libraries. +hardcode_into_libs=$hardcode_into_libs + +# Compile-time system search path for libraries. +sys_lib_search_path_spec=$lt_sys_lib_search_path_spec + +# Detected run-time system search path for libraries. +sys_lib_dlsearch_path_spec=$lt_configure_time_dlsearch_path + +# Explicit LT_SYS_LIBRARY_PATH set during ./configure time. +configure_time_lt_sys_library_path=$lt_configure_time_lt_sys_library_path + +# Whether dlopen is supported. +dlopen_support=$enable_dlopen + +# Whether dlopen of programs is supported. +dlopen_self=$enable_dlopen_self + +# Whether dlopen of statically linked programs is supported. +dlopen_self_static=$enable_dlopen_self_static + +# Commands to strip libraries. +old_striplib=$lt_old_striplib +striplib=$lt_striplib + + +# The linker used to build libraries. +LD=$lt_LD + +# How to create reloadable object files. +reload_flag=$lt_reload_flag +reload_cmds=$lt_reload_cmds + +# Commands used to build an old-style archive. +old_archive_cmds=$lt_old_archive_cmds + +# A language specific compiler. +CC=$lt_compiler + +# Is the compiler the GNU compiler? +with_gcc=$GCC + +# Compiler flag to turn off builtin functions. +no_builtin_flag=$lt_lt_prog_compiler_no_builtin_flag + +# Additional compiler flags for building library objects. +pic_flag=$lt_lt_prog_compiler_pic + +# How to pass a linker flag through the compiler. +wl=$lt_lt_prog_compiler_wl + +# Compiler flag to prevent dynamic linking. +link_static_flag=$lt_lt_prog_compiler_static + +# Does compiler simultaneously support -c and -o options? +compiler_c_o=$lt_lt_cv_prog_compiler_c_o + +# Whether or not to add -lc for building shared libraries. +build_libtool_need_lc=$archive_cmds_need_lc + +# Whether or not to disallow shared libs when runtime libs are static. +allow_libtool_libs_with_static_runtimes=$enable_shared_with_static_runtimes + +# Compiler flag to allow reflexive dlopens. +export_dynamic_flag_spec=$lt_export_dynamic_flag_spec + +# Compiler flag to generate shared objects directly from archives. +whole_archive_flag_spec=$lt_whole_archive_flag_spec + +# Whether the compiler copes with passing no objects directly. +compiler_needs_object=$lt_compiler_needs_object + +# Create an old-style archive from a shared archive. +old_archive_from_new_cmds=$lt_old_archive_from_new_cmds + +# Create a temporary old-style archive to link instead of a shared archive. +old_archive_from_expsyms_cmds=$lt_old_archive_from_expsyms_cmds + +# Commands used to build a shared archive. +archive_cmds=$lt_archive_cmds +archive_expsym_cmds=$lt_archive_expsym_cmds + +# Commands used to build a loadable module if different from building +# a shared archive. 
+module_cmds=$lt_module_cmds +module_expsym_cmds=$lt_module_expsym_cmds + +# Whether we are building with GNU ld or not. +with_gnu_ld=$lt_with_gnu_ld + +# Flag that allows shared libraries with undefined symbols to be built. +allow_undefined_flag=$lt_allow_undefined_flag + +# Flag that enforces no undefined symbols. +no_undefined_flag=$lt_no_undefined_flag + +# Flag to hardcode \$libdir into a binary during linking. +# This must work even if \$libdir does not exist +hardcode_libdir_flag_spec=$lt_hardcode_libdir_flag_spec + +# Whether we need a single "-rpath" flag with a separated argument. +hardcode_libdir_separator=$lt_hardcode_libdir_separator + +# Set to "yes" if using DIR/libNAME\$shared_ext during linking hardcodes +# DIR into the resulting binary. +hardcode_direct=$hardcode_direct + +# Set to "yes" if using DIR/libNAME\$shared_ext during linking hardcodes +# DIR into the resulting binary and the resulting library dependency is +# "absolute",i.e impossible to change by setting \$shlibpath_var if the +# library is relocated. +hardcode_direct_absolute=$hardcode_direct_absolute + +# Set to "yes" if using the -LDIR flag during linking hardcodes DIR +# into the resulting binary. +hardcode_minus_L=$hardcode_minus_L + +# Set to "yes" if using SHLIBPATH_VAR=DIR during linking hardcodes DIR +# into the resulting binary. +hardcode_shlibpath_var=$hardcode_shlibpath_var + +# Set to "yes" if building a shared library automatically hardcodes DIR +# into the library and all subsequent libraries and executables linked +# against it. +hardcode_automatic=$hardcode_automatic + +# Set to yes if linker adds runtime paths of dependent libraries +# to runtime path list. +inherit_rpath=$inherit_rpath + +# Whether libtool must link a program against all its dependency libraries. +link_all_deplibs=$link_all_deplibs + +# Set to "yes" if exported symbols are required. +always_export_symbols=$always_export_symbols + +# The commands to list exported symbols. +export_symbols_cmds=$lt_export_symbols_cmds + +# Symbols that should not be listed in the preloaded symbols. +exclude_expsyms=$lt_exclude_expsyms + +# Symbols that must always be exported. +include_expsyms=$lt_include_expsyms + +# Commands necessary for linking programs (against libraries) with templates. +prelink_cmds=$lt_prelink_cmds + +# Commands necessary for finishing linking programs. +postlink_cmds=$lt_postlink_cmds + +# Specify filename containing input files. +file_list_spec=$lt_file_list_spec + +# How to hardcode a shared library path into an executable. 
+hardcode_action=$hardcode_action + +# ### END LIBTOOL CONFIG + +_LT_EOF + + cat <<'_LT_EOF' >> "$cfgfile" + +# ### BEGIN FUNCTIONS SHARED WITH CONFIGURE + +# func_munge_path_list VARIABLE PATH +# ----------------------------------- +# VARIABLE is name of variable containing _space_ separated list of +# directories to be munged by the contents of PATH, which is string +# having a format: +# "DIR[:DIR]:" +# string "DIR[ DIR]" will be prepended to VARIABLE +# ":DIR[:DIR]" +# string "DIR[ DIR]" will be appended to VARIABLE +# "DIRP[:DIRP]::[DIRA:]DIRA" +# string "DIRP[ DIRP]" will be prepended to VARIABLE and string +# "DIRA[ DIRA]" will be appended to VARIABLE +# "DIR[:DIR]" +# VARIABLE will be replaced by "DIR[ DIR]" +func_munge_path_list () +{ + case x$2 in + x) + ;; + *:) + eval $1=\"`$ECHO $2 | $SED 's/:/ /g'` \$$1\" + ;; + x:*) + eval $1=\"\$$1 `$ECHO $2 | $SED 's/:/ /g'`\" + ;; + *::*) + eval $1=\"\$$1\ `$ECHO $2 | $SED -e 's/.*:://' -e 's/:/ /g'`\" + eval $1=\"`$ECHO $2 | $SED -e 's/::.*//' -e 's/:/ /g'`\ \$$1\" + ;; + *) + eval $1=\"`$ECHO $2 | $SED 's/:/ /g'`\" + ;; + esac +} + + +# Calculate cc_basename. Skip known compiler wrappers and cross-prefix. +func_cc_basename () +{ + for cc_temp in $*""; do + case $cc_temp in + compile | *[\\/]compile | ccache | *[\\/]ccache ) ;; + distcc | *[\\/]distcc | purify | *[\\/]purify ) ;; + \-*) ;; + *) break;; + esac + done + func_cc_basename_result=`$ECHO "$cc_temp" | $SED "s%.*/%%; s%^$host_alias-%%"` +} + + +# ### END FUNCTIONS SHARED WITH CONFIGURE + +_LT_EOF + + case $host_os in + aix3*) + cat <<\_LT_EOF >> "$cfgfile" +# AIX sometimes has problems with the GCC collect2 program. For some +# reason, if we set the COLLECT_NAMES environment variable, the problems +# vanish in a puff of smoke. +if test set != "${COLLECT_NAMES+set}"; then + COLLECT_NAMES= + export COLLECT_NAMES +fi +_LT_EOF + ;; + esac + + +ltmain=$ac_aux_dir/ltmain.sh + + + # We use sed instead of cat because bash on DJGPP gets confused if + # if finds mixed CR/LF and LF-only lines. Since sed operates in + # text mode, it properly converts lines to CR/LF. This bash problem + # is reportedly fixed, but why not run on old versions too? + sed '$q' "$ltmain" >> "$cfgfile" \ + || (rm -f "$cfgfile"; exit 1) + + mv -f "$cfgfile" "$ofile" || + (rm -f "$ofile" && cp "$cfgfile" "$ofile" && rm -f "$cfgfile") + chmod +x "$ofile" + + ;; + + esac +done # for ac_tag + + +as_fn_exit 0 +_ACEOF +ac_clean_files=$ac_clean_files_save + +test $ac_write_fail = 0 || + as_fn_error $? "write failure creating $CONFIG_STATUS" "$LINENO" 5 + + +# configure is writing to config.log, and then calls config.status. +# config.status does its own redirection, appending to config.log. +# Unfortunately, on DOS this fails, as config.log is still kept open +# by configure, so config.status won't be able to write to it; its +# output is simply discarded. So we exec the FD to /dev/null, +# effectively closing config.log, so it can be properly (re)opened and +# appended to by config.status. When coming back to configure, we +# need to make the FD available again. +if test "$no_create" != yes; then + ac_cs_success=: + ac_config_status_args= + test "$silent" = yes && + ac_config_status_args="$ac_config_status_args --quiet" + exec 5>/dev/null + $SHELL $CONFIG_STATUS $ac_config_status_args || ac_cs_success=false + exec 5>>config.log + # Use ||, not &&, to avoid exiting from the if with $? = 1, which + # would make configure fail if this is the last instruction. 
+  $ac_cs_success || as_fn_exit 1
+fi
+if test -n "$ac_unrecognized_opts" && test "$enable_option_checking" != no; then
+  { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: unrecognized options: $ac_unrecognized_opts" >&5
+$as_echo "$as_me: WARNING: unrecognized options: $ac_unrecognized_opts" >&2;}
+fi
+
+
+
+
+echo "*****************************************************"
+echo "liblognorm will be compiled with the following settings:"
+echo
+echo "Regex enabled:                  $enable_regexp"
+echo "Advanced Statistics enabled:    $enable_advstats"
+echo "Testbench enabled:              $enable_testbench"
+echo "Valgrind enabled:               $enable_valgrind"
+echo "Debug mode enabled:             $enable_debug"
+echo "Tools enabled:                  $enable_tools"
+echo "Docs enabled:                   $enable_docs"
+
diff --git a/configure.ac b/configure.ac
new file mode 100644
index 0000000..94bb78b
--- /dev/null
+++ b/configure.ac
@@ -0,0 +1,198 @@
+#  -*- Autoconf -*-
+# Process this file with autoconf to produce a configure script.
+
+AC_PREREQ(2.61)
+AC_INIT([liblognorm], [2.0.5], [rgerhards@adiscon.com])
+AM_INIT_AUTOMAKE
+m4_ifdef([AM_SILENT_RULES], [AM_SILENT_RULES([yes])])
+AC_CONFIG_SRCDIR([src/lognorm.c])
+AC_CONFIG_HEADER([config.h])
+
+AC_USE_SYSTEM_EXTENSIONS
+
+# Checks for programs.
+AC_PROG_CC
+AM_PROG_CC_C_O
+AC_PROG_CC_C99
+AC_PROG_LIBTOOL
+
+m4_ifdef([AX_IS_RELEASE], [
+    AX_IS_RELEASE([git-directory])
+    m4_ifdef([AX_COMPILER_FLAGS], [
+        AX_COMPILER_FLAGS()
+    ], [
+        if test "$GCC" = "yes"
+        then CFLAGS="$CFLAGS -W -Wall -Wformat-security -Wshadow -Wcast-align -Wpointer-arith -Wmissing-format-attribute -g"
+        fi
+        AC_MSG_WARN([missing AX_COMPILER_FLAGS macro, not using it])
+    ])
+], [
+    if test "$GCC" = "yes"
+    then CFLAGS="$CFLAGS -W -Wall -Wformat-security -Wshadow -Wcast-align -Wpointer-arith -Wmissing-format-attribute -g"
+    fi
+    AC_MSG_WARN([missing AX_IS_RELEASE macro, not using AX_COMPILER_FLAGS macro because of this])
+])
+
+
+
+# Checks for libraries.
+save_LIBS=$LIBS
+LIBS=
+AC_SEARCH_LIBS(clock_gettime, rt)
+LIBS=$save_LIBS
+
+# Checks for header files.
+AC_HEADER_STDC
+#AC_CHECK_HEADERS([])
+
+# Checks for typedefs, structures, and compiler characteristics.
+AC_C_CONST
+AC_TYPE_SIZE_T
+#AC_HEADER_TIME
+#AC_STRUCT_TM
+
+# Checks for library functions.
+AC_FUNC_SELECT_ARGTYPES
+AC_TYPE_SIGNAL
+AC_FUNC_STRERROR_R
+AC_CHECK_FUNCS([strdup strndup strtok_r])
+
+LIBLOGNORM_CFLAGS="-I\$(top_srcdir)/src"
+LIBLOGNORM_LIBS="\$(top_builddir)/src/liblognorm.la"
+AC_SUBST(LIBLOGNORM_CFLAGS)
+AC_SUBST(LIBLOGNORM_LIBS)
+
+# modules we require
+PKG_CHECK_MODULES(LIBESTR, libestr >= 0.0.0)
+PKG_CHECK_MODULES(JSON_C, libfastjson,, )
+# add libestr flags to pkgconfig file for static linking
+AC_SUBST(pkg_config_libs_private, $LIBESTR_LIBS)
+
+# Regular expressions
+AC_ARG_ENABLE(regexp,
+        [AS_HELP_STRING([--enable-regexp],[Enable regular expressions support @<:@default=no@:>@])],
+        [case "${enableval}" in
+         yes) enable_regexp="yes" ;;
+          no) enable_regexp="no" ;;
+           *) AC_MSG_ERROR(bad value ${enableval} for --enable-regexp) ;;
+         esac],
+        [enable_regexp="no"]
+)
+AM_CONDITIONAL(ENABLE_REGEXP, test x$enable_regexp = xyes)
+if test "$enable_regexp" = "yes"; then
+        PKG_CHECK_MODULES(PCRE, libpcre)
+        AC_DEFINE(FEATURE_REGEXP, 1, [Regular expressions support enabled.])
+        FEATURE_REGEXP=1
+else
+        FEATURE_REGEXP=0
+fi
+AC_SUBST(FEATURE_REGEXP)
+
+# debug mode settings
+AC_ARG_ENABLE(debug,
+        [AS_HELP_STRING([--enable-debug],[Enable debug mode @<:@default=no@:>@])],
+        [case "${enableval}" in
+         yes) enable_debug="yes" ;;
+          no) enable_debug="no" ;;
+           *) AC_MSG_ERROR(bad value ${enableval} for --enable-debug) ;;
+         esac],
+        [enable_debug="no"]
+)
+if test "$enable_debug" = "yes"; then
+        AC_DEFINE(DEBUG, 1, [Defined if debug mode is enabled.])
+fi
+if test "$enable_debug" = "no"; then
+        AC_DEFINE(NDEBUG, 1, [Defined if debug mode is disabled.])
+fi
+
+# advanced statistics
+AC_ARG_ENABLE(advanced-stats,
+        [AS_HELP_STRING([--enable-advanced-stats],[Enable advanced statistics @<:@default=no@:>@])],
+        [case "${enableval}" in
+         yes) enable_advstats="yes" ;;
+          no) enable_advstats="no" ;;
+           *) AC_MSG_ERROR(bad value ${enableval} for --enable-advstats) ;;
+         esac],
+        [enable_advstats="no"]
+)
+if test "$enable_advstats" = "yes"; then
+        AC_DEFINE(ADVANCED_STATS, 1, [Defined if advanced statistics are enabled.])
+fi
+
+# docs (html) build settings
+AC_ARG_ENABLE(docs,
+        [AS_HELP_STRING([--enable-docs],[Enable building HTML docs (requires Sphinx) @<:@default=no@:>@])],
+        [case "${enableval}" in
+         yes) enable_docs="yes" ;;
+          no) enable_docs="no" ;;
+           *) AC_MSG_ERROR(bad value ${enableval} for --enable-docs) ;;
+         esac],
+        [enable_docs="no"]
+)
+if test "$enable_docs" = "yes"; then
+        AC_CHECK_PROGS([SPHINXBUILD], [sphinx-build sphinx-build3 sphinx-build2], [no])
+        if test "$SPHINXBUILD" = "no"; then
+                AC_MSG_ERROR([sphinx-build is required to build documentation, install it or try --disable-docs])
+        fi
+fi
+AM_CONDITIONAL([ENABLE_DOCS], [test "$enable_docs" = "yes"])
+
+AC_ARG_ENABLE(testbench,
+        [AS_HELP_STRING([--enable-testbench],[testbench enabled @<:@default=yes@:>@])],
+        [case "${enableval}" in
+         yes) enable_testbench="yes" ;;
+          no) enable_testbench="no" ;;
+           *) AC_MSG_ERROR(bad value ${enableval} for --enable-testbench) ;;
+         esac],
+        [enable_testbench=yes]
+)
+AM_CONDITIONAL(ENABLE_TESTBENCH, test x$enable_testbench = xyes)
+
+AC_ARG_ENABLE(valgrind,
+        [AS_HELP_STRING([--enable-valgrind],[valgrind enabled @<:@default=no@:>@])],
+        [case "${enableval}" in
+         yes) enable_valgrind="yes" ;;
+          no) enable_valgrind="no" ;;
+           *) AC_MSG_ERROR(bad value ${enableval} for --enable-valgrind) ;;
+         esac],
+        [enable_valgrind=no]
+)
+AM_CONDITIONAL(ENABLE_VALGRIND, test x$enable_valgrind = xyes)
+VALGRIND="$enable_valgrind"
+AC_SUBST(VALGRIND)
+
+AC_ARG_ENABLE(tools,
+
[AS_HELP_STRING([--enable-tools],[lognorm toolset enabled @<:@default=yes@:>@])], + [case "${enableval}" in + yes) enable_tools="yes" ;; + no) enable_tools="no" ;; + *) AC_MSG_ERROR(bad value ${enableval} for --enable-tools) ;; + esac], + [enable_tools=yes] +) +AM_CONDITIONAL(ENABLE_TOOLS, test x$enable_tools = xyes) + +AC_CONFIG_FILES([Makefile \ + lognorm.pc \ + compat/Makefile \ + doc/Makefile \ + src/Makefile \ + src/lognorm-features.h \ + tools/Makefile \ + tests/Makefile \ + tests/options.sh]) +AC_OUTPUT +AC_CONFIG_MACRO_DIR([m4]) + + +echo "*****************************************************" +echo "liblognorm will be compiled with the following settings:" +echo +echo "Regex enabled: $enable_regexp" +echo "Advanced Statistics enabled: $enable_advstats" +echo "Testbench enabled: $enable_testbench" +echo "Valgrind enabled: $enable_valgrind" +echo "Debug mode enabled: $enable_debug" +echo "Tools enabled: $enable_tools" +echo "Docs enabled: $enable_docs" + diff --git a/depcomp b/depcomp new file mode 100755 index 0000000..fc98710 --- /dev/null +++ b/depcomp @@ -0,0 +1,791 @@ +#! /bin/sh +# depcomp - compile a program generating dependencies as side-effects + +scriptversion=2013-05-30.07; # UTC + +# Copyright (C) 1999-2014 Free Software Foundation, Inc. + +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 2, or (at your option) +# any later version. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +# As a special exception to the GNU General Public License, if you +# distribute this file as part of a program that contains a +# configuration script generated by Autoconf, you may include it under +# the same distribution terms that you use for the rest of that program. + +# Originally written by Alexandre Oliva . + +case $1 in + '') + echo "$0: No command. Try '$0 --help' for more information." 1>&2 + exit 1; + ;; + -h | --h*) + cat <<\EOF +Usage: depcomp [--help] [--version] PROGRAM [ARGS] + +Run PROGRAMS ARGS to compile a file, generating dependencies +as side-effects. + +Environment variables: + depmode Dependency tracking mode. + source Source file read by 'PROGRAMS ARGS'. + object Object file output by 'PROGRAMS ARGS'. + DEPDIR directory where to store dependencies. + depfile Dependency file to output. + tmpdepfile Temporary file to use when outputting dependencies. + libtool Whether libtool is used (yes/no). + +Report bugs to . +EOF + exit $? + ;; + -v | --v*) + echo "depcomp $scriptversion" + exit $? + ;; +esac + +# Get the directory component of the given path, and save it in the +# global variables '$dir'. Note that this directory component will +# be either empty or ending with a '/' character. This is deliberate. +set_dir_from () +{ + case $1 in + */*) dir=`echo "$1" | sed -e 's|/[^/]*$|/|'`;; + *) dir=;; + esac +} + +# Get the suffix-stripped basename of the given path, and save it the +# global variable '$base'. 
+set_base_from () +{ + base=`echo "$1" | sed -e 's|^.*/||' -e 's/\.[^.]*$//'` +} + +# If no dependency file was actually created by the compiler invocation, +# we still have to create a dummy depfile, to avoid errors with the +# Makefile "include basename.Plo" scheme. +make_dummy_depfile () +{ + echo "#dummy" > "$depfile" +} + +# Factor out some common post-processing of the generated depfile. +# Requires the auxiliary global variable '$tmpdepfile' to be set. +aix_post_process_depfile () +{ + # If the compiler actually managed to produce a dependency file, + # post-process it. + if test -f "$tmpdepfile"; then + # Each line is of the form 'foo.o: dependency.h'. + # Do two passes, one to just change these to + # $object: dependency.h + # and one to simply output + # dependency.h: + # which is needed to avoid the deleted-header problem. + { sed -e "s,^.*\.[$lower]*:,$object:," < "$tmpdepfile" + sed -e "s,^.*\.[$lower]*:[$tab ]*,," -e 's,$,:,' < "$tmpdepfile" + } > "$depfile" + rm -f "$tmpdepfile" + else + make_dummy_depfile + fi +} + +# A tabulation character. +tab=' ' +# A newline character. +nl=' +' +# Character ranges might be problematic outside the C locale. +# These definitions help. +upper=ABCDEFGHIJKLMNOPQRSTUVWXYZ +lower=abcdefghijklmnopqrstuvwxyz +digits=0123456789 +alpha=${upper}${lower} + +if test -z "$depmode" || test -z "$source" || test -z "$object"; then + echo "depcomp: Variables source, object and depmode must be set" 1>&2 + exit 1 +fi + +# Dependencies for sub/bar.o or sub/bar.obj go into sub/.deps/bar.Po. +depfile=${depfile-`echo "$object" | + sed 's|[^\\/]*$|'${DEPDIR-.deps}'/&|;s|\.\([^.]*\)$|.P\1|;s|Pobj$|Po|'`} +tmpdepfile=${tmpdepfile-`echo "$depfile" | sed 's/\.\([^.]*\)$/.T\1/'`} + +rm -f "$tmpdepfile" + +# Avoid interferences from the environment. +gccflag= dashmflag= + +# Some modes work just like other modes, but use different flags. We +# parameterize here, but still list the modes in the big case below, +# to make depend.m4 easier to write. Note that we *cannot* use a case +# here, because this file can only contain one case statement. +if test "$depmode" = hp; then + # HP compiler uses -M and no extra arg. + gccflag=-M + depmode=gcc +fi + +if test "$depmode" = dashXmstdout; then + # This is just like dashmstdout with a different argument. + dashmflag=-xM + depmode=dashmstdout +fi + +cygpath_u="cygpath -u -f -" +if test "$depmode" = msvcmsys; then + # This is just like msvisualcpp but w/o cygpath translation. + # Just convert the backslash-escaped backslashes to single forward + # slashes to satisfy depend.m4 + cygpath_u='sed s,\\\\,/,g' + depmode=msvisualcpp +fi + +if test "$depmode" = msvc7msys; then + # This is just like msvc7 but w/o cygpath translation. + # Just convert the backslash-escaped backslashes to single forward + # slashes to satisfy depend.m4 + cygpath_u='sed s,\\\\,/,g' + depmode=msvc7 +fi + +if test "$depmode" = xlc; then + # IBM C/C++ Compilers xlc/xlC can output gcc-like dependency information. + gccflag=-qmakedep=gcc,-MF + depmode=gcc +fi + +case "$depmode" in +gcc3) +## gcc 3 implements dependency tracking that does exactly what +## we want. Yay! Note: for some reason libtool 1.4 doesn't like +## it if -MD -MP comes after the -MF stuff. Hmm. +## Unfortunately, FreeBSD c89 acceptance of flags depends upon +## the command line argument order; so add the flags where they +## appear in depend2.am. Note that the slowdown incurred here +## affects only configure: in makefiles, %FASTDEP% shortcuts this. 
+ for arg + do + case $arg in + -c) set fnord "$@" -MT "$object" -MD -MP -MF "$tmpdepfile" "$arg" ;; + *) set fnord "$@" "$arg" ;; + esac + shift # fnord + shift # $arg + done + "$@" + stat=$? + if test $stat -ne 0; then + rm -f "$tmpdepfile" + exit $stat + fi + mv "$tmpdepfile" "$depfile" + ;; + +gcc) +## Note that this doesn't just cater to obsosete pre-3.x GCC compilers. +## but also to in-use compilers like IMB xlc/xlC and the HP C compiler. +## (see the conditional assignment to $gccflag above). +## There are various ways to get dependency output from gcc. Here's +## why we pick this rather obscure method: +## - Don't want to use -MD because we'd like the dependencies to end +## up in a subdir. Having to rename by hand is ugly. +## (We might end up doing this anyway to support other compilers.) +## - The DEPENDENCIES_OUTPUT environment variable makes gcc act like +## -MM, not -M (despite what the docs say). Also, it might not be +## supported by the other compilers which use the 'gcc' depmode. +## - Using -M directly means running the compiler twice (even worse +## than renaming). + if test -z "$gccflag"; then + gccflag=-MD, + fi + "$@" -Wp,"$gccflag$tmpdepfile" + stat=$? + if test $stat -ne 0; then + rm -f "$tmpdepfile" + exit $stat + fi + rm -f "$depfile" + echo "$object : \\" > "$depfile" + # The second -e expression handles DOS-style file names with drive + # letters. + sed -e 's/^[^:]*: / /' \ + -e 's/^['$alpha']:\/[^:]*: / /' < "$tmpdepfile" >> "$depfile" +## This next piece of magic avoids the "deleted header file" problem. +## The problem is that when a header file which appears in a .P file +## is deleted, the dependency causes make to die (because there is +## typically no way to rebuild the header). We avoid this by adding +## dummy dependencies for each header file. Too bad gcc doesn't do +## this for us directly. +## Some versions of gcc put a space before the ':'. On the theory +## that the space means something, we add a space to the output as +## well. hp depmode also adds that space, but also prefixes the VPATH +## to the object. Take care to not repeat it in the output. +## Some versions of the HPUX 10.20 sed can't process this invocation +## correctly. Breaking it into two sed invocations is a workaround. + tr ' ' "$nl" < "$tmpdepfile" \ + | sed -e 's/^\\$//' -e '/^$/d' -e "s|.*$object$||" -e '/:$/d' \ + | sed -e 's/$/ :/' >> "$depfile" + rm -f "$tmpdepfile" + ;; + +hp) + # This case exists only to let depend.m4 do its work. It works by + # looking at the text of this script. This case will never be run, + # since it is checked for above. + exit 1 + ;; + +sgi) + if test "$libtool" = yes; then + "$@" "-Wp,-MDupdate,$tmpdepfile" + else + "$@" -MDupdate "$tmpdepfile" + fi + stat=$? + if test $stat -ne 0; then + rm -f "$tmpdepfile" + exit $stat + fi + rm -f "$depfile" + + if test -f "$tmpdepfile"; then # yes, the sourcefile depend on other files + echo "$object : \\" > "$depfile" + # Clip off the initial element (the dependent). Don't try to be + # clever and replace this with sed code, as IRIX sed won't handle + # lines with more than a fixed number of characters (4096 in + # IRIX 6.2 sed, 8192 in IRIX 6.5). We also remove comment lines; + # the IRIX cc adds comments like '#:fec' to the end of the + # dependency line. + tr ' ' "$nl" < "$tmpdepfile" \ + | sed -e 's/^.*\.o://' -e 's/#.*$//' -e '/^$/ d' \ + | tr "$nl" ' ' >> "$depfile" + echo >> "$depfile" + # The second pass generates a dummy entry for each header file. 
+ tr ' ' "$nl" < "$tmpdepfile" \ + | sed -e 's/^.*\.o://' -e 's/#.*$//' -e '/^$/ d' -e 's/$/:/' \ + >> "$depfile" + else + make_dummy_depfile + fi + rm -f "$tmpdepfile" + ;; + +xlc) + # This case exists only to let depend.m4 do its work. It works by + # looking at the text of this script. This case will never be run, + # since it is checked for above. + exit 1 + ;; + +aix) + # The C for AIX Compiler uses -M and outputs the dependencies + # in a .u file. In older versions, this file always lives in the + # current directory. Also, the AIX compiler puts '$object:' at the + # start of each line; $object doesn't have directory information. + # Version 6 uses the directory in both cases. + set_dir_from "$object" + set_base_from "$object" + if test "$libtool" = yes; then + tmpdepfile1=$dir$base.u + tmpdepfile2=$base.u + tmpdepfile3=$dir.libs/$base.u + "$@" -Wc,-M + else + tmpdepfile1=$dir$base.u + tmpdepfile2=$dir$base.u + tmpdepfile3=$dir$base.u + "$@" -M + fi + stat=$? + if test $stat -ne 0; then + rm -f "$tmpdepfile1" "$tmpdepfile2" "$tmpdepfile3" + exit $stat + fi + + for tmpdepfile in "$tmpdepfile1" "$tmpdepfile2" "$tmpdepfile3" + do + test -f "$tmpdepfile" && break + done + aix_post_process_depfile + ;; + +tcc) + # tcc (Tiny C Compiler) understand '-MD -MF file' since version 0.9.26 + # FIXME: That version still under development at the moment of writing. + # Make that this statement remains true also for stable, released + # versions. + # It will wrap lines (doesn't matter whether long or short) with a + # trailing '\', as in: + # + # foo.o : \ + # foo.c \ + # foo.h \ + # + # It will put a trailing '\' even on the last line, and will use leading + # spaces rather than leading tabs (at least since its commit 0394caf7 + # "Emit spaces for -MD"). + "$@" -MD -MF "$tmpdepfile" + stat=$? + if test $stat -ne 0; then + rm -f "$tmpdepfile" + exit $stat + fi + rm -f "$depfile" + # Each non-empty line is of the form 'foo.o : \' or ' dep.h \'. + # We have to change lines of the first kind to '$object: \'. + sed -e "s|.*:|$object :|" < "$tmpdepfile" > "$depfile" + # And for each line of the second kind, we have to emit a 'dep.h:' + # dummy dependency, to avoid the deleted-header problem. + sed -n -e 's|^ *\(.*\) *\\$|\1:|p' < "$tmpdepfile" >> "$depfile" + rm -f "$tmpdepfile" + ;; + +## The order of this option in the case statement is important, since the +## shell code in configure will try each of these formats in the order +## listed in this file. A plain '-MD' option would be understood by many +## compilers, so we must ensure this comes after the gcc and icc options. +pgcc) + # Portland's C compiler understands '-MD'. + # Will always output deps to 'file.d' where file is the root name of the + # source file under compilation, even if file resides in a subdirectory. + # The object file name does not affect the name of the '.d' file. + # pgcc 10.2 will output + # foo.o: sub/foo.c sub/foo.h + # and will wrap long lines using '\' : + # foo.o: sub/foo.c ... \ + # sub/foo.h ... \ + # ... + set_dir_from "$object" + # Use the source, not the object, to determine the base name, since + # that's sadly what pgcc will do too. + set_base_from "$source" + tmpdepfile=$base.d + + # For projects that build the same source file twice into different object + # files, the pgcc approach of using the *source* file root name can cause + # problems in parallel builds. Use a locking strategy to avoid stomping on + # the same $tmpdepfile. + lockdir=$base.d-lock + trap " + echo '$0: caught signal, cleaning up...' 
>&2 + rmdir '$lockdir' + exit 1 + " 1 2 13 15 + numtries=100 + i=$numtries + while test $i -gt 0; do + # mkdir is a portable test-and-set. + if mkdir "$lockdir" 2>/dev/null; then + # This process acquired the lock. + "$@" -MD + stat=$? + # Release the lock. + rmdir "$lockdir" + break + else + # If the lock is being held by a different process, wait + # until the winning process is done or we timeout. + while test -d "$lockdir" && test $i -gt 0; do + sleep 1 + i=`expr $i - 1` + done + fi + i=`expr $i - 1` + done + trap - 1 2 13 15 + if test $i -le 0; then + echo "$0: failed to acquire lock after $numtries attempts" >&2 + echo "$0: check lockdir '$lockdir'" >&2 + exit 1 + fi + + if test $stat -ne 0; then + rm -f "$tmpdepfile" + exit $stat + fi + rm -f "$depfile" + # Each line is of the form `foo.o: dependent.h', + # or `foo.o: dep1.h dep2.h \', or ` dep3.h dep4.h \'. + # Do two passes, one to just change these to + # `$object: dependent.h' and one to simply `dependent.h:'. + sed "s,^[^:]*:,$object :," < "$tmpdepfile" > "$depfile" + # Some versions of the HPUX 10.20 sed can't process this invocation + # correctly. Breaking it into two sed invocations is a workaround. + sed 's,^[^:]*: \(.*\)$,\1,;s/^\\$//;/^$/d;/:$/d' < "$tmpdepfile" \ + | sed -e 's/$/ :/' >> "$depfile" + rm -f "$tmpdepfile" + ;; + +hp2) + # The "hp" stanza above does not work with aCC (C++) and HP's ia64 + # compilers, which have integrated preprocessors. The correct option + # to use with these is +Maked; it writes dependencies to a file named + # 'foo.d', which lands next to the object file, wherever that + # happens to be. + # Much of this is similar to the tru64 case; see comments there. + set_dir_from "$object" + set_base_from "$object" + if test "$libtool" = yes; then + tmpdepfile1=$dir$base.d + tmpdepfile2=$dir.libs/$base.d + "$@" -Wc,+Maked + else + tmpdepfile1=$dir$base.d + tmpdepfile2=$dir$base.d + "$@" +Maked + fi + stat=$? + if test $stat -ne 0; then + rm -f "$tmpdepfile1" "$tmpdepfile2" + exit $stat + fi + + for tmpdepfile in "$tmpdepfile1" "$tmpdepfile2" + do + test -f "$tmpdepfile" && break + done + if test -f "$tmpdepfile"; then + sed -e "s,^.*\.[$lower]*:,$object:," "$tmpdepfile" > "$depfile" + # Add 'dependent.h:' lines. + sed -ne '2,${ + s/^ *// + s/ \\*$// + s/$/:/ + p + }' "$tmpdepfile" >> "$depfile" + else + make_dummy_depfile + fi + rm -f "$tmpdepfile" "$tmpdepfile2" + ;; + +tru64) + # The Tru64 compiler uses -MD to generate dependencies as a side + # effect. 'cc -MD -o foo.o ...' puts the dependencies into 'foo.o.d'. + # At least on Alpha/Redhat 6.1, Compaq CCC V6.2-504 seems to put + # dependencies in 'foo.d' instead, so we check for that too. + # Subdirectories are respected. + set_dir_from "$object" + set_base_from "$object" + + if test "$libtool" = yes; then + # Libtool generates 2 separate objects for the 2 libraries. These + # two compilations output dependencies in $dir.libs/$base.o.d and + # in $dir$base.o.d. We have to check for both files, because + # one of the two compilations can be disabled. We should prefer + # $dir$base.o.d over $dir.libs/$base.o.d because the latter is + # automatically cleaned when .libs/ is deleted, while ignoring + # the former would cause a distcleancheck panic. + tmpdepfile1=$dir$base.o.d # libtool 1.5 + tmpdepfile2=$dir.libs/$base.o.d # Likewise. + tmpdepfile3=$dir.libs/$base.d # Compaq CCC V6.2-504 + "$@" -Wc,-MD + else + tmpdepfile1=$dir$base.d + tmpdepfile2=$dir$base.d + tmpdepfile3=$dir$base.d + "$@" -MD + fi + + stat=$? 
+ if test $stat -ne 0; then + rm -f "$tmpdepfile1" "$tmpdepfile2" "$tmpdepfile3" + exit $stat + fi + + for tmpdepfile in "$tmpdepfile1" "$tmpdepfile2" "$tmpdepfile3" + do + test -f "$tmpdepfile" && break + done + # Same post-processing that is required for AIX mode. + aix_post_process_depfile + ;; + +msvc7) + if test "$libtool" = yes; then + showIncludes=-Wc,-showIncludes + else + showIncludes=-showIncludes + fi + "$@" $showIncludes > "$tmpdepfile" + stat=$? + grep -v '^Note: including file: ' "$tmpdepfile" + if test $stat -ne 0; then + rm -f "$tmpdepfile" + exit $stat + fi + rm -f "$depfile" + echo "$object : \\" > "$depfile" + # The first sed program below extracts the file names and escapes + # backslashes for cygpath. The second sed program outputs the file + # name when reading, but also accumulates all include files in the + # hold buffer in order to output them again at the end. This only + # works with sed implementations that can handle large buffers. + sed < "$tmpdepfile" -n ' +/^Note: including file: *\(.*\)/ { + s//\1/ + s/\\/\\\\/g + p +}' | $cygpath_u | sort -u | sed -n ' +s/ /\\ /g +s/\(.*\)/'"$tab"'\1 \\/p +s/.\(.*\) \\/\1:/ +H +$ { + s/.*/'"$tab"'/ + G + p +}' >> "$depfile" + echo >> "$depfile" # make sure the fragment doesn't end with a backslash + rm -f "$tmpdepfile" + ;; + +msvc7msys) + # This case exists only to let depend.m4 do its work. It works by + # looking at the text of this script. This case will never be run, + # since it is checked for above. + exit 1 + ;; + +#nosideeffect) + # This comment above is used by automake to tell side-effect + # dependency tracking mechanisms from slower ones. + +dashmstdout) + # Important note: in order to support this mode, a compiler *must* + # always write the preprocessed file to stdout, regardless of -o. + "$@" || exit $? + + # Remove the call to Libtool. + if test "$libtool" = yes; then + while test "X$1" != 'X--mode=compile'; do + shift + done + shift + fi + + # Remove '-o $object'. + IFS=" " + for arg + do + case $arg in + -o) + shift + ;; + $object) + shift + ;; + *) + set fnord "$@" "$arg" + shift # fnord + shift # $arg + ;; + esac + done + + test -z "$dashmflag" && dashmflag=-M + # Require at least two characters before searching for ':' + # in the target name. This is to cope with DOS-style filenames: + # a dependency such as 'c:/foo/bar' could be seen as target 'c' otherwise. + "$@" $dashmflag | + sed "s|^[$tab ]*[^:$tab ][^:][^:]*:[$tab ]*|$object: |" > "$tmpdepfile" + rm -f "$depfile" + cat < "$tmpdepfile" > "$depfile" + # Some versions of the HPUX 10.20 sed can't process this sed invocation + # correctly. Breaking it into two sed invocations is a workaround. + tr ' ' "$nl" < "$tmpdepfile" \ + | sed -e 's/^\\$//' -e '/^$/d' -e '/:$/d' \ + | sed -e 's/$/ :/' >> "$depfile" + rm -f "$tmpdepfile" + ;; + +dashXmstdout) + # This case only exists to satisfy depend.m4. It is never actually + # run, as this mode is specially recognized in the preamble. + exit 1 + ;; + +makedepend) + "$@" || exit $? + # Remove any Libtool call + if test "$libtool" = yes; then + while test "X$1" != 'X--mode=compile'; do + shift + done + shift + fi + # X makedepend + shift + cleared=no eat=no + for arg + do + case $cleared in + no) + set ""; shift + cleared=yes ;; + esac + if test $eat = yes; then + eat=no + continue + fi + case "$arg" in + -D*|-I*) + set fnord "$@" "$arg"; shift ;; + # Strip any option that makedepend may not understand. Remove + # the object too, otherwise makedepend will parse it as a source file. 
+ -arch) + eat=yes ;; + -*|$object) + ;; + *) + set fnord "$@" "$arg"; shift ;; + esac + done + obj_suffix=`echo "$object" | sed 's/^.*\././'` + touch "$tmpdepfile" + ${MAKEDEPEND-makedepend} -o"$obj_suffix" -f"$tmpdepfile" "$@" + rm -f "$depfile" + # makedepend may prepend the VPATH from the source file name to the object. + # No need to regex-escape $object, excess matching of '.' is harmless. + sed "s|^.*\($object *:\)|\1|" "$tmpdepfile" > "$depfile" + # Some versions of the HPUX 10.20 sed can't process the last invocation + # correctly. Breaking it into two sed invocations is a workaround. + sed '1,2d' "$tmpdepfile" \ + | tr ' ' "$nl" \ + | sed -e 's/^\\$//' -e '/^$/d' -e '/:$/d' \ + | sed -e 's/$/ :/' >> "$depfile" + rm -f "$tmpdepfile" "$tmpdepfile".bak + ;; + +cpp) + # Important note: in order to support this mode, a compiler *must* + # always write the preprocessed file to stdout. + "$@" || exit $? + + # Remove the call to Libtool. + if test "$libtool" = yes; then + while test "X$1" != 'X--mode=compile'; do + shift + done + shift + fi + + # Remove '-o $object'. + IFS=" " + for arg + do + case $arg in + -o) + shift + ;; + $object) + shift + ;; + *) + set fnord "$@" "$arg" + shift # fnord + shift # $arg + ;; + esac + done + + "$@" -E \ + | sed -n -e '/^# [0-9][0-9]* "\([^"]*\)".*/ s:: \1 \\:p' \ + -e '/^#line [0-9][0-9]* "\([^"]*\)".*/ s:: \1 \\:p' \ + | sed '$ s: \\$::' > "$tmpdepfile" + rm -f "$depfile" + echo "$object : \\" > "$depfile" + cat < "$tmpdepfile" >> "$depfile" + sed < "$tmpdepfile" '/^$/d;s/^ //;s/ \\$//;s/$/ :/' >> "$depfile" + rm -f "$tmpdepfile" + ;; + +msvisualcpp) + # Important note: in order to support this mode, a compiler *must* + # always write the preprocessed file to stdout. + "$@" || exit $? + + # Remove the call to Libtool. + if test "$libtool" = yes; then + while test "X$1" != 'X--mode=compile'; do + shift + done + shift + fi + + IFS=" " + for arg + do + case "$arg" in + -o) + shift + ;; + $object) + shift + ;; + "-Gm"|"/Gm"|"-Gi"|"/Gi"|"-ZI"|"/ZI") + set fnord "$@" + shift + shift + ;; + *) + set fnord "$@" "$arg" + shift + shift + ;; + esac + done + "$@" -E 2>/dev/null | + sed -n '/^#line [0-9][0-9]* "\([^"]*\)"/ s::\1:p' | $cygpath_u | sort -u > "$tmpdepfile" + rm -f "$depfile" + echo "$object : \\" > "$depfile" + sed < "$tmpdepfile" -n -e 's% %\\ %g' -e '/^\(.*\)$/ s::'"$tab"'\1 \\:p' >> "$depfile" + echo "$tab" >> "$depfile" + sed < "$tmpdepfile" -n -e 's% %\\ %g' -e '/^\(.*\)$/ s::\1\::p' >> "$depfile" + rm -f "$tmpdepfile" + ;; + +msvcmsys) + # This case exists only to let depend.m4 do its work. It works by + # looking at the text of this script. This case will never be run, + # since it is checked for above. 
+ exit 1 + ;; + +none) + exec "$@" + ;; + +*) + echo "Unknown depmode $depmode" 1>&2 + exit 1 + ;; +esac + +exit 0 + +# Local Variables: +# mode: shell-script +# sh-indentation: 2 +# eval: (add-hook 'write-file-hooks 'time-stamp) +# time-stamp-start: "scriptversion=" +# time-stamp-format: "%:y-%02m-%02d.%02H" +# time-stamp-time-zone: "UTC" +# time-stamp-end: "; # UTC" +# End: diff --git a/doc/Makefile.am b/doc/Makefile.am new file mode 100644 index 0000000..d8b8aaa --- /dev/null +++ b/doc/Makefile.am @@ -0,0 +1,52 @@ +EXTRA_DIST = _static _templates conf.py \ + index.rst introduction.rst installation.rst \ + configuration.rst sample_rulebase.rst internals.rst \ + contacts.rst changes.rst libraryapi.rst \ + lognormalizer.rst license.rst graph.png + +htmldir = $(docdir) +built_html = _build/html + +#html_DATA = $(built_html)/index.html + +# Makefile for Sphinx documentation +# + +# You can set these variables from the command line. +SPHINXOPTS = -n -W -c $(srcdir) +#SPHINXBUILD = sphinx-build +PAPER = +BUILDDIR = _build + +# Internal variables. +ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(SPHINXOPTS) $(srcdir) + +.PHONY: clean-local html-local man-local all-local dist-hook install-data-hook + +dist-hook: + find $(distdir)/ -name .gitignore | xargs rm -f + +clean-local: + -rm -rf $(BUILDDIR)/* + +html-local: + $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html + @echo + @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." + +man-local: + $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man + @echo + @echo "Build finished. The manual pages are in $(BUILDDIR)/man." + +all-local: html-local + +install-data-hook: + find $(built_html) -type f -printf "%P\n" | \ + while read file; do \ + echo " $(INSTALL_DATA) -D $(built_html)/$$file '$(DESTDIR)$(htmldir)/$$file'"; \ + $(INSTALL_DATA) -D $(built_html)/$$file "$(DESTDIR)$(htmldir)/$$file" || exit $$?; \ + done + +uninstall-local: + -rm -rf "$(DESTDIR)$(htmldir)" diff --git a/doc/Makefile.in b/doc/Makefile.in new file mode 100644 index 0000000..b840fff --- /dev/null +++ b/doc/Makefile.in @@ -0,0 +1,509 @@ +# Makefile.in generated by automake 1.15 from Makefile.am. +# @configure_input@ + +# Copyright (C) 1994-2014 Free Software Foundation, Inc. + +# This Makefile.in is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY, to the extent permitted by law; without +# even the implied warranty of MERCHANTABILITY or FITNESS FOR A +# PARTICULAR PURPOSE. + +@SET_MAKE@ +VPATH = @srcdir@ +am__is_gnu_make = { \ + if test -z '$(MAKELEVEL)'; then \ + false; \ + elif test -n '$(MAKE_HOST)'; then \ + true; \ + elif test -n '$(MAKE_VERSION)' && test -n '$(CURDIR)'; then \ + true; \ + else \ + false; \ + fi; \ +} +am__make_running_with_option = \ + case $${target_option-} in \ + ?) 
;; \ + *) echo "am__make_running_with_option: internal error: invalid" \ + "target option '$${target_option-}' specified" >&2; \ + exit 1;; \ + esac; \ + has_opt=no; \ + sane_makeflags=$$MAKEFLAGS; \ + if $(am__is_gnu_make); then \ + sane_makeflags=$$MFLAGS; \ + else \ + case $$MAKEFLAGS in \ + *\\[\ \ ]*) \ + bs=\\; \ + sane_makeflags=`printf '%s\n' "$$MAKEFLAGS" \ + | sed "s/$$bs$$bs[$$bs $$bs ]*//g"`;; \ + esac; \ + fi; \ + skip_next=no; \ + strip_trailopt () \ + { \ + flg=`printf '%s\n' "$$flg" | sed "s/$$1.*$$//"`; \ + }; \ + for flg in $$sane_makeflags; do \ + test $$skip_next = yes && { skip_next=no; continue; }; \ + case $$flg in \ + *=*|--*) continue;; \ + -*I) strip_trailopt 'I'; skip_next=yes;; \ + -*I?*) strip_trailopt 'I';; \ + -*O) strip_trailopt 'O'; skip_next=yes;; \ + -*O?*) strip_trailopt 'O';; \ + -*l) strip_trailopt 'l'; skip_next=yes;; \ + -*l?*) strip_trailopt 'l';; \ + -[dEDm]) skip_next=yes;; \ + -[JT]) skip_next=yes;; \ + esac; \ + case $$flg in \ + *$$target_option*) has_opt=yes; break;; \ + esac; \ + done; \ + test $$has_opt = yes +am__make_dryrun = (target_option=n; $(am__make_running_with_option)) +am__make_keepgoing = (target_option=k; $(am__make_running_with_option)) +pkgdatadir = $(datadir)/@PACKAGE@ +pkgincludedir = $(includedir)/@PACKAGE@ +pkglibdir = $(libdir)/@PACKAGE@ +pkglibexecdir = $(libexecdir)/@PACKAGE@ +am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd +install_sh_DATA = $(install_sh) -c -m 644 +install_sh_PROGRAM = $(install_sh) -c +install_sh_SCRIPT = $(install_sh) -c +INSTALL_HEADER = $(INSTALL_DATA) +transform = $(program_transform_name) +NORMAL_INSTALL = : +PRE_INSTALL = : +POST_INSTALL = : +NORMAL_UNINSTALL = : +PRE_UNINSTALL = : +POST_UNINSTALL = : +build_triplet = @build@ +host_triplet = @host@ +subdir = doc +ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 +am__aclocal_m4_deps = $(top_srcdir)/m4/libtool.m4 \ + $(top_srcdir)/m4/ltoptions.m4 $(top_srcdir)/m4/ltsugar.m4 \ + $(top_srcdir)/m4/ltversion.m4 $(top_srcdir)/m4/lt~obsolete.m4 \ + $(top_srcdir)/m4/pkg.m4 $(top_srcdir)/configure.ac +am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \ + $(ACLOCAL_M4) +DIST_COMMON = $(srcdir)/Makefile.am $(am__DIST_COMMON) +mkinstalldirs = $(install_sh) -d +CONFIG_HEADER = $(top_builddir)/config.h +CONFIG_CLEAN_FILES = +CONFIG_CLEAN_VPATH_FILES = +AM_V_P = $(am__v_P_@AM_V@) +am__v_P_ = $(am__v_P_@AM_DEFAULT_V@) +am__v_P_0 = false +am__v_P_1 = : +AM_V_GEN = $(am__v_GEN_@AM_V@) +am__v_GEN_ = $(am__v_GEN_@AM_DEFAULT_V@) +am__v_GEN_0 = @echo " GEN " $@; +am__v_GEN_1 = +AM_V_at = $(am__v_at_@AM_V@) +am__v_at_ = $(am__v_at_@AM_DEFAULT_V@) +am__v_at_0 = @ +am__v_at_1 = +SOURCES = +DIST_SOURCES = +am__can_run_installinfo = \ + case $$AM_UPDATE_INFO_DIR in \ + n|no|NO) false;; \ + *) (install-info --version) >/dev/null 2>&1;; \ + esac +am__tagged_files = $(HEADERS) $(SOURCES) $(TAGS_FILES) $(LISP) +am__DIST_COMMON = $(srcdir)/Makefile.in +DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST) +ACLOCAL = @ACLOCAL@ +AMTAR = @AMTAR@ +AM_DEFAULT_VERBOSITY = @AM_DEFAULT_VERBOSITY@ +AR = @AR@ +AUTOCONF = @AUTOCONF@ +AUTOHEADER = @AUTOHEADER@ +AUTOMAKE = @AUTOMAKE@ +AWK = @AWK@ +CC = @CC@ +CCDEPMODE = @CCDEPMODE@ +CFLAGS = @CFLAGS@ +CPP = @CPP@ +CPPFLAGS = @CPPFLAGS@ +CYGPATH_W = @CYGPATH_W@ +DEFS = @DEFS@ +DEPDIR = @DEPDIR@ +DLLTOOL = @DLLTOOL@ +DSYMUTIL = @DSYMUTIL@ +DUMPBIN = @DUMPBIN@ +ECHO_C = @ECHO_C@ +ECHO_N = @ECHO_N@ +ECHO_T = @ECHO_T@ +EGREP = @EGREP@ +EXEEXT = @EXEEXT@ +FEATURE_REGEXP = @FEATURE_REGEXP@ +FGREP = @FGREP@ +GREP 
= @GREP@ +INSTALL = @INSTALL@ +INSTALL_DATA = @INSTALL_DATA@ +INSTALL_PROGRAM = @INSTALL_PROGRAM@ +INSTALL_SCRIPT = @INSTALL_SCRIPT@ +INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@ +JSON_C_CFLAGS = @JSON_C_CFLAGS@ +JSON_C_LIBS = @JSON_C_LIBS@ +LD = @LD@ +LDFLAGS = @LDFLAGS@ +LIBESTR_CFLAGS = @LIBESTR_CFLAGS@ +LIBESTR_LIBS = @LIBESTR_LIBS@ +LIBLOGNORM_CFLAGS = @LIBLOGNORM_CFLAGS@ +LIBLOGNORM_LIBS = @LIBLOGNORM_LIBS@ +LIBOBJS = @LIBOBJS@ +LIBS = @LIBS@ +LIBTOOL = @LIBTOOL@ +LIPO = @LIPO@ +LN_S = @LN_S@ +LTLIBOBJS = @LTLIBOBJS@ +LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAKEINFO = @MAKEINFO@ +MANIFEST_TOOL = @MANIFEST_TOOL@ +MKDIR_P = @MKDIR_P@ +NM = @NM@ +NMEDIT = @NMEDIT@ +OBJDUMP = @OBJDUMP@ +OBJEXT = @OBJEXT@ +OTOOL = @OTOOL@ +OTOOL64 = @OTOOL64@ +PACKAGE = @PACKAGE@ +PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@ +PACKAGE_NAME = @PACKAGE_NAME@ +PACKAGE_STRING = @PACKAGE_STRING@ +PACKAGE_TARNAME = @PACKAGE_TARNAME@ +PACKAGE_URL = @PACKAGE_URL@ +PACKAGE_VERSION = @PACKAGE_VERSION@ +PATH_SEPARATOR = @PATH_SEPARATOR@ +PCRE_CFLAGS = @PCRE_CFLAGS@ +PCRE_LIBS = @PCRE_LIBS@ +PKG_CONFIG = @PKG_CONFIG@ +PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@ +PKG_CONFIG_PATH = @PKG_CONFIG_PATH@ +RANLIB = @RANLIB@ +SED = @SED@ +SET_MAKE = @SET_MAKE@ +SHELL = @SHELL@ +SPHINXBUILD = @SPHINXBUILD@ +STRIP = @STRIP@ +VALGRIND = @VALGRIND@ +VERSION = @VERSION@ +WARN_CFLAGS = @WARN_CFLAGS@ +WARN_LDFLAGS = @WARN_LDFLAGS@ +WARN_SCANNERFLAGS = @WARN_SCANNERFLAGS@ +abs_builddir = @abs_builddir@ +abs_srcdir = @abs_srcdir@ +abs_top_builddir = @abs_top_builddir@ +abs_top_srcdir = @abs_top_srcdir@ +ac_ct_AR = @ac_ct_AR@ +ac_ct_CC = @ac_ct_CC@ +ac_ct_DUMPBIN = @ac_ct_DUMPBIN@ +am__include = @am__include@ +am__leading_dot = @am__leading_dot@ +am__quote = @am__quote@ +am__tar = @am__tar@ +am__untar = @am__untar@ +bindir = @bindir@ +build = @build@ +build_alias = @build_alias@ +build_cpu = @build_cpu@ +build_os = @build_os@ +build_vendor = @build_vendor@ +builddir = @builddir@ +datadir = @datadir@ +datarootdir = @datarootdir@ +docdir = @docdir@ +dvidir = @dvidir@ +exec_prefix = @exec_prefix@ +host = @host@ +host_alias = @host_alias@ +host_cpu = @host_cpu@ +host_os = @host_os@ +host_vendor = @host_vendor@ +htmldir = $(docdir) +includedir = @includedir@ +infodir = @infodir@ +install_sh = @install_sh@ +libdir = @libdir@ +libexecdir = @libexecdir@ +localedir = @localedir@ +localstatedir = @localstatedir@ +mandir = @mandir@ +mkdir_p = @mkdir_p@ +oldincludedir = @oldincludedir@ +pdfdir = @pdfdir@ +pkg_config_libs_private = @pkg_config_libs_private@ +prefix = @prefix@ +program_transform_name = @program_transform_name@ +psdir = @psdir@ +runstatedir = @runstatedir@ +sbindir = @sbindir@ +sharedstatedir = @sharedstatedir@ +srcdir = @srcdir@ +sysconfdir = @sysconfdir@ +target_alias = @target_alias@ +top_build_prefix = @top_build_prefix@ +top_builddir = @top_builddir@ +top_srcdir = @top_srcdir@ +EXTRA_DIST = _static _templates conf.py \ + index.rst introduction.rst installation.rst \ + configuration.rst sample_rulebase.rst internals.rst \ + contacts.rst changes.rst libraryapi.rst \ + lognormalizer.rst license.rst graph.png + +built_html = _build/html + +#html_DATA = $(built_html)/index.html + +# Makefile for Sphinx documentation +# + +# You can set these variables from the command line. +SPHINXOPTS = -n -W -c $(srcdir) +#SPHINXBUILD = sphinx-build +PAPER = +BUILDDIR = _build + +# Internal variables. 
+ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(SPHINXOPTS) $(srcdir) +all: all-am + +.SUFFIXES: +$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) + @for dep in $?; do \ + case '$(am__configure_deps)' in \ + *$$dep*) \ + ( cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh ) \ + && { if test -f $@; then exit 0; else break; fi; }; \ + exit 1;; \ + esac; \ + done; \ + echo ' cd $(top_srcdir) && $(AUTOMAKE) --gnu doc/Makefile'; \ + $(am__cd) $(top_srcdir) && \ + $(AUTOMAKE) --gnu doc/Makefile +Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status + @case '$?' in \ + *config.status*) \ + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \ + *) \ + echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \ + cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \ + esac; + +$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh + +$(top_srcdir)/configure: $(am__configure_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(ACLOCAL_M4): $(am__aclocal_m4_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(am__aclocal_m4_deps): + +mostlyclean-libtool: + -rm -f *.lo + +clean-libtool: + -rm -rf .libs _libs +tags TAGS: + +ctags CTAGS: + +cscope cscopelist: + + +distdir: $(DISTFILES) + @srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + list='$(DISTFILES)'; \ + dist_files=`for file in $$list; do echo $$file; done | \ + sed -e "s|^$$srcdirstrip/||;t" \ + -e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \ + case $$dist_files in \ + */*) $(MKDIR_P) `echo "$$dist_files" | \ + sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \ + sort -u` ;; \ + esac; \ + for file in $$dist_files; do \ + if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \ + if test -d $$d/$$file; then \ + dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \ + if test -d "$(distdir)/$$file"; then \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \ + cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \ + else \ + test -f "$(distdir)/$$file" \ + || cp -p $$d/$$file "$(distdir)/$$file" \ + || exit 1; \ + fi; \ + done + $(MAKE) $(AM_MAKEFLAGS) \ + top_distdir="$(top_distdir)" distdir="$(distdir)" \ + dist-hook +check-am: all-am +check: check-am +all-am: Makefile all-local +installdirs: +install: install-am +install-exec: install-exec-am +install-data: install-data-am +uninstall: uninstall-am + +install-am: all-am + @$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am + +installcheck: installcheck-am +install-strip: + if test -z '$(STRIP)'; then \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + install; \ + else \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \ + fi +mostlyclean-generic: + +clean-generic: + +distclean-generic: + -test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES) + -test . 
= "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES) + +maintainer-clean-generic: + @echo "This command is intended for maintainers to use" + @echo "it deletes files that may require special tools to rebuild." +clean: clean-am + +clean-am: clean-generic clean-libtool clean-local mostlyclean-am + +distclean: distclean-am + -rm -f Makefile +distclean-am: clean-am distclean-generic + +dvi: dvi-am + +dvi-am: + +html: html-am + +html-am: html-local + +info: info-am + +info-am: + +install-data-am: + @$(NORMAL_INSTALL) + $(MAKE) $(AM_MAKEFLAGS) install-data-hook +install-dvi: install-dvi-am + +install-dvi-am: + +install-exec-am: + +install-html: install-html-am + +install-html-am: + +install-info: install-info-am + +install-info-am: + +install-man: + +install-pdf: install-pdf-am + +install-pdf-am: + +install-ps: install-ps-am + +install-ps-am: + +installcheck-am: + +maintainer-clean: maintainer-clean-am + -rm -f Makefile +maintainer-clean-am: distclean-am maintainer-clean-generic + +mostlyclean: mostlyclean-am + +mostlyclean-am: mostlyclean-generic mostlyclean-libtool + +pdf: pdf-am + +pdf-am: + +ps: ps-am + +ps-am: + +uninstall-am: uninstall-local + +.MAKE: install-am install-data-am install-strip + +.PHONY: all all-am all-local check check-am clean clean-generic \ + clean-libtool clean-local cscopelist-am ctags-am dist-hook \ + distclean distclean-generic distclean-libtool distdir dvi \ + dvi-am html html-am html-local info info-am install install-am \ + install-data install-data-am install-data-hook install-dvi \ + install-dvi-am install-exec install-exec-am install-html \ + install-html-am install-info install-info-am install-man \ + install-pdf install-pdf-am install-ps install-ps-am \ + install-strip installcheck installcheck-am installdirs \ + maintainer-clean maintainer-clean-generic mostlyclean \ + mostlyclean-generic mostlyclean-libtool pdf pdf-am ps ps-am \ + tags-am uninstall uninstall-am uninstall-local + +.PRECIOUS: Makefile + + +.PHONY: clean-local html-local man-local all-local dist-hook install-data-hook + +dist-hook: + find $(distdir)/ -name .gitignore | xargs rm -f + +clean-local: + -rm -rf $(BUILDDIR)/* + +html-local: + $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html + @echo + @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." + +man-local: + $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man + @echo + @echo "Build finished. The manual pages are in $(BUILDDIR)/man." + +all-local: html-local + +install-data-hook: + find $(built_html) -type f -printf "%P\n" | \ + while read file; do \ + echo " $(INSTALL_DATA) -D $(built_html)/$$file '$(DESTDIR)$(htmldir)/$$file'"; \ + $(INSTALL_DATA) -D $(built_html)/$$file "$(DESTDIR)$(htmldir)/$$file" || exit $$?; \ + done + +uninstall-local: + -rm -rf "$(DESTDIR)$(htmldir)" + +# Tell versions [3.59,3.63) of GNU make to not export all variables. +# Otherwise a system limit (for SysV at least) may be exceeded. +.NOEXPORT: diff --git a/doc/changes.rst b/doc/changes.rst new file mode 100644 index 0000000..fc8d143 --- /dev/null +++ b/doc/changes.rst @@ -0,0 +1,7 @@ +ChangeLog +========= + +See below for a list of changes. + +.. literalinclude:: ../ChangeLog + diff --git a/doc/conf.py b/doc/conf.py new file mode 100644 index 0000000..f706ef4 --- /dev/null +++ b/doc/conf.py @@ -0,0 +1,244 @@ +# -*- coding: utf-8 -*- +# +# Liblognorm documentation build configuration file, created by +# sphinx-quickstart on Mon Dec 16 13:12:44 2013. 
+# +# This file is execfile()d with the current directory set to its containing dir. +# +# Note that not all possible configuration values are present in this +# autogenerated file. +# +# All configuration values have a default; values that are commented out +# serve to show the default. + +import sys, os + +# If extensions (or modules to document with autodoc) are in another directory, +# add these directories to sys.path here. If the directory is relative to the +# documentation root, use os.path.abspath to make it absolute, like shown here. +#sys.path.insert(0, os.path.abspath('.')) + +# -- General configuration ----------------------------------------------------- + +# If your documentation needs a minimal Sphinx version, state it here. +#needs_sphinx = '1.0' + +# Add any Sphinx extension module names here, as strings. They can be extensions +# coming with Sphinx (named 'sphinx.ext.*') or your custom ones. +extensions = [] + +# Add any paths that contain templates here, relative to this directory. +templates_path = ['_templates'] + +# The suffix of source filenames. +source_suffix = '.rst' + +# The encoding of source files. +#source_encoding = 'utf-8-sig' + +# The master toctree document. +master_doc = 'index' + +# General information about the project. +project = u'Liblognorm' +copyright = u'Adiscon' + +# The version info for the project you're documenting, acts as replacement for +# |version| and |release|, also used in various other places throughout the +# built documents. +# +# The short X.Y version. +version = '1.1' +# The full version, including alpha/beta/rc tags. +release = '1.1.2' + +# The language for content autogenerated by Sphinx. Refer to documentation +# for a list of supported languages. +language = None + +# There are two options for replacing |today|: either, you set today to some +# non-false value, then it is used: +#today = '' +# Else, today_fmt is used as the format for a strftime call. +#today_fmt = '%B %d, %Y' + +# List of patterns, relative to source directory, that match files and +# directories to ignore when looking for source files. +exclude_patterns = ['_build'] + +# The reST default role (used for this markup: `text`) to use for all documents. +#default_role = None + +# If true, '()' will be appended to :func: etc. cross-reference text. +#add_function_parentheses = True + +# If true, the current module name will be prepended to all description +# unit titles (such as .. function::). +#add_module_names = True + +# If true, sectionauthor and moduleauthor directives will be shown in the +# output. They are ignored by default. +#show_authors = False + +# The name of the Pygments (syntax highlighting) style to use. +pygments_style = 'sphinx' + +# A list of ignored prefixes for module index sorting. +#modindex_common_prefix = [] + + +# -- Options for HTML output --------------------------------------------------- + +# The theme to use for HTML and HTML Help pages. See the documentation for +# a list of builtin themes. +html_theme = 'haiku' + +# Theme options are theme-specific and customize the look and feel of a theme +# further. For a list of options available for each theme, see the +# documentation. +#html_theme_options = {} + +# Add any paths that contain custom themes here, relative to this directory. +#html_theme_path = [] + +# The name for this set of Sphinx documents. If None, it defaults to +# " v documentation". +#html_title = None +html_title = "A fast log normalization library" + +# A shorter title for the navigation bar. 
Default is the same as html_title. +# html_short_title = None +html_short_title = project + " " + release + " documentation" + +# The name of an image file (relative to this directory) to place at the top +# of the sidebar. +#html_logo = None + +# The name of an image file (within the static path) to use as favicon of the +# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 +# pixels large. +#html_favicon = None + +# Add any paths that contain custom static files (such as style sheets) here, +# relative to this directory. They are copied after the builtin static files, +# so a file named "default.css" will overwrite the builtin "default.css". +html_static_path = ['_static'] + +# If not '', a 'Last updated on:' timestamp is inserted at every page bottom, +# using the given strftime format. +#html_last_updated_fmt = '%b %d, %Y' + +# If true, SmartyPants will be used to convert quotes and dashes to +# typographically correct entities. +#html_use_smartypants = True + +# Custom sidebar templates, maps document names to template names. +#html_sidebars = {} + +# Additional templates that should be rendered to pages, maps page names to +# template names. +#html_additional_pages = {} + +# If false, no module index is generated. +#html_domain_indices = True + +# If false, no index is generated. +html_use_index = False + +# If true, the index is split into individual pages for each letter. +#html_split_index = False + +# If true, links to the reST sources are added to the pages. +#html_show_sourcelink = True + +# If true, "Created using Sphinx" is shown in the HTML footer. Default is True. +#html_show_sphinx = True + +# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. +html_show_copyright = False + +# If true, an OpenSearch description file will be output, and all pages will +# contain a tag referring to it. The value of this option must be the +# base URL from which the finished HTML is served. +#html_use_opensearch = '' + +# This is the file name suffix for HTML files (e.g. ".xhtml"). +#html_file_suffix = None + +# Output file base name for HTML help builder. +htmlhelp_basename = 'Liblognormdoc' + + +# -- Options for LaTeX output -------------------------------------------------- + +latex_elements = { +# The paper size ('letterpaper' or 'a4paper'). +#'papersize': 'letterpaper', + +# The font size ('10pt', '11pt' or '12pt'). +#'pointsize': '10pt', + +# Additional stuff for the LaTeX preamble. +#'preamble': '', +} + +# Grouping the document tree into LaTeX files. List of tuples +# (source start file, target name, title, author, documentclass [howto/manual]). +latex_documents = [ + ('index', 'Liblognorm.tex', u'Liblognorm Documentation', + u'Pavel Levshin', 'manual'), +] + +# The name of an image file (relative to this directory) to place at the top of +# the title page. +#latex_logo = None + +# For "manual" documents, if this is true, then toplevel headings are parts, +# not chapters. +#latex_use_parts = False + +# If true, show page references after internal links. +#latex_show_pagerefs = False + +# If true, show URL addresses after external links. +#latex_show_urls = False + +# Documents to append as an appendix to all manuals. +#latex_appendices = [] + +# If false, no module index is generated. +#latex_domain_indices = True + + +# -- Options for manual page output -------------------------------------------- + +# One entry per manual page. List of tuples +# (source start file, name, description, authors, manual section). 
+man_pages = [
+    ('index', 'liblognorm', u'Liblognorm Documentation',
+     [u'Pavel Levshin'], 1)
+]
+
+# If true, show URL addresses after external links.
+#man_show_urls = False
+
+
+# -- Options for Texinfo output ------------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+#  dir menu entry, description, category)
+texinfo_documents = [
+  ('index', 'Liblognorm', u'Liblognorm Documentation',
+   u'Pavel Levshin', 'Liblognorm', 'Fast log normalization library.',
+   'Miscellaneous'),
+]
+
+# Documents to append as an appendix to all manuals.
+#texinfo_appendices = []
+
+# If false, no module index is generated.
+#texinfo_domain_indices = True
+
+# How to display URL addresses: 'footnote', 'no', or 'inline'.
+#texinfo_show_urls = 'footnote'
diff --git a/doc/configuration.rst b/doc/configuration.rst
new file mode 100644
index 0000000..93e054a
--- /dev/null
+++ b/doc/configuration.rst
@@ -0,0 +1,1238 @@
+How to configure
+================
+
+To use liblognorm, you need three things.
+
+1. An installed and working copy of liblognorm. The installation process
+   has been discussed in the chapter :doc:`installation`.
+2. Log files.
+3. A rulebase, which is the heart of the liblognorm configuration.
+
+Log files
+---------
+
+A log file is a text file that typically holds many lines. Each line is
+a log message. These are often hard to read, and thus hard to analyze,
+especially if you have many different devices that all create log
+messages in different formats.
+
+Rulebase
+--------
+
+The rulebase holds all the schemes for your logs. It basically consists of
+many lines that reflect the structure of your log messages. When the
+normalization process is started, a parse tree is generated from the
+rulebase and loaded into memory. It is then used to parse the log
+messages.
+
+Each line in the rulebase file is evaluated separately.
+
+Rulebase Versions
+-----------------
+This documentation is for liblognorm version 2 and above. Version 2 is a
+complete rewrite of liblognorm which offers many enhanced features but
+is incompatible with some pre-v2 rulebase commands. For details, see
+the compatibility document.
+
+Note that liblognorm v2 contains a full copy of the v1 engine. As such
+it is fully compatible with old rulebases. In order to use the new v2
+engine, you need to explicitly opt in. To do so, you need to add
+the line::
+
+    version=2
+
+to the top of your rulebase file. Currently, it is very important that
+
+ * the line is given exactly as above
+ * no whitespace within the sequence is permitted (e.g. "version = 2"
+   is invalid)
+ * no whitespace or comment after the "2" is permitted
+   (e.g. "version=2 # comment" is invalid)
+ * this line **must** be the **very** first line of the file; this
+   also means there **must** not be any comment or empty lines in
+   front of it
+
+The v2 engine is used only if the version indicator is properly
+detected. Otherwise, the v1 engine is used. So if you use v2 features but
+got the version line wrong, you'll end up with error messages from the
+v1 engine.
+
+The v2 engine understands almost all v1 parsers, and most importantly all
+that are typically used. It does not understand these parsers:
+
+ * tokenized
+ * recursive
+ * descent
+ * regex
+ * interpret
+ * suffixed
+ * named_suffixed
+
+The recursive and descent parsers should be replaced by user-defined
+types. The tokenized parsers should be replaced by repeat.
+The interpret functionality
+is provided via the parser's "format" parameters. For the others,
+there currently exists no replacement; with the exception of regex,
+replacements will be added based on demand. If you think regex support
+is urgently needed, please read our
+`related issue on github, `_
+where you can also cast
+your ballot in favor of it. If you need any of these parsers, you need
+to use the v1 engine. That of course means you cannot use the v2
+enhancements, so converting as much as possible makes sense.
+
+Commentaries
+------------
+
+To keep your rulebase tidy, you can use commentaries. Start a commentary
+with "#", as in many other configuration formats. It should look like this::
+
+    # The following prefix and rules are for firewall logs
+
+Note that the comment character MUST be in the first column of the line.
+
+Empty lines are simply skipped; they can be inserted for readability.
+
+User-Defined Types
+------------------
+
+If the line starts with ``type=``, then it contains a user-defined type.
+You can use a user-defined type wherever you use a built-in type; they
+are equivalent. That also means you can use user-defined types in the
+definition of other user-defined types (they can be used recursively).
+The only restriction is that you must define a type **before** you can
+use it.
+
+This line has the following format::
+
+    type=<typename>:<match description>
+
+Everything before the colon is treated as the type name. User-defined types
+must always start with "@". So "@mytype" is a valid name, whereas "mytype"
+is invalid and will lead to an error.
+
+After the colon, a match description should be
+given. It is exactly the same as the one given in rule lines (see below).
+
+A generic IP address type could look as follows::
+
+    type=@IPaddr:%ip:ipv4%
+    type=@IPaddr:%ip:ipv6%
+
+This creates a type "@IPaddr", which consists of either an IPv4 or IPv6
+address. Note how we use two different lines to create an alternative
+representation. This is how things generally work with types: you can use
+as many "type" lines for a single type as you need to define your object.
+Note that pure alternatives could also be defined via the "alternative"
+parser - which option to choose is left to the user. They are equivalent.
+The ability to use multiple type lines for definition, however, brings
+more power than just defining alternatives.
+
+Includes
+--------
+Especially with user-defined types, includes come in handy. With an include,
+you can include definitions already made elsewhere into the current
+rule set (just like the "include" directive works in many programming
+languages). An include is done by a line starting with ``include=``,
+where the rest of the line is the actual file name, just like in this
+example::
+
+    include=/var/lib/liblognorm/stdtypes.rb
+
+The definition is included right at the position where it occurs.
+Processing of the original file is continued when the included file
+has been fully processed. Includes can be nested.
+
+To facilitate repositories of common rules, liblognorm honors the
+
+::
+
+    LIBLOGNORM_RULEBASES
+
+environment variable.
+If it is set, liblognorm tries to locate the file
+inside the path pointed to by ``LIBLOGNORM_RULEBASES`` in the following
+case:
+
+* the provided file cannot be found
+* the provided file name is not an absolute path (does not start with "/")
+
+So assuming we have::
+
+    export LIBLOGNORM_RULEBASES=/var/lib/liblognorm
+
+The above example can be re-written as follows::
+
+    include=stdtypes.rb
+
+Note, however, that if ``stdtypes.rb`` exists in the current working
+directory, that file will be loaded instead of the one from
+``/var/lib/liblognorm``.
+
+This facilitates building a library of standard type definitions. Note
+that the liblognorm project also ships type definitions for common
+scenarios.
+
+Rules
+-----
+
+If the line starts with ``rule=``, then it contains a rule. This line has
+the following format::
+
+    rule=[<tag1>[,<tag2>...]]:<match description>
+
+Everything before the colon is treated as a comma-separated list of tags,
+which will be attached to a match. After the colon, the match description
+should be given. It consists of string literals and field selectors. String
+literals should match exactly, whereas field selectors may match variable
+parts of a message.
+
+A rule could look like this (in legacy format)::
+
+    rule=:%date:date-rfc3164% %host:word% %tag:char-to:\x3a%: no longer listening on %ip:ipv4%#%port:number%'
+
+This excerpt is a common rule. A rule always contains several different
+"parts"/properties and reflects the structure of the message you want to
+normalize (e.g. Host, IP, Source, Syslogtag...).
+
+
+Literals
+--------
+
+A literal is just a sequence of characters, which must match exactly.
+Percent sign characters must be escaped to prevent them from starting a
+field accidentally. Replace each "%" with "\\x25" or "%%" when it occurs
+in a string literal.
+
+Fields
+------
+
+There are different formats for field specification:
+
+ * legacy format
+ * condensed format
+ * full JSON format
+
+Legacy Format
+#############
+Legacy format is exactly identical to that of the v1 engine. This permits
+you to use existing v1 rulebases with the v2 engine without any
+modification, except for adding the ``version=2`` header line to the top of
+the file. Remember: some v1 types are not supported - if you are among the
+few who use them, you need to do some manual conversion. For almost all
+users, manual conversion should not be necessary.
+
+Legacy format is not documented here. If you want to use it, see the v1
+documentation.
+
+Condensed Format
+################
+The goal of this format is to be as brief as possible, permitting you an
+as-clear-as-possible view of your rule. It is very similar to legacy format
+and is recommended for simple types which do not need any parser
+parameters.
+
+Its structure is as follows::
+
+    %<field name>:<field type>[{<parameters>}]%
+
+**field name** -> this name can be selected freely. It should be a
+description of what kind of information the field is holding, e.g. SRC if
+the field contains the source IP address of the message. These names should
+also be chosen carefully, since the field name can be used in every rule
+and should therefore fit the same kind of information in different rules.
+
+Some special field names exist:
+
+* **dash** ("-"): this field is matched but not saved
+* **dot** ("."): this is useful if a parser returns a set of fields. Usually,
+  it does so by creating a json subtree. If the field is named ".", then
+  no subtree is created but instead the subfields are moved into the main
+  hierarchy.
+* **two dots** (".."): similar to ".", but can be used at the lower level to denote
+  that a field is to be included with the name given by the upper-level
+  object. Note that ".." is only acted on if a subelement contains a single
+  field. The reason is that if there were more, we could not assign all of
+  them to the *single* name given by the upper-level object. The prime
+  use case for this special name is in user-defined types that parse only
+  a single value. Without "..", they would always become a JSON subtree, which
+  seems unnatural and is different from built-in types. So it is suggested to
+  name such fields as "..", which means that the user can assign a name of his
+  liking, just like in the case of built-in parsers.
+
+**field type** -> selects the corresponding parser; the available parsers
+are described below.
+
+Special characters that need to be escaped when used inside a field
+description are "%" and ":". It is strongly recommended **not** to use them.
+
+**parameters** -> This is an optional set of parameters, given in pure JSON
+format. Parameters can be generic (e.g. "priority") or specific to a
+parser (e.g. "extradata"). Generic parameters are described below in their
+own section, parser-specific ones in the relevant type documentation.
+
+As an example, the "char-to" parser accepts a parameter named "extradata"
+which describes up to which character it shall match (the name "extradata"
+stems back to the legacy v1 system)::
+
+    %tag:char-to{"extradata":":"}%
+
+Whitespace, including LF, is permitted inside a field definition after
+the opening percent sign and before the closing one. This can be used to
+make complex rules more readable. So the example rule from the overview
+section above could be rewritten as::
+
+    rule=:%
+          date:date-rfc3164
+          % %
+          host:word
+          % %
+          tag:char-to{"extradata":":"}
+          %: no longer listening on %
+          ip:ipv4
+          %#%
+          port:number
+          %'
+
+When doing this, note well that whitespace IS important inside the
+literal text. So e.g. in the second example line above, "% %" requires
+a single SP as literal text. Note that any combination of your liking is
+valid, so it could also be written as::
+
+    rule=:%date:date-rfc3164% %host:word% % tag:char-to{"extradata":":"}
+          %: no longer listening on % ip:ipv4 %#% port:number %'
+
+To prevent a typical user error, continuation lines are **not** permitted
+to start with ``rule=``. There are some obscure cases where this could
+be a valid rule, and it can be re-formatted in that case. More often, this
+is the result of a missing percent sign, as in this sample::
+
+    rule=:test%field:word ... missing percent sign ...
+    rule=:%f:word%
+
+If we permitted ``rule=`` at the start of a continuation line, these kinds
+of problems would be very hard to detect.
+
+Full JSON Format
+################
+This format is best for complex definitions or if there are many parser
+parameters.
+
+Its structure is as follows::
+
+    %JSON%
+
+Where JSON is the configuration expressed in JSON.
+To get you started, let's
+rewrite the above sample in pure JSON form::
+
+   rule=:%[ {"type":"date-rfc3164", "name":"date"},
+            {"type":"literal", "text":" "},
+            {"type":"char-to", "name":"host", "extradata":":"},
+            {"type":"literal", "text":": no longer listening on "},
+            {"type":"ipv4", "name":"ip"},
+            {"type":"literal", "text":"#"},
+            {"type":"number", "name":"port"}
+          ]%
+
+A couple of things to note:
+
+ * we express everything in this example in a *single* parser definition
+ * this is done by using a **JSON array**; whenever an array is used,
+   multiple parsers can be specified. They are executed one after the
+   other in the given order.
+ * literal text is matched here via an explicit parser call; as described
+   below, this is recommended only for specific use cases with the
+   current version of liblognorm
+ * parser parameters (both generic and parser-specific ones) are given
+   on the main JSON level
+ * the literal text shall not be stored inside an output variable; for
+   this reason no name attribute is given (we could also have used
+   ``"name":"-"``, which achieves the same effect but is more verbose).
+
+With the literal parser calls replaced by actual literals, the sample
+looks like this::
+
+   rule=:%{"type":"date-rfc3164", "name":"date"}
+         % %
+         {"type":"char-to", "name":"host", "extradata":":"}
+         % no longer listening on %
+         {"type":"ipv4", "name":"ip"}
+         %#%
+         {"type":"number", "name":"port"}
+         %
+
+Which format you use, and exactly how you use it, is up to you.
+
+Some guidelines:
+
+ * using the "literal" parser in JSON should currently be avoided; the
+   experimental version does have some rough edges where conflicts
+   in literal processing will not be properly handled. This should not
+   be an issue in "closed environments", like "repeat", where no such
+   conflict can occur.
+ * otherwise, JSON is perfect for very complex things (like nesting of
+   parsers) - it is **not** suggested to use any other format for these
+   kinds of things.
+ * if a field needs to be matched but the result of that match is not
+   needed, omit the "name" attribute; specifically avoid using
+   the more verbose ``"name":"-"``.
+ * it is a good idea to start each definition with ``"type":"..."``,
+   as this provides a quick overview of what is being defined.
+
+Mandatory Parameters
+....................
+
+type
+~~~~
+The field type; it selects the parser to use. See "Field types" below for
+a description.
+
+Optional Generic Parameters
+...........................
+
+name
+~~~~
+The field name to use. If "-" is used, the field is matched but not stored.
+In this case, you can simply **not** specify a field name, which is the
+preferred way of doing this.
+
+priority
+~~~~~~~~
+The priority to assign to this parser. Priorities are numerical values in the
+range from 0 (highest) to 65535 (lowest). If multiple parsers could match at
+a given character position of a log line, parsers are tried in priority order.
+Different priorities can lead to different parsing. For example, if the
+greedy "rest" type is assigned priority 0, and no other parser is assigned the
+same priority, no other parser will ever match (because "rest" is very greedy
+and always matches the rest of the message).
+
+Note that liblognorm internally
+has a parser-specific priority, which is selected by the program developer
+based on how specific a type is. If the user assigns equal priorities,
+parsers are executed based on the parser-specific priority.
+
+The default priority value is 30,000.
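+
+To illustrate the mechanism, here is a small, purely illustrative sketch
+(these rules are not part of any shipped rulebase): two rules share the
+same literal prefix, so both the "ipv4" parser and the greedy "rest"
+parser could match at the same character position. Demoting "rest" to the
+lowest priority ensures the more specific parser is tried first::
+
+   rule=:login from %ip:ipv4%
+   rule=:login from %text:rest{"priority":65535}%
+
+With this setup, the second rule only matches lines where no IPv4 address
+follows "login from ". Note that "rest" already carries the lowest
+parser-specific priority, so the explicit parameter mainly serves to make
+the intent visible.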
+
+Field types
+-----------
+We have legacy and regular field types. Pre-v2, we did not have user-defined
+types. As such, there was a relatively large number of parsers that handled
+very similar cases, for example for strings. These parsers still work and may
+even provide the best performance in extreme cases. In v2, we focus on fewer,
+but more generic parsers, which are then tailored via parameters.
+
+There is nothing bad about using legacy parsers and there is no
+plan to phase them out at any time in the future. We just wanted to
+let you know, especially if you wonder about some "weird" parsers.
+In v1, parsers could have only a single parameter, which was called
+"extradata" at that time. This is why some of the legacy parsers
+require or support a parameter named "extradata" and do not use a
+better name for it (internally, the legacy format creates a
+v2 parser definition with "extradata" being populated from the
+legacy "extradata" part of the configuration).
+
+number
+######
+
+One or more decimal digits.
+
+Parameters
+..........
+
+format
+~~~~~~
+
+Specifies the format of the json object. Possible values are "string" and
+"number", with string being the default. If "number" is used, the json
+object will be a native json integer.
+
+maxval
+~~~~~~
+
+Maximum value permitted for this number. If the value is higher than this,
+it will not be detected by this parser definition and an alternate detection
+path will be pursued.
+
+float
+#####
+
+A floating-point number represented in non-scientific form.
+
+Parameters
+..........
+
+format
+~~~~~~
+
+Specifies the format of the json object. Possible values are "string" and
+"number", with string being the default. If "number" is used, the json
+object will be a native json floating point number. Note that we try to
+preserve the original string serialization format, but keep in mind
+that floating point numbers are inherently imprecise, so slight variations
+may occur when processing them.
+
+
+hexnumber
+#########
+
+A hexadecimal number as seen by this parser begins with the string
+"0x", is followed by 1 or more hex digits and is terminated by white
+space. Any interleaving non-hex digits will cause non-detection. The
+rules are strict to avoid false positives.
+
+Parameters
+..........
+
+format
+~~~~~~
+
+Specifies the format of the json object. Possible values are "string" and
+"number", with string being the default. If "number" is used, the json
+object will be a native json integer. Note that json numbers are always
+decimal, so if "number" is selected, the hex number will be converted
+to decimal. The original hex string is no longer available in this case.
+
+maxval
+~~~~~~
+
+Maximum value permitted for this number. If the value is higher than this,
+it will not be detected by this parser definition and an alternate detection
+path will be pursued. This is most useful if fixed-size hex numbers need to
+be processed. For example, for byte values "maxval" could be set to 255,
+which ensures that invalid values are not misdetected.
+
+
+kernel-timestamp
+################
+
+Parses a Linux kernel timestamp, which has the format::
+
+    [ddddd.dddddd]
+
+where "d" is a decimal digit. The part before the period has to
+have at least 5 digits, as per kernel code. There is no upper
+limit per se inside the kernel, but liblognorm does not accept
+more than 12 digits, which seems more than sufficient (we may reduce
+the max count if misdetections occur). The part after the period
+has to have exactly 6 digits.
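+
+As a usage sketch (the field names "boottime" and "msg" are chosen freely
+for illustration and are not prescribed anywhere), a rule that extracts the
+timestamp from a typical kernel log line could look like this::
+
+   rule=:%boottime:kernel-timestamp% %msg:rest%
+
+which would match lines such as::
+
+   [12345.678901] usb 1-1: new high-speed USB device number 2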
+ + +whitespace +########## + +This parses all whitespace until the first non-whitespace character +is found. This check is performed using the ``isspace()`` C library +function to check for space, horizontal tab, newline, vertical tab, +feed and carriage return characters. + +This parser is primarily a tool to skip to the next "word" if +the exact number of whitspace characters (and type of whitespace) +is not known. The current parsing position MUST be on a whitspace, +else the parser does not match. + +Remeber that to just parse but not preserve the field contents, the +dash ("-") is used as field name in compact format or the "name" +parameter is simply omitted in JSON format. This is almost always +expected with the *whitespace* type. + +string +###### + +This is a highly customizable parser that can be used to extract +many types of strings. It is meant to be used for most cases. It +is suggested that specific string types are created as user-defined +types using this parser. + +This parser supports: + +* various quoting modes for strings +* escape character processing + +Parameters +.......... + +quoting.mode +~~~~~~~~~~~~ +Specifies how the string is quoted. Possible modes: + +* **none** - no quoting is permitted +* **required** - quotes must be present +* **auto** - quotes are permitted, but not required + +Default is ``auto``. + +quoting.escape.mode +~~~~~~~~~~~~~~~~~~~ + +Specifies how quote character escaping is handled. Possible modes: + +* **none** - there are no escapes, quote characters are *not* permitted in value +* **double** - the ending quote character is duplicated to indicate + a single quote without termination of the value (e.g. ``""``) +* **backslash** - a backslash is prepended to the quote character (e.g ``\"``) +* **both** - both double and backslash escaping can happen and are supported + +Note that turning on ``backslash`` mode (or ``both``) has the side-effect that +backslash escaping is enabled in general. This usually is what you want +if this option is selected (e.g. otherwise you could no longer represent +backslash). + +quoting.char.begin +~~~~~~~~~~~~~~~~~~ + +Sets the begin quote character. + +Default is ". + +quoting.char.end +~~~~~~~~~~~~~~~~ + +Sets the end quote character. + +Default is ". + +Note that setting the begin and end quote character permits you to +support more quoting modes. For example, brackets and braces are +used by some software for quoting. To handle such string, you can for +example use a configuration like this:: + + rule=:a %f:string{"quoting.char.begin":"[", "quoting.char.end":"]"}% b + +which matches strings like this:: + + a [test test2] b + +matching.permitted +~~~~~~~~~~~~~~~~~~ + +This allows to specify a set of characters permitted in the to-be-parsed +field. It is primarily a utility to extract things like programming-language +like names (e.g. consisting of letters, digits and a set of special characters +only), alphanumeric or alphabetic strings. + +If this parameter is not specified, all characters are permitted. If it +is specified, only the configured characters are permitted. + +Note that this option reliably only works on US-ASCII data. Multi-byte +character encodings may lead to strange results. + +There are two ways to specify permitted characters. The simple one is to +specify them directly for the parameter:: + + rule=:%f:string{"matching.permitted":"abc"}% + +This only supports literal characters and all must be given as a single +parameter. 
For more advanced use cases, an array of permitted characters +can be provided:: + + rule=:%f:string{"matching.permitted":[ + {"class":"digit"}, + {"chars":"xX"} + ]}% + +Here, ``class`` is a specify for the usual character classes, with +support for: + +* digit +* hexdigit +* alpha +* alnum + +In contrast, ``chars`` permits to specify literal characters. Both +``class`` as well as ``chars`` may be specified multiple times inside +the array. For example, the ``alnum`` class could also be permitted as +follows:: + + rule=:%f:string{"matching.permitted":[ + {"class":"digit"}, + {"class":"alpha"} + ]}% + +word +#### + +One or more characters, up to the next space (\\x20), or +up to end of line. + +string-to +######### + +One or more characters, up to the next string given in +"extradata". + +alpha +##### + +One or more alphabetic characters, up to the next whitspace, punctuation, +decimal digit or control character. + +char-to +####### + +One or more characters, up to the next character(s) given in +extradata. + +Parameters +.......... + +extradata +~~~~~~~~~ + +This is a mandatory parameter. It contains one or more characters, each of +which terminates the match. + + +char-sep +######## + +Zero or more characters, up to the next character(s) given in extradata. + +Parameters +.......... + +extradata +~~~~~~~~~~ + +This is a mandatory parameter. It contains one or more characters, each of +which terminates the match. + +rest +#### + +Zero or more characters untill end of line. Must always be at end of the +rule, even though this condition is currently **not** checked. In any case, +any definitions after *rest* are ignored. + +Note that the *rest* syntax should be avoided because it generates +a very broad match. If it needs to be used, the user shall assign it +the lowest priority among his parser definitions. Note that the +parser-sepcific priority is also lowest, so by default it will only +match if nothing else matches. + +quoted-string +############# + +Zero or more characters, surrounded by double quote marks. +Quote marks are stripped from the match. + +op-quoted-string +################ + +Zero or more characters, possibly surrounded by double quote marks. +If the first character is a quote mark, operates like quoted-string. Otherwise, operates like "word" +Quote marks are stripped from the match. + +date-iso +######## +Date in ISO format ('YYYY-MM-DD'). + +time-24hr +######### + +Time of format 'HH:MM:SS', where HH is 00..23. + +time-12hr +######### + +Time of format 'HH:MM:SS', where HH is 00..12. + +duration +######## + +A duration is similar to a timestamp, except that +it tells about time elapsed. As such, hours can be larger than 23 +and hours may also be specified by a single digit (this, for example, +is commonly done in Cisco software). + +Examples for durations are "12:05:01", "0:00:01" and "37:59:59" but not +"00:60:00" (HH and MM must still be within the usual range for +minutes and seconds). + + +date-rfc3164 +############ + +Valid date/time in RFC3164 format, i.e.: 'Oct 29 09:47:08'. +This parser implements several quirks to match malformed +timestamps from some devices. + +Parameters +.......... + +format +~~~~~~ + +Specifies the format of the json object. Possible values are + +- **string** - string representation as given in input data +- **timestamp-unix** - string converted to an unix timestamp (seconds since epoch) +- **timestamp-unix-ms** - a kind of unix-timestamp, but with millisecond resolution. + This format is understood for example by ElasticSearch. 
Note that RFC3164 does **not** + contain subsecond resolution, so this option makes no sense for RFC3164-data only. + It is usefull, howerver, if processing mixed sources, some of which contain higher + precision. + + +date-rfc5424 +############ + +Valid date/time in RFC5424 format, i.e.: +'1985-04-12T19:20:50.52-04:00'. +Slightly different formats are allowed. + +Parameters +.......... + +format +~~~~~~ + +Specifies the format of the json object. Possible values are + +- **string** - string representation as given in input data +- **timestamp-unix** - string converted to an unix timestamp (seconds since epoch). + If subsecond resolution is given in the original timestamp, it is lost. +- **timestamp-unix-ms** - a kind of unix-timestamp, but with millisecond resolution. + This format is understood for example by ElasticSearch. Note that a RFC5424 + timestamp can contain higher than ms resolution. If so, the timestamp is + truncated to millisecond resolution. + + + +ipv4 +#### + +IPv4 address, in dot-decimal notation (AAA.BBB.CCC.DDD). + +ipv6 +#### + +IPv6 address, in textual notation as specified in RFC4291. +All formats specified in section 2.2 are supported, including +embedded IPv4 address (e.g. "::13.1.68.3"). Note that a +**pure** IPv4 address ("13.1.68.3") is **not** valid and as +such not recognized. + +To avoid false positives, there must be either a whitespace +character after the IPv6 address or the end of string must be +reached. + +mac48 +##### + +The standard (IEEE 802) format for printing MAC-48 addresses in +human-friendly form is six groups of two hexadecimal digits, +separated by hyphens (-) or colons (:), in transmission order +(e.g. 01-23-45-67-89-ab or 01:23:45:67:89:ab ). +This form is also commonly used for EUI-64. +from: http://en.wikipedia.org/wiki/MAC_address + +cef +### + +This parses ArcSight Comment Event Format (CEF) as described in +the "Implementing ArcSight CEF" manual revision 20 (2013-06-15). + +It matches a format that closely follows the spec. The header fields +are extracted into the field name container, all extension are +extracted into a container called "Extensions" beneath it. + +Example +....... + +Rule (compact format):: + + rule=:%f:cef' + +Data:: + + CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a value cc=field 3 + +Result:: + + { + "f": { + "DeviceVendor": "Vendor", + "DeviceProduct": "Product", + "DeviceVersion": "Version", + "SignatureID": "Signature ID", + "Name": "some name", + "Severity": "Severity", + "Extensions": { + "aa": "field1", + "bb": "this is a value", + "cc": "field 3" + } + } + } + +checkpoint-lea +############## + +This supports the LEA on-disk format. Unfortunately, the format +is underdocumented, the Checkpoint docs we could get hold of just +describe the API and provide a field dictionary. In a nutshell, what +we do is extract field names up to the colon and values up to the +semicolon. No escaping rules are known to us, so we assume none +exists (and as such no semicolon can be part of a value). + +If someone has a definitive reference or a sample set to contribute +to the project, please let us know and we will check if we need to +add additional transformations. + + +cisco-interface-spec +#################### + +A Cisco interface specifier, as for example seen in PIX or ASA. 
+The format contains a number of optional parts and is described +as follows (in ABNF-like manner where square brackets indicate +optional parts): + +:: + + [interface:]ip/port [SP (ip2/port2)] [[SP](username)] + +Samples for such a spec are: + + * outside:192.168.52.102/50349 + * inside:192.168.1.15/56543 (192.168.1.112/54543) + * outside:192.168.1.13/50179 (192.168.1.13/50179)(LOCAL\some.user) + * outside:192.168.1.25/41850(LOCAL\RG-867G8-DEL88D879BBFFC8) + * inside:192.168.1.25/53 (192.168.1.25/53) (some.user) + * 192.168.1.15/0(LOCAL\RG-867G8-DEL88D879BBFFC8) + +Note that the current verision of liblognorm does not permit sole +IP addresses to be detected as a Cisco interface spec. However, we +are reviewing more Cisco message and need to decide if this is +to be supported. The problem here is that this would create a much +broader parser which would potentially match many things that are +**not** Cisco interface specs. + +As this object extracts multiple subelements, it create a JSON +structure. + +Let's for example look at this definiton (compact format):: + + %ifaddr:cisco-interface-spec% + +and assume the following message is to be parsed:: + + outside:192.168.1.13/50179 (192.168.1.13/50179) (LOCAL\some.user) + +Then the resulting JSON will be as follows:: + +{ "ifaddr": { "interface": "outside", "ip": "192.168.1.13", "port": "50179", "ip2": "192.168.1.13", "port2": "50179", "user": "LOCAL\\some.user" } } + +Subcomponents that are not given in the to-be-normalized string are +also not present in the resulting JSON. + +iptables +######## + +Name=value pairs, separated by spaces, as in Netfilter log messages. +Name of the selector is not used; names from the line are +used instead. This selector always matches everything till +end of the line. Cannot match zero characters. + +cisco-interface-spec +#################### + +This is an experimental parser. It is used to detect Cisco Interface +Specifications. A sample of them is: + +:: + + outside:176.97.252.102/50349 + +Note that this parser does not yet extract the individual parts +due to the restrictions in current liblognorm. This is planned for +after a general algorithm overhaul. + +In order to match, this syntax must start on a non-whitespace char +other than colon. + +json +#### +This parses native JSON from the message. All data up to the first non-JSON +is parsed into the field. There may be any other field after the JSON, +including another JSON section. + +Note that any white space after the actual JSON +is considered **to be part of the JSON**. So you cannot filter on whitespace +after the JSON. + +Example +....... + +Rule (compact format):: + + rule=:%field1:json%interim text %field2:json%' + +Data:: + + {"f1": "1"} interim text {"f2": 2} + +Result:: + + { "field2": { "f2": 2 }, "field1": { "f1": "1" } } + +Note also that the space before "interim" must **not** be given in the +rule, as it is consumed by the JSON parser. However, the space after +"text" is required. + +alternative +########### + +This type permits to specify alternative ways of parsing within a single +definition. This can make writing rule bases easier. It also permits the +v2 engine to create a more efficient parsing data structure resulting in +better performance (to be noticed only in extreme cases, though). 
+
+An example explains this parser best::
+
+    rule=:a %
+      {"type":"alternative",
+       "parser": [
+           {"name":"num", "type":"number"},
+           {"name":"hex", "type":"hexnumber"}
+           ]
+      }% b
+
+This rule matches messages like these::
+
+    a 1234 b
+    a 0xff b
+
+Note that the "parser" parameter here needs to be provided with an array
+of *alternatives*. In this case, the JSON array is **not** interpreted as
+a sequence. Note, though, that you can nest definitions by using custom types.
+
+repeat
+######
+This parser is used to extract a repeated sequence with the same pattern.
+
+An example explains this parser best::
+
+    rule=:a %
+      {"name":"numbers", "type":"repeat",
+        "parser":[
+            {"type":"number", "name":"n1"},
+            {"type":"literal", "text":":"},
+            {"type":"number", "name":"n2"}
+            ],
+        "while":[
+            {"type":"literal", "text":", "}
+            ]
+      }% b
+
+This matches lines like this::
+
+    a 1:2, 3:4, 5:6, 7:8 b
+
+and will generate this JSON::
+
+    { "numbers": [
+        { "n2": "2", "n1": "1" },
+        { "n2": "4", "n1": "3" },
+        { "n2": "6", "n1": "5" },
+        { "n2": "8", "n1": "7" }
+      ]
+    }
+
+As can be seen, there are two parameters to "repeat". The "parser"
+parameter specifies which type should be repeatedly parsed out of
+the input data. We could use a single parser for that, but in the example
+above we parse a sequence. Note the nested array in the "parser" parameter.
+
+If we just wanted to match a single list of numbers like::
+
+    a 1, 2, 3, 4 b
+
+we could use this definition::
+
+    rule=:a %
+      {"name":"numbers", "type":"repeat",
+        "parser":
+            {"type":"number", "name":"n"},
+        "while":
+            {"type":"literal", "text":", "}
+      }% b
+
+Note that in this example we also removed the redundant single-element
+array in "while".
+
+The "while" parameter tells "repeat" how long to continue repeat processing.
+It is specified by any parser, including a nested sequence of parsers (array).
+As long as the "while" part matches, the repetition is continued. If it no
+longer matches, "repeat" processing is successfully completed. Note that
+the "parser" parameter **must** match at least once, otherwise "repeat"
+fails.
+
+In the above sample, "while" mismatches after "4", because no ", " follows.
+Then, the parser terminates, and according to the definition the literal " b"
+is matched, which will result in a successful rule match (note: the "a ",
+" b" literals are just here for explanatory purposes and could be any
+other rule element).
+
+Sometimes we need to deal with malformed messages. For example, we
+could have a sequence like this::
+
+    a 1:2, 3:4,5:6, 7:8 b
+
+Note the missing space after "4,". To handle such cases, we can nest the
+"alternative" parser inside "while"::
+
+    rule=:a %
+      {"name":"numbers", "type":"repeat",
+        "parser":[
+            {"type":"number", "name":"n1"},
+            {"type":"literal", "text":":"},
+            {"type":"number", "name":"n2"}
+            ],
+        "while": {
+               "type":"alternative", "parser": [
+                    {"type":"literal", "text":", "},
+                    {"type":"literal", "text":","}
+                ]
+             }
+      }% b
+
+This definition handles numbers being delimited by either ", " or ",".
+
+For people with programming skills, the "repeat" parser is described
+by this pseudocode::
+
+    do
+        parse via parsers given in "parser"
+        if parsing fails
+            abort "repeat" unsuccessful
+        parse via parsers given in "while"
+    while the "while" parsers parsed successfully
+    if not aborted, flag "repeat" as successful
+
+Parameters
+..........
+
+option.permitMismatchInParser
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+If set to "True", permits repeat to accept as successful even when
+the parser processing failed.
This by default is false, and can be +set to true to cover some border cases, where the while part cannot +definitely detect the end of processing. An example of such a border +case is a listing of flags, being terminated by a double space where +each flag is delimited by single spaces. For example, Cisco products +generate such messages (note the flags part):: + + Aug 18 13:18:45 192.168.0.1 %ASA-6-106015: Deny TCP (no connection) from 10.252.88.66/443 to 10.79.249.222/52746 flags RST on interface outside + +cee-syslog +########## +This parses cee syslog from the message. This format has been defined +by Mitre CEE as well as Project Lumberjack. + +This format essentially is JSON with additional restrictions: + + * The message must start with "@cee:" + * an JSON **object** must immediately follow (whitespace before it permitted, + but a JSON array is **not** permitted) + * after the JSON, there must be no other non-whitespace characters. + +In other words: the message must consist of a single JSON object only, +prefixed by the "@cee:" cookie. + +Note that the cee cookie is case sensitive, so "@CEE:" is **NOT** valid. + +Prefixes +-------- + +Several rules can have a common prefix. You can set it once with this +syntax:: + + prefix= + +Prefix match description syntax is the same as rule match description. +Every following rule will be treated as an addition to this prefix. + +Prefix can be reset to default (empty value) by the line:: + + prefix= + +You can define a prefix for devices that produce the same header in each +message. We assume, that you have your rules sorted by device. In such a +case you can take the header of the rules and use it with the prefix +variable. Here is a example of a rule for IPTables (legacy format, to be converted later):: + + prefix=%date:date-rfc3164% %host:word% %tag:char-to:-\x3a%: + rule=:INBOUND%INBOUND:char-to:-\x3a%: IN=%IN:word% PHYSIN=%PHYSIN:word% OUT=%OUT:word% PHYSOUT=%PHYSOUT:word% SRC=%source:ipv4% DST=%destination:ipv4% LEN=%LEN:number% TOS=%TOS:char-to: % PREC=%PREC:word% TTL=%TTL:number% ID=%ID:number% DF PROTO=%PROTO:word% SPT=%SPT:number% DPT=%DPT:number% WINDOW=%WINDOW:number% RES=0x00 ACK SYN URGP=%URGP:number% + +Usually, every rule would hold what is defined in the prefix at its +beginning. But since we can define the prefix, we can save that work in +every line and just make the rules for the log lines. This saves us a lot +of work and even saves space. + +In a rulebase you can use multiple prefixes obviously. The prefix will be +used for the following rules. If then another prefix is set, the first one +will be erased, and new one will be used for the following rules. + +Rule tags +--------- + +Rule tagging capability permits very easy classification of syslog +messages and log records in general. So you can not only extract data from +your various log source, you can also classify events, for example, as +being a "login", a "logout" or a firewall "denied access". This makes it +very easy to look at specific subsets of messages and process them in ways +specific to the information being conveyed. + +To see how it works, let’s first define what a tag is: + +A tag is a simple alphanumeric string that identifies a specific type of +object, action, status, etc. For example, we can have object tags for +firewalls and servers. For simplicity, let’s call them "firewall" and +"server". Then, we can have action tags like "login", "logout" and +"connectionOpen". Status tags could include "success" or "fail", among +others. 
Tags form a flat space, there is no inherent relationship between +them (but this may be added later on top of the current implementation). +Think of tags like the tag cloud in a blogging system. Tags can be defined +for any reason and need. A single event can be associated with as many +tags as required. + +Assigning tags to messages is simple. A rule contains both the sample of +the message (including the extracted fields) as well as the tags. +Have a look at this sample:: + + rule=:sshd[%pid:number%]: Invalid user %user:word% from %src-ip:ipv4% + +Here, we have a rule that shows an invalid ssh login request. The various +fields are used to extract information into a well-defined structure. Have +you ever wondered why every rule starts with a colon? Now, here is the +answer: the colon separates the tag part from the actual sample part. +Now, you can create a rule like this:: + + rule=ssh,user,login,fail:sshd[%pid:number%]: Invalid user %user:word% from %src-ip:ipv4% + +Note the "ssh,user,login,fail" part in front of the colon. These are the +four tags the user has decided to assign to this event. What now happens +is that the normalizer does not only extract the information from the +message if it finds a match, but it also adds the tags as metadata. Once +normalization is done, one can not only query the individual fields, but +also query if a specific tag is associated with this event. For example, +to find all ssh-related events (provided the rules are built that way), +you can normalize a large log and select only that subset of the +normalized log that contains the tag "ssh". + +Log annotations +--------------- + +In short, annotations allow to add arbitrary attributes to a parsed +message, depending on rule tags. Values of these attributes are fixed, +they cannot be derived from variable fields. Syntax is as following:: + + annotate=:+="" + +Field value should always be enclosed in double quote marks. + +There can be multiple annotations for the same tag. + +Examples +-------- + +Look at :doc:`sample rulebase ` for configuration +examples and matching log lines. Note that the examples are currently +in legacy format, only. diff --git a/doc/contacts.rst b/doc/contacts.rst new file mode 100644 index 0000000..d44547e --- /dev/null +++ b/doc/contacts.rst @@ -0,0 +1,28 @@ +Contacts +======== + +Mailing list +------------ + +If you have any questions about the library, you may mail to the project +mailing list lognorm@lists.adiscon.com. + +To subscribe: http://lists.adiscon.net/mailman/listinfo/lognorm + +Web site +-------- + +http://www.liblognorm.com/ + +Git repositories +---------------- + +- https://github.com/rsyslog/liblognorm.git +- git://git.adiscon.com/git/liblognorm.git + +Authors +------- + +Rainer Gerhards , Adiscon GmbH + + His blog: http://blog.gerhards.net/ diff --git a/doc/graph.png b/doc/graph.png new file mode 100644 index 0000000..93ef7f0 Binary files /dev/null and b/doc/graph.png differ diff --git a/doc/index.rst b/doc/index.rst new file mode 100644 index 0000000..38d9cd0 --- /dev/null +++ b/doc/index.rst @@ -0,0 +1,30 @@ +.. Liblognorm documentation master file, created by + sphinx-quickstart on Mon Dec 16 13:12:44 2013. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. + +Welcome to Liblognorm! +====================== + +Search +------ + +* :ref:`search` + +Contents: +--------- + +.. 
toctree:: + :maxdepth: 3 + + introduction + installation + configuration + sample_rulebase + lognormalizer + libraryapi + internals + license + contacts + changes + diff --git a/doc/installation.rst b/doc/installation.rst new file mode 100644 index 0000000..a72e102 --- /dev/null +++ b/doc/installation.rst @@ -0,0 +1,76 @@ +How to install +============== + +Here you can find the first steps to install and try liblognorm. + +Getting liblognorm +------------------ + +There are several ways to install libognorm. You can install it +from your distribution, if it is there. You can get binary packages from +Rsyslog repositories: + +- `RedHat Enterprise Linux or CentOS `_ +- `Ubuntu `_ +- `Debian `_ + +Or you can build your own binaries from sources. You can fetch all +sources from git (below you can find all commands you need) or you can +download it as tarballs at: + +- `libestr `_ +- `liblognorm `_ + +Please note if you compile it from tarballs then you have to do the same +steps which are mentioned below, apart from:: + + $ git clone ... + $ autoreconf -vfi + +Building from git +----------------- + +To build liblognorm from sources, you need to have +`json-c `_ installed. + +Open a terminal and switch to the folder where you want to build +liblognorm. Below you will find the necessary commands. First, build +and install prerequisite library called **libestr**:: + + $ git clone git://git.adiscon.com/git/libestr.git + $ cd libestr + $ autoreconf -vfi + $ ./configure + $ make + $ sudo make install + +leave that folder and repeat this step again for liblognorm:: + + $ cd .. + $ git clone git://git.adiscon.com/git/liblognorm.git + $ cd liblognorm + $ autoreconf -vfi + $ ./configure + $ make + $ sudo make install + +That’s all you have to do. + +Testing +------- + +For a first test we need two further things, a test log and the rulebase. +Both can be downloaded `here +`_. + +After downloading these examples you can use liblognorm. Go to +liblognorm/src and use the command below:: + + $ ./lognormalize -r messages.sampdb -o json ` tool to +debug. diff --git a/doc/introduction.rst b/doc/introduction.rst new file mode 100644 index 0000000..a26ac0c --- /dev/null +++ b/doc/introduction.rst @@ -0,0 +1,34 @@ +Introduction +============ + +Briefly described, liblognorm is a tool to normalize log data. + +People who need to take a look at logs often have a common problem. Logs +from different machines (from different vendors) usually have different +formats. Even if it is the same type of log (e.g. from firewalls), the log +entries are so different, that it is pretty hard to read these. This is +where liblognorm comes into the game. With this tool you can normalize all +your logs. All you need is liblognorm and its dependencies and a sample +database that fits the logs you want to normalize. + +So, for example, if you have traffic logs from three different firewalls, +liblognorm will be able to "normalize" the events into generic ones. Among +others, it will extract source and destination ip addresses and ports and +make them available via well-defined fields. As the end result, a common log +analysis application will be able to work on that common set and so this +backend will be independent from the actual firewalls feeding it. Even +better, once we have a well-understood interim format, it is also easy to +convert that into any other vendor specific format, so that you can use that +vendor's analysis tool. + +By design, liblognorm is constructed as a library. Thus, it can be used by +other tools. 
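+
+To give a first impression of what using the library looks like from a
+caller's perspective, here is a minimal sketch in C. It assumes the entry
+points declared in liblognorm.h (``ln_initCtx()``, ``ln_loadSamples()``,
+``ln_normalize()`` and ``ln_exitCtx()``) and an illustrative rulebase file
+name; see the Library API chapter for the authoritative prototypes::
+
+    #include <string.h>
+    #include <stdio.h>
+    #include <libfastjson/json.h>   /* adjust the include if built against json-c */
+    #include <liblognorm.h>
+
+    int main(void)
+    {
+        ln_ctx ctx = ln_initCtx();          /* create a normalization context */
+        if (ctx == NULL)
+            return 1;
+        /* "sample.rulebase" is just an example file name */
+        if (ln_loadSamples(ctx, "sample.rulebase") != 0)
+            return 1;
+
+        const char *msg = "Weight: 42kg";
+        struct json_object *json = NULL;
+        if (ln_normalize(ctx, msg, strlen(msg), &json) == 0) {
+            /* extracted fields are returned as a JSON object */
+            printf("%s\n", json_object_to_json_string(json));
+            json_object_put(json);          /* release the JSON object */
+        }
+
+        ln_exitCtx(ctx);                    /* free the context */
+        return 0;
+    }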
+ +In short, liblognorm works by: + + 1. Matching a line to a rule from predefined configuration; + 2. Picking out variable fields from the line; + 3. Returning them as a JSON hash object. + +Then, a consumer of this object can construct new, normalized log line +on its own. diff --git a/doc/libraryapi.rst b/doc/libraryapi.rst new file mode 100644 index 0000000..7d9c35c --- /dev/null +++ b/doc/libraryapi.rst @@ -0,0 +1,10 @@ +Library API +=========== + +To use the library, include liblognorm.h (which is quoted below) into your code. +The API is fairly simple and hardly needs further explanations. + +.. literalinclude:: ../src/liblognorm.h + :start-after: #define LIBLOGNORM_H_INCLUDED + :end-before: #endif + :language: c diff --git a/doc/license.rst b/doc/license.rst new file mode 100644 index 0000000..22d1074 --- /dev/null +++ b/doc/license.rst @@ -0,0 +1,9 @@ +Licensing +========= + +Liblognorm is available under the terms of the GNU LGPL v2.1 or above (full +text below). + +.. literalinclude:: ../COPYING + + diff --git a/doc/lognormalizer.rst b/doc/lognormalizer.rst new file mode 100644 index 0000000..c592a2c --- /dev/null +++ b/doc/lognormalizer.rst @@ -0,0 +1,264 @@ +Lognormalizer +============= + +Lognormalizer is a sample tool which is often used to test and debug +rulebases before real use. Nevertheless, it can be used in production as +a simple command line interface to liblognorm. + +This tool reads log lines from its standard input and prints results +to standard output. You need to use redirections if you want to read +or write files. + +An example of the command:: + + $ lognormalizer -r messages.sampdb -e json + +Specifies name of the file containing the rulebase. + +:: + + -v + +Increase verbosity level. Can be used several times. If used three +times, internal data structures are dumped (make sense to developers, +only). + +:: + + -p + +Print only successfully parsed messages. + +:: + + -P + +Print only messages **not** successfully parsed. + +:: + + -L + +Add line number information to events not successfully parsed. This +is meant as a troubleshooting aid when working with unparsable events, +as the information can be used to directly go to the line in question +in the source data file. The line number is contained in a field +named ``lognormalizer.line_nbr``. + +:: + + -t + +Print only those messages which have this tag. + +:: + + -T + +Include 'event.tags' attribute when output is in JSON format. This attribute contains list of tags of the matched +rule. + +:: + + -E + +Encoder-specific data. For CSV, it is the list of fields to be output, +separated by comma or space. It is currently unused for other formats. + +:: + + -d + +Generate DOT file describing parse tree. It is used to plot parse graph +with GraphViz. + +:: + + -H + +At end of run, print a summary line with number of messages processed, +parsed and unparsed to stdout. + +:: + + -U + +At end of run, print a summary line with number of messages unparsed to +stdout. Note that this message is only printed if there was at least one +unparsable message. + +:: + + -o + +Special options. The following ones can be set: + + * **allowRegex** Permits to use regular expressions inse the v1 engine + This is deprecated and should not be used for new deployments. + + * **addExecPath** Includes metadata into the event on how it was + (tried) to be parsed. Can be useful in troubleshooting normalization + problems. + + * **addOriginalMsg** Always add the "original-msg" data item. 
By default, this is only done when a message could not be parsed.
+
+  * **addRuleLocation** For rules that successfully parsed, add the
+    location of the rule inside the rulebase. Both the file name as
+    well as the line number are given. If two rules evaluate to the same
+    end node, only a single rule location is given. However, in
+    practice this is extremely unlikely and as such for practical
+    reasons the information can be considered reliable.
+
+  * **addRule** Add a mockup of the rule that was processed. Note that
+    it is *not* an exact copy of the rule, but a rule that correctly
+    describes the parsed message. Most importantly, prefixes are
+    appended and custom data types are expanded (and no longer visible
+    as such). This option is primarily meant for postprocessing, e.g.
+    as input to an anonymizer.
+
+::
+
+    -s
+
+At end of run, print internal parse DAG statistics and exit. This
+option is meant for developers and researchers who want to get insight
+into the quality of the algorithm and/or how efficiently the rulebase can
+be processed. **NOT** intended for end users. This option is
+performance-intensive.
+
+::
+
+    -S
+
+Even stronger statistics than -s. Requires that the version is compiled
+with --enable-advanced-statistics, which causes a considerable
+performance loss.
+
+::
+
+    -x
+
+Print statistics as a DOT file. In order to keep the graph readable,
+information is only emitted for called nodes.
+
+::
+
+    -e
+
+Output format. By default, output is in JSON format. With this option,
+you can change it to a different one.
+
+Supported Output Formats
+........................
+The JSON, XML, and CSV formats should be self-explanatory.
+
+The cee-syslog format emits messages according to the Mitre CEE spec.
+Note that the cee-syslog format is primarily supported for
+backward compatibility. It does **not** support nested data items
+and as such cannot be used when the rulebase makes use of this
+feature (which we assume most rulebases do nowadays). We strongly
+recommend not to use it for new deployments. Support may be removed
+in later releases.
+
+The raw format outputs an exact copy of the input message, without
+any normalization visible. The prime use case of "raw" is to extract
+either all messages that could or could not be normalized. To do so,
+specify the -p or -P option. It also works in combination with the
+-t option to extract a subset based on tagging. In any case, the core
+use is to prepare a subset of the original file for further processing.
+
+Examples
+--------
+
+These examples were created using the sample rulebase from the source
+package.
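+
+As described above, the tool reads messages from standard input, so the
+listings below show what was typed interactively. To reproduce them
+non-interactively you can redirect a file instead; for example (the file
+name ``test.log`` is only a placeholder for wherever you saved the
+messages)::
+
+    $ lognormalizer -r rulebases/sample.rulebase <test.log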
+ +Default (CEE) output:: + + $ lognormalizer -r rulebases/sample.rulebase + Weight: 42kg + [cee@115 event.tags="tag2" unit="kg" N="42" fat="free"] + Snow White and the Seven Dwarfs + [cee@115 event.tags="tale" company="the Seven Dwarfs"] + 2012-10-11 src=127.0.0.1 dst=88.111.222.19 + [cee@115 dst="88.111.222.19" src="127.0.0.1" date="2012-10-11"] + +JSON output, flat tags enabled:: + + $ lognormalizer -r rulebases/sample.rulebase -e json -T + %% + { "event.tags": [ "tag3", "percent" ], "percent": "100", "part": "wha", "whole": "whale" } + Weight: 42kg + { "unit": "kg", "N": "42", "event.tags": [ "tag2" ], "fat": "free" } + +CSV output with fixed field list:: + + $ lognormalizer -r rulebases/sample.rulebase -e csv -E'N unit' + Weight: 42kg + "42","kg" + Weight: 115lbs + "115","lbs" + Anything not matching the rule + , + +Creating a graph of the rulebase +-------------------------------- + +To get a better overview of a rulebase you can create a graph that shows you +the chain of normalization (parse-tree). + +At first you have to install an additional package called graphviz. Graphviz +is a tool that creates such a graph with the help of a control file (created +with the rulebase). `Here `_ you will find more +information about graphviz. + +To install it you can use the package manager. For example, on RedHat +systems it is yum command:: + + $ sudo yum install graphviz + +The next step would be creating the control file for graphviz. Therefore we +use the normalizer command with the options -d "prefered filename for the +control file" and -r "rulebase":: + + $ lognormalize -d control.dot -r messages.rb + +Please note that there is no need for an input or output file. +If you have a look at the control file now you will see that the content is +a little bit confusing, but it includes all information, like the nodes, +fields and parser, that graphviz needs to create the graph. Of course you +can edit that file, but please note that it is a lot of work. + +Now we can create the graph by typing:: + + $ dot control.dot -Tpng >graph.png + +dot + name of control file + option -T -> file format + output file + +That is just one example for using graphviz, of course you can do many +other great things with it. But I think this "simple" graph could be very +helpful for the normalizer. + +Below you see sample for such a graph, but please note that this is +not such a pretty one. Such a graph can grow very fast by editing your +rulebase. + +.. figure:: graph.png + :width: 90 % + :alt: graph sample + diff --git a/doc/sample_rulebase.rst b/doc/sample_rulebase.rst new file mode 100644 index 0000000..053c210 --- /dev/null +++ b/doc/sample_rulebase.rst @@ -0,0 +1,6 @@ +Sample rulebase +=============== + +.. literalinclude:: ../rulebases/sample.rulebase + :linenos: + diff --git a/install-sh b/install-sh new file mode 100755 index 0000000..59990a1 --- /dev/null +++ b/install-sh @@ -0,0 +1,508 @@ +#!/bin/sh +# install - install a program, script, or datafile + +scriptversion=2014-09-12.12; # UTC + +# This originates from X11R5 (mit/util/scripts/install.sh), which was +# later released in X11R6 (xc/config/util/install.sh) with the +# following copyright and license. 
+# +# Copyright (C) 1994 X Consortium +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to +# deal in the Software without restriction, including without limitation the +# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or +# sell copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in +# all copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# X CONSORTIUM BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN +# AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNEC- +# TION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. +# +# Except as contained in this notice, the name of the X Consortium shall not +# be used in advertising or otherwise to promote the sale, use or other deal- +# ings in this Software without prior written authorization from the X Consor- +# tium. +# +# +# FSF changes to this file are in the public domain. +# +# Calling this script install-sh is preferred over install.sh, to prevent +# 'make' implicit rules from creating a file called install from it +# when there is no Makefile. +# +# This script is compatible with the BSD install script, but was written +# from scratch. + +tab=' ' +nl=' +' +IFS=" $tab$nl" + +# Set DOITPROG to "echo" to test this script. + +doit=${DOITPROG-} +doit_exec=${doit:-exec} + +# Put in absolute file names if you don't have them in your path; +# or use environment vars. + +chgrpprog=${CHGRPPROG-chgrp} +chmodprog=${CHMODPROG-chmod} +chownprog=${CHOWNPROG-chown} +cmpprog=${CMPPROG-cmp} +cpprog=${CPPROG-cp} +mkdirprog=${MKDIRPROG-mkdir} +mvprog=${MVPROG-mv} +rmprog=${RMPROG-rm} +stripprog=${STRIPPROG-strip} + +posix_mkdir= + +# Desired mode of installed file. +mode=0755 + +chgrpcmd= +chmodcmd=$chmodprog +chowncmd= +mvcmd=$mvprog +rmcmd="$rmprog -f" +stripcmd= + +src= +dst= +dir_arg= +dst_arg= + +copy_on_change=false +is_target_a_directory=possibly + +usage="\ +Usage: $0 [OPTION]... [-T] SRCFILE DSTFILE + or: $0 [OPTION]... SRCFILES... DIRECTORY + or: $0 [OPTION]... -t DIRECTORY SRCFILES... + or: $0 [OPTION]... -d DIRECTORIES... + +In the 1st form, copy SRCFILE to DSTFILE. +In the 2nd and 3rd, copy all SRCFILES to DIRECTORY. +In the 4th, create DIRECTORIES. + +Options: + --help display this help and exit. + --version display version info and exit. + + -c (ignored) + -C install only if different (preserve the last data modification time) + -d create directories instead of installing files. + -g GROUP $chgrpprog installed files to GROUP. + -m MODE $chmodprog installed files to MODE. + -o USER $chownprog installed files to USER. + -s $stripprog installed files. + -t DIRECTORY install into DIRECTORY. + -T report an error if DSTFILE is a directory. 
+ +Environment variables override the default commands: + CHGRPPROG CHMODPROG CHOWNPROG CMPPROG CPPROG MKDIRPROG MVPROG + RMPROG STRIPPROG +" + +while test $# -ne 0; do + case $1 in + -c) ;; + + -C) copy_on_change=true;; + + -d) dir_arg=true;; + + -g) chgrpcmd="$chgrpprog $2" + shift;; + + --help) echo "$usage"; exit $?;; + + -m) mode=$2 + case $mode in + *' '* | *"$tab"* | *"$nl"* | *'*'* | *'?'* | *'['*) + echo "$0: invalid mode: $mode" >&2 + exit 1;; + esac + shift;; + + -o) chowncmd="$chownprog $2" + shift;; + + -s) stripcmd=$stripprog;; + + -t) + is_target_a_directory=always + dst_arg=$2 + # Protect names problematic for 'test' and other utilities. + case $dst_arg in + -* | [=\(\)!]) dst_arg=./$dst_arg;; + esac + shift;; + + -T) is_target_a_directory=never;; + + --version) echo "$0 $scriptversion"; exit $?;; + + --) shift + break;; + + -*) echo "$0: invalid option: $1" >&2 + exit 1;; + + *) break;; + esac + shift +done + +# We allow the use of options -d and -T together, by making -d +# take the precedence; this is for compatibility with GNU install. + +if test -n "$dir_arg"; then + if test -n "$dst_arg"; then + echo "$0: target directory not allowed when installing a directory." >&2 + exit 1 + fi +fi + +if test $# -ne 0 && test -z "$dir_arg$dst_arg"; then + # When -d is used, all remaining arguments are directories to create. + # When -t is used, the destination is already specified. + # Otherwise, the last argument is the destination. Remove it from $@. + for arg + do + if test -n "$dst_arg"; then + # $@ is not empty: it contains at least $arg. + set fnord "$@" "$dst_arg" + shift # fnord + fi + shift # arg + dst_arg=$arg + # Protect names problematic for 'test' and other utilities. + case $dst_arg in + -* | [=\(\)!]) dst_arg=./$dst_arg;; + esac + done +fi + +if test $# -eq 0; then + if test -z "$dir_arg"; then + echo "$0: no input file specified." >&2 + exit 1 + fi + # It's OK to call 'install-sh -d' without argument. + # This can happen when creating conditional directories. + exit 0 +fi + +if test -z "$dir_arg"; then + if test $# -gt 1 || test "$is_target_a_directory" = always; then + if test ! -d "$dst_arg"; then + echo "$0: $dst_arg: Is not a directory." >&2 + exit 1 + fi + fi +fi + +if test -z "$dir_arg"; then + do_exit='(exit $ret); exit $ret' + trap "ret=129; $do_exit" 1 + trap "ret=130; $do_exit" 2 + trap "ret=141; $do_exit" 13 + trap "ret=143; $do_exit" 15 + + # Set umask so as not to create temps with too-generous modes. + # However, 'strip' requires both read and write access to temps. + case $mode in + # Optimize common cases. + *644) cp_umask=133;; + *755) cp_umask=22;; + + *[0-7]) + if test -z "$stripcmd"; then + u_plus_rw= + else + u_plus_rw='% 200' + fi + cp_umask=`expr '(' 777 - $mode % 1000 ')' $u_plus_rw`;; + *) + if test -z "$stripcmd"; then + u_plus_rw= + else + u_plus_rw=,u+rw + fi + cp_umask=$mode$u_plus_rw;; + esac +fi + +for src +do + # Protect names problematic for 'test' and other utilities. + case $src in + -* | [=\(\)!]) src=./$src;; + esac + + if test -n "$dir_arg"; then + dst=$src + dstdir=$dst + test -d "$dstdir" + dstdir_status=$? + else + + # Waiting for this to be detected by the "$cpprog $src $dsttmp" command + # might cause directories to be created, which would be especially bad + # if $src (and thus $dsttmp) contains '*'. + if test ! -f "$src" && test ! -d "$src"; then + echo "$0: $src does not exist." >&2 + exit 1 + fi + + if test -z "$dst_arg"; then + echo "$0: no destination specified." 
>&2 + exit 1 + fi + dst=$dst_arg + + # If destination is a directory, append the input filename; won't work + # if double slashes aren't ignored. + if test -d "$dst"; then + if test "$is_target_a_directory" = never; then + echo "$0: $dst_arg: Is a directory" >&2 + exit 1 + fi + dstdir=$dst + dst=$dstdir/`basename "$src"` + dstdir_status=0 + else + dstdir=`dirname "$dst"` + test -d "$dstdir" + dstdir_status=$? + fi + fi + + obsolete_mkdir_used=false + + if test $dstdir_status != 0; then + case $posix_mkdir in + '') + # Create intermediate dirs using mode 755 as modified by the umask. + # This is like FreeBSD 'install' as of 1997-10-28. + umask=`umask` + case $stripcmd.$umask in + # Optimize common cases. + *[2367][2367]) mkdir_umask=$umask;; + .*0[02][02] | .[02][02] | .[02]) mkdir_umask=22;; + + *[0-7]) + mkdir_umask=`expr $umask + 22 \ + - $umask % 100 % 40 + $umask % 20 \ + - $umask % 10 % 4 + $umask % 2 + `;; + *) mkdir_umask=$umask,go-w;; + esac + + # With -d, create the new directory with the user-specified mode. + # Otherwise, rely on $mkdir_umask. + if test -n "$dir_arg"; then + mkdir_mode=-m$mode + else + mkdir_mode= + fi + + posix_mkdir=false + case $umask in + *[123567][0-7][0-7]) + # POSIX mkdir -p sets u+wx bits regardless of umask, which + # is incompatible with FreeBSD 'install' when (umask & 300) != 0. + ;; + *) + # $RANDOM is not portable (e.g. dash); use it when possible to + # lower collision chance + tmpdir=${TMPDIR-/tmp}/ins$RANDOM-$$ + trap 'ret=$?; rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir" 2>/dev/null; exit $ret' 0 + + # As "mkdir -p" follows symlinks and we work in /tmp possibly; so + # create the $tmpdir first (and fail if unsuccessful) to make sure + # that nobody tries to guess the $tmpdir name. + if (umask $mkdir_umask && + $mkdirprog $mkdir_mode "$tmpdir" && + exec $mkdirprog $mkdir_mode -p -- "$tmpdir/a/b") >/dev/null 2>&1 + then + if test -z "$dir_arg" || { + # Check for POSIX incompatibilities with -m. + # HP-UX 11.23 and IRIX 6.5 mkdir -m -p sets group- or + # other-writable bit of parent directory when it shouldn't. + # FreeBSD 6.1 mkdir -m -p sets mode of existing directory. + test_tmpdir="$tmpdir/a" + ls_ld_tmpdir=`ls -ld "$test_tmpdir"` + case $ls_ld_tmpdir in + d????-?r-*) different_mode=700;; + d????-?--*) different_mode=755;; + *) false;; + esac && + $mkdirprog -m$different_mode -p -- "$test_tmpdir" && { + ls_ld_tmpdir_1=`ls -ld "$test_tmpdir"` + test "$ls_ld_tmpdir" = "$ls_ld_tmpdir_1" + } + } + then posix_mkdir=: + fi + rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir" + else + # Remove any dirs left behind by ancient mkdir implementations. + rmdir ./$mkdir_mode ./-p ./-- "$tmpdir" 2>/dev/null + fi + trap '' 0;; + esac;; + esac + + if + $posix_mkdir && ( + umask $mkdir_umask && + $doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir" + ) + then : + else + + # The umask is ridiculous, or mkdir does not conform to POSIX, + # or it failed possibly due to a race condition. Create the + # directory the slow way, step by step, checking for races as we go. + + case $dstdir in + /*) prefix='/';; + [-=\(\)!]*) prefix='./';; + *) prefix='';; + esac + + oIFS=$IFS + IFS=/ + set -f + set fnord $dstdir + shift + set +f + IFS=$oIFS + + prefixes= + + for d + do + test X"$d" = X && continue + + prefix=$prefix$d + if test -d "$prefix"; then + prefixes= + else + if $posix_mkdir; then + (umask=$mkdir_umask && + $doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir") && break + # Don't fail if two instances are running concurrently. 
+ test -d "$prefix" || exit 1 + else + case $prefix in + *\'*) qprefix=`echo "$prefix" | sed "s/'/'\\\\\\\\''/g"`;; + *) qprefix=$prefix;; + esac + prefixes="$prefixes '$qprefix'" + fi + fi + prefix=$prefix/ + done + + if test -n "$prefixes"; then + # Don't fail if two instances are running concurrently. + (umask $mkdir_umask && + eval "\$doit_exec \$mkdirprog $prefixes") || + test -d "$dstdir" || exit 1 + obsolete_mkdir_used=true + fi + fi + fi + + if test -n "$dir_arg"; then + { test -z "$chowncmd" || $doit $chowncmd "$dst"; } && + { test -z "$chgrpcmd" || $doit $chgrpcmd "$dst"; } && + { test "$obsolete_mkdir_used$chowncmd$chgrpcmd" = false || + test -z "$chmodcmd" || $doit $chmodcmd $mode "$dst"; } || exit 1 + else + + # Make a couple of temp file names in the proper directory. + dsttmp=$dstdir/_inst.$$_ + rmtmp=$dstdir/_rm.$$_ + + # Trap to clean up those temp files at exit. + trap 'ret=$?; rm -f "$dsttmp" "$rmtmp" && exit $ret' 0 + + # Copy the file name to the temp name. + (umask $cp_umask && $doit_exec $cpprog "$src" "$dsttmp") && + + # and set any options; do chmod last to preserve setuid bits. + # + # If any of these fail, we abort the whole thing. If we want to + # ignore errors from any of these, just make sure not to ignore + # errors from the above "$doit $cpprog $src $dsttmp" command. + # + { test -z "$chowncmd" || $doit $chowncmd "$dsttmp"; } && + { test -z "$chgrpcmd" || $doit $chgrpcmd "$dsttmp"; } && + { test -z "$stripcmd" || $doit $stripcmd "$dsttmp"; } && + { test -z "$chmodcmd" || $doit $chmodcmd $mode "$dsttmp"; } && + + # If -C, don't bother to copy if it wouldn't change the file. + if $copy_on_change && + old=`LC_ALL=C ls -dlL "$dst" 2>/dev/null` && + new=`LC_ALL=C ls -dlL "$dsttmp" 2>/dev/null` && + set -f && + set X $old && old=:$2:$4:$5:$6 && + set X $new && new=:$2:$4:$5:$6 && + set +f && + test "$old" = "$new" && + $cmpprog "$dst" "$dsttmp" >/dev/null 2>&1 + then + rm -f "$dsttmp" + else + # Rename the file to the real destination. + $doit $mvcmd -f "$dsttmp" "$dst" 2>/dev/null || + + # The rename failed, perhaps because mv can't rename something else + # to itself, or perhaps because mv is so ancient that it does not + # support -f. + { + # Now remove or move aside any old file at destination location. + # We try this two ways since rm can't unlink itself on some + # systems and the destination file might be busy for other + # reasons. In this case, the final cleanup might fail but the new + # file should still install successfully. + { + test ! -f "$dst" || + $doit $rmcmd -f "$dst" 2>/dev/null || + { $doit $mvcmd -f "$dst" "$rmtmp" 2>/dev/null && + { $doit $rmcmd -f "$rmtmp" 2>/dev/null; :; } + } || + { echo "$0: cannot unlink or rename $dst" >&2 + (exit 1); exit 1 + } + } && + + # Now rename the file to the real destination. 
+ $doit $mvcmd "$dsttmp" "$dst" + } + fi || exit 1 + + trap '' 0 + fi +done + +# Local variables: +# eval: (add-hook 'write-file-hooks 'time-stamp) +# time-stamp-start: "scriptversion=" +# time-stamp-format: "%:y-%02m-%02d.%02H" +# time-stamp-time-zone: "UTC" +# time-stamp-end: "; # UTC" +# End: diff --git a/lognorm.pc.in b/lognorm.pc.in new file mode 100644 index 0000000..dfbe866 --- /dev/null +++ b/lognorm.pc.in @@ -0,0 +1,12 @@ +prefix=@prefix@ +exec_prefix=@exec_prefix@ +libdir=@libdir@ +includedir=@includedir@ + +Name: lognorm +Description: fast samples-based log normalization library +Version: @VERSION@ +Requires: libfastjson +Libs: -L${libdir} -llognorm +Libs.private: @pkg_config_libs_private@ +Cflags: -I${includedir} diff --git a/ltmain.sh b/ltmain.sh new file mode 100644 index 0000000..147d758 --- /dev/null +++ b/ltmain.sh @@ -0,0 +1,11156 @@ +#! /bin/sh +## DO NOT EDIT - This file generated from ./build-aux/ltmain.in +## by inline-source v2014-01-03.01 + +# libtool (GNU libtool) 2.4.6 +# Provide generalized library-building support services. +# Written by Gordon Matzigkeit , 1996 + +# Copyright (C) 1996-2015 Free Software Foundation, Inc. +# This is free software; see the source for copying conditions. There is NO +# warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. + +# GNU Libtool is free software; you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 2 of the License, or +# (at your option) any later version. +# +# As a special exception to the GNU General Public License, +# if you distribute this file as part of a program or library that +# is built using GNU Libtool, you may include this file under the +# same distribution terms that you use for the rest of that program. +# +# GNU Libtool is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + + +PROGRAM=libtool +PACKAGE=libtool +VERSION="2.4.6 Debian-2.4.6-0.1" +package_revision=2.4.6 + + +## ------ ## +## Usage. ## +## ------ ## + +# Run './libtool --help' for help with using this script from the +# command line. + + +## ------------------------------- ## +## User overridable command paths. ## +## ------------------------------- ## + +# After configure completes, it has a better idea of some of the +# shell tools we need than the defaults used by the functions shared +# with bootstrap, so set those here where they can still be over- +# ridden by the user, but otherwise take precedence. + +: ${AUTOCONF="autoconf"} +: ${AUTOMAKE="automake"} + + +## -------------------------- ## +## Source external libraries. ## +## -------------------------- ## + +# Much of our low-level functionality needs to be sourced from external +# libraries, which are installed to $pkgauxdir. + +# Set a version string for this script. +scriptversion=2015-01-20.17; # UTC + +# General shell script boiler plate, and helper functions. +# Written by Gary V. Vaughan, 2004 + +# Copyright (C) 2004-2015 Free Software Foundation, Inc. +# This is free software; see the source for copying conditions. There is NO +# warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
+ +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 3 of the License, or +# (at your option) any later version. + +# As a special exception to the GNU General Public License, if you distribute +# this file as part of a program or library that is built using GNU Libtool, +# you may include this file under the same distribution terms that you use +# for the rest of that program. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNES FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +# Please report bugs or propose patches to gary@gnu.org. + + +## ------ ## +## Usage. ## +## ------ ## + +# Evaluate this file near the top of your script to gain access to +# the functions and variables defined here: +# +# . `echo "$0" | ${SED-sed} 's|[^/]*$||'`/build-aux/funclib.sh +# +# If you need to override any of the default environment variable +# settings, do that before evaluating this file. + + +## -------------------- ## +## Shell normalisation. ## +## -------------------- ## + +# Some shells need a little help to be as Bourne compatible as possible. +# Before doing anything else, make sure all that help has been provided! + +DUALCASE=1; export DUALCASE # for MKS sh +if test -n "${ZSH_VERSION+set}" && (emulate sh) >/dev/null 2>&1; then : + emulate sh + NULLCMD=: + # Pre-4.2 versions of Zsh do word splitting on ${1+"$@"}, which + # is contrary to our usage. Disable this feature. + alias -g '${1+"$@"}'='"$@"' + setopt NO_GLOB_SUBST +else + case `(set -o) 2>/dev/null` in *posix*) set -o posix ;; esac +fi + +# NLS nuisances: We save the old values in case they are required later. +_G_user_locale= +_G_safe_locale= +for _G_var in LANG LANGUAGE LC_ALL LC_CTYPE LC_COLLATE LC_MESSAGES +do + eval "if test set = \"\${$_G_var+set}\"; then + save_$_G_var=\$$_G_var + $_G_var=C + export $_G_var + _G_user_locale=\"$_G_var=\\\$save_\$_G_var; \$_G_user_locale\" + _G_safe_locale=\"$_G_var=C; \$_G_safe_locale\" + fi" +done + +# CDPATH. +(unset CDPATH) >/dev/null 2>&1 && unset CDPATH + +# Make sure IFS has a sensible default +sp=' ' +nl=' +' +IFS="$sp $nl" + +# There are apparently some retarded systems that use ';' as a PATH separator! +if test "${PATH_SEPARATOR+set}" != set; then + PATH_SEPARATOR=: + (PATH='/bin;/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 && { + (PATH='/bin:/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 || + PATH_SEPARATOR=';' + } +fi + + + +## ------------------------- ## +## Locate command utilities. ## +## ------------------------- ## + + +# func_executable_p FILE +# ---------------------- +# Check that FILE is an executable regular file. +func_executable_p () +{ + test -f "$1" && test -x "$1" +} + + +# func_path_progs PROGS_LIST CHECK_FUNC [PATH] +# -------------------------------------------- +# Search for either a program that responds to --version with output +# containing "GNU", or else returned by CHECK_FUNC otherwise, by +# trying all the directories in PATH with each of the elements of +# PROGS_LIST. +# +# CHECK_FUNC should accept the path to a candidate program, and +# set $func_check_prog_result if it truncates its output less than +# $_G_path_prog_max characters. 
+func_path_progs () +{ + _G_progs_list=$1 + _G_check_func=$2 + _G_PATH=${3-"$PATH"} + + _G_path_prog_max=0 + _G_path_prog_found=false + _G_save_IFS=$IFS; IFS=${PATH_SEPARATOR-:} + for _G_dir in $_G_PATH; do + IFS=$_G_save_IFS + test -z "$_G_dir" && _G_dir=. + for _G_prog_name in $_G_progs_list; do + for _exeext in '' .EXE; do + _G_path_prog=$_G_dir/$_G_prog_name$_exeext + func_executable_p "$_G_path_prog" || continue + case `"$_G_path_prog" --version 2>&1` in + *GNU*) func_path_progs_result=$_G_path_prog _G_path_prog_found=: ;; + *) $_G_check_func $_G_path_prog + func_path_progs_result=$func_check_prog_result + ;; + esac + $_G_path_prog_found && break 3 + done + done + done + IFS=$_G_save_IFS + test -z "$func_path_progs_result" && { + echo "no acceptable sed could be found in \$PATH" >&2 + exit 1 + } +} + + +# We want to be able to use the functions in this file before configure +# has figured out where the best binaries are kept, which means we have +# to search for them ourselves - except when the results are already set +# where we skip the searches. + +# Unless the user overrides by setting SED, search the path for either GNU +# sed, or the sed that truncates its output the least. +test -z "$SED" && { + _G_sed_script=s/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb/ + for _G_i in 1 2 3 4 5 6 7; do + _G_sed_script=$_G_sed_script$nl$_G_sed_script + done + echo "$_G_sed_script" 2>/dev/null | sed 99q >conftest.sed + _G_sed_script= + + func_check_prog_sed () + { + _G_path_prog=$1 + + _G_count=0 + printf 0123456789 >conftest.in + while : + do + cat conftest.in conftest.in >conftest.tmp + mv conftest.tmp conftest.in + cp conftest.in conftest.nl + echo '' >> conftest.nl + "$_G_path_prog" -f conftest.sed conftest.out 2>/dev/null || break + diff conftest.out conftest.nl >/dev/null 2>&1 || break + _G_count=`expr $_G_count + 1` + if test "$_G_count" -gt "$_G_path_prog_max"; then + # Best one so far, save it but keep looking for a better one + func_check_prog_result=$_G_path_prog + _G_path_prog_max=$_G_count + fi + # 10*(2^10) chars as input seems more than enough + test 10 -lt "$_G_count" && break + done + rm -f conftest.in conftest.tmp conftest.nl conftest.out + } + + func_path_progs "sed gsed" func_check_prog_sed $PATH:/usr/xpg4/bin + rm -f conftest.sed + SED=$func_path_progs_result +} + + +# Unless the user overrides by setting GREP, search the path for either GNU +# grep, or the grep that truncates its output the least. +test -z "$GREP" && { + func_check_prog_grep () + { + _G_path_prog=$1 + + _G_count=0 + _G_path_prog_max=0 + printf 0123456789 >conftest.in + while : + do + cat conftest.in conftest.in >conftest.tmp + mv conftest.tmp conftest.in + cp conftest.in conftest.nl + echo 'GREP' >> conftest.nl + "$_G_path_prog" -e 'GREP$' -e '-(cannot match)-' conftest.out 2>/dev/null || break + diff conftest.out conftest.nl >/dev/null 2>&1 || break + _G_count=`expr $_G_count + 1` + if test "$_G_count" -gt "$_G_path_prog_max"; then + # Best one so far, save it but keep looking for a better one + func_check_prog_result=$_G_path_prog + _G_path_prog_max=$_G_count + fi + # 10*(2^10) chars as input seems more than enough + test 10 -lt "$_G_count" && break + done + rm -f conftest.in conftest.tmp conftest.nl conftest.out + } + + func_path_progs "grep ggrep" func_check_prog_grep $PATH:/usr/xpg4/bin + GREP=$func_path_progs_result +} + + +## ------------------------------- ## +## User overridable command paths. 
## +## ------------------------------- ## + +# All uppercase variable names are used for environment variables. These +# variables can be overridden by the user before calling a script that +# uses them if a suitable command of that name is not already available +# in the command search PATH. + +: ${CP="cp -f"} +: ${ECHO="printf %s\n"} +: ${EGREP="$GREP -E"} +: ${FGREP="$GREP -F"} +: ${LN_S="ln -s"} +: ${MAKE="make"} +: ${MKDIR="mkdir"} +: ${MV="mv -f"} +: ${RM="rm -f"} +: ${SHELL="${CONFIG_SHELL-/bin/sh}"} + + +## -------------------- ## +## Useful sed snippets. ## +## -------------------- ## + +sed_dirname='s|/[^/]*$||' +sed_basename='s|^.*/||' + +# Sed substitution that helps us do robust quoting. It backslashifies +# metacharacters that are still active within double-quoted strings. +sed_quote_subst='s|\([`"$\\]\)|\\\1|g' + +# Same as above, but do not quote variable references. +sed_double_quote_subst='s/\(["`\\]\)/\\\1/g' + +# Sed substitution that turns a string into a regex matching for the +# string literally. +sed_make_literal_regex='s|[].[^$\\*\/]|\\&|g' + +# Sed substitution that converts a w32 file name or path +# that contains forward slashes, into one that contains +# (escaped) backslashes. A very naive implementation. +sed_naive_backslashify='s|\\\\*|\\|g;s|/|\\|g;s|\\|\\\\|g' + +# Re-'\' parameter expansions in output of sed_double_quote_subst that +# were '\'-ed in input to the same. If an odd number of '\' preceded a +# '$' in input to sed_double_quote_subst, that '$' was protected from +# expansion. Since each input '\' is now two '\'s, look for any number +# of runs of four '\'s followed by two '\'s and then a '$'. '\' that '$'. +_G_bs='\\' +_G_bs2='\\\\' +_G_bs4='\\\\\\\\' +_G_dollar='\$' +sed_double_backslash="\ + s/$_G_bs4/&\\ +/g + s/^$_G_bs2$_G_dollar/$_G_bs&/ + s/\\([^$_G_bs]\\)$_G_bs2$_G_dollar/\\1$_G_bs2$_G_bs$_G_dollar/g + s/\n//g" + + +## ----------------- ## +## Global variables. ## +## ----------------- ## + +# Except for the global variables explicitly listed below, the following +# functions in the '^func_' namespace, and the '^require_' namespace +# variables initialised in the 'Resource management' section, sourcing +# this file will not pollute your global namespace with anything +# else. There's no portable way to scope variables in Bourne shell +# though, so actually running these functions will sometimes place +# results into a variable named after the function, and often use +# temporary variables in the '^_G_' namespace. If you are careful to +# avoid using those namespaces casually in your sourcing script, things +# should continue to work as you expect. And, of course, you can freely +# overwrite any of the functions or variables defined here before +# calling anything to customize them. + +EXIT_SUCCESS=0 +EXIT_FAILURE=1 +EXIT_MISMATCH=63 # $? = 63 is used to indicate version mismatch to missing. +EXIT_SKIP=77 # $? = 77 is used to indicate a skipped test to automake. + +# Allow overriding, eg assuming that you follow the convention of +# putting '$debug_cmd' at the start of all your functions, you can get +# bash to show function call trace with: +# +# debug_cmd='eval echo "${FUNCNAME[0]} $*" >&2' bash your-script-name +debug_cmd=${debug_cmd-":"} +exit_cmd=: + +# By convention, finish your script with: +# +# exit $exit_status +# +# so that you can set exit_status to non-zero if you want to indicate +# something went wrong during execution without actually bailing out at +# the point of failure. 
+exit_status=$EXIT_SUCCESS + +# Work around backward compatibility issue on IRIX 6.5. On IRIX 6.4+, sh +# is ksh but when the shell is invoked as "sh" and the current value of +# the _XPG environment variable is not equal to 1 (one), the special +# positional parameter $0, within a function call, is the name of the +# function. +progpath=$0 + +# The name of this program. +progname=`$ECHO "$progpath" |$SED "$sed_basename"` + +# Make sure we have an absolute progpath for reexecution: +case $progpath in + [\\/]*|[A-Za-z]:\\*) ;; + *[\\/]*) + progdir=`$ECHO "$progpath" |$SED "$sed_dirname"` + progdir=`cd "$progdir" && pwd` + progpath=$progdir/$progname + ;; + *) + _G_IFS=$IFS + IFS=${PATH_SEPARATOR-:} + for progdir in $PATH; do + IFS=$_G_IFS + test -x "$progdir/$progname" && break + done + IFS=$_G_IFS + test -n "$progdir" || progdir=`pwd` + progpath=$progdir/$progname + ;; +esac + + +## ----------------- ## +## Standard options. ## +## ----------------- ## + +# The following options affect the operation of the functions defined +# below, and should be set appropriately depending on run-time para- +# meters passed on the command line. + +opt_dry_run=false +opt_quiet=false +opt_verbose=false + +# Categories 'all' and 'none' are always available. Append any others +# you will pass as the first argument to func_warning from your own +# code. +warning_categories= + +# By default, display warnings according to 'opt_warning_types'. Set +# 'warning_func' to ':' to elide all warnings, or func_fatal_error to +# treat the next displayed warning as a fatal error. +warning_func=func_warn_and_continue + +# Set to 'all' to display all warnings, 'none' to suppress all +# warnings, or a space delimited list of some subset of +# 'warning_categories' to display only the listed warnings. +opt_warning_types=all + + +## -------------------- ## +## Resource management. ## +## -------------------- ## + +# This section contains definitions for functions that each ensure a +# particular resource (a file, or a non-empty configuration variable for +# example) is available, and if appropriate to extract default values +# from pertinent package files. Call them using their associated +# 'require_*' variable to ensure that they are executed, at most, once. +# +# It's entirely deliberate that calling these functions can set +# variables that don't obey the namespace limitations obeyed by the rest +# of this file, in order that that they be as useful as possible to +# callers. + + +# require_term_colors +# ------------------- +# Allow display of bold text on terminals that support it. +require_term_colors=func_require_term_colors +func_require_term_colors () +{ + $debug_cmd + + test -t 1 && { + # COLORTERM and USE_ANSI_COLORS environment variables take + # precedence, because most terminfo databases neglect to describe + # whether color sequences are supported. + test -n "${COLORTERM+set}" && : ${USE_ANSI_COLORS="1"} + + if test 1 = "$USE_ANSI_COLORS"; then + # Standard ANSI escape sequences + tc_reset='' + tc_bold=''; tc_standout='' + tc_red=''; tc_green='' + tc_blue=''; tc_cyan='' + else + # Otherwise trust the terminfo database after all. 
+ test -n "`tput sgr0 2>/dev/null`" && { + tc_reset=`tput sgr0` + test -n "`tput bold 2>/dev/null`" && tc_bold=`tput bold` + tc_standout=$tc_bold + test -n "`tput smso 2>/dev/null`" && tc_standout=`tput smso` + test -n "`tput setaf 1 2>/dev/null`" && tc_red=`tput setaf 1` + test -n "`tput setaf 2 2>/dev/null`" && tc_green=`tput setaf 2` + test -n "`tput setaf 4 2>/dev/null`" && tc_blue=`tput setaf 4` + test -n "`tput setaf 5 2>/dev/null`" && tc_cyan=`tput setaf 5` + } + fi + } + + require_term_colors=: +} + + +## ----------------- ## +## Function library. ## +## ----------------- ## + +# This section contains a variety of useful functions to call in your +# scripts. Take note of the portable wrappers for features provided by +# some modern shells, which will fall back to slower equivalents on +# less featureful shells. + + +# func_append VAR VALUE +# --------------------- +# Append VALUE onto the existing contents of VAR. + + # We should try to minimise forks, especially on Windows where they are + # unreasonably slow, so skip the feature probes when bash or zsh are + # being used: + if test set = "${BASH_VERSION+set}${ZSH_VERSION+set}"; then + : ${_G_HAVE_ARITH_OP="yes"} + : ${_G_HAVE_XSI_OPS="yes"} + # The += operator was introduced in bash 3.1 + case $BASH_VERSION in + [12].* | 3.0 | 3.0*) ;; + *) + : ${_G_HAVE_PLUSEQ_OP="yes"} + ;; + esac + fi + + # _G_HAVE_PLUSEQ_OP + # Can be empty, in which case the shell is probed, "yes" if += is + # useable or anything else if it does not work. + test -z "$_G_HAVE_PLUSEQ_OP" \ + && (eval 'x=a; x+=" b"; test "a b" = "$x"') 2>/dev/null \ + && _G_HAVE_PLUSEQ_OP=yes + +if test yes = "$_G_HAVE_PLUSEQ_OP" +then + # This is an XSI compatible shell, allowing a faster implementation... + eval 'func_append () + { + $debug_cmd + + eval "$1+=\$2" + }' +else + # ...otherwise fall back to using expr, which is often a shell builtin. + func_append () + { + $debug_cmd + + eval "$1=\$$1\$2" + } +fi + + +# func_append_quoted VAR VALUE +# ---------------------------- +# Quote VALUE and append to the end of shell variable VAR, separated +# by a space. +if test yes = "$_G_HAVE_PLUSEQ_OP"; then + eval 'func_append_quoted () + { + $debug_cmd + + func_quote_for_eval "$2" + eval "$1+=\\ \$func_quote_for_eval_result" + }' +else + func_append_quoted () + { + $debug_cmd + + func_quote_for_eval "$2" + eval "$1=\$$1\\ \$func_quote_for_eval_result" + } +fi + + +# func_append_uniq VAR VALUE +# -------------------------- +# Append unique VALUE onto the existing contents of VAR, assuming +# entries are delimited by the first character of VALUE. For example: +# +# func_append_uniq options " --another-option option-argument" +# +# will only append to $options if " --another-option option-argument " +# is not already present somewhere in $options already (note spaces at +# each end implied by leading space in second argument). +func_append_uniq () +{ + $debug_cmd + + eval _G_current_value='`$ECHO $'$1'`' + _G_delim=`expr "$2" : '\(.\)'` + + case $_G_delim$_G_current_value$_G_delim in + *"$2$_G_delim"*) ;; + *) func_append "$@" ;; + esac +} + + +# func_arith TERM... +# ------------------ +# Set func_arith_result to the result of evaluating TERMs. 
+ test -z "$_G_HAVE_ARITH_OP" \ + && (eval 'test 2 = $(( 1 + 1 ))') 2>/dev/null \ + && _G_HAVE_ARITH_OP=yes + +if test yes = "$_G_HAVE_ARITH_OP"; then + eval 'func_arith () + { + $debug_cmd + + func_arith_result=$(( $* )) + }' +else + func_arith () + { + $debug_cmd + + func_arith_result=`expr "$@"` + } +fi + + +# func_basename FILE +# ------------------ +# Set func_basename_result to FILE with everything up to and including +# the last / stripped. +if test yes = "$_G_HAVE_XSI_OPS"; then + # If this shell supports suffix pattern removal, then use it to avoid + # forking. Hide the definitions single quotes in case the shell chokes + # on unsupported syntax... + _b='func_basename_result=${1##*/}' + _d='case $1 in + */*) func_dirname_result=${1%/*}$2 ;; + * ) func_dirname_result=$3 ;; + esac' + +else + # ...otherwise fall back to using sed. + _b='func_basename_result=`$ECHO "$1" |$SED "$sed_basename"`' + _d='func_dirname_result=`$ECHO "$1" |$SED "$sed_dirname"` + if test "X$func_dirname_result" = "X$1"; then + func_dirname_result=$3 + else + func_append func_dirname_result "$2" + fi' +fi + +eval 'func_basename () +{ + $debug_cmd + + '"$_b"' +}' + + +# func_dirname FILE APPEND NONDIR_REPLACEMENT +# ------------------------------------------- +# Compute the dirname of FILE. If nonempty, add APPEND to the result, +# otherwise set result to NONDIR_REPLACEMENT. +eval 'func_dirname () +{ + $debug_cmd + + '"$_d"' +}' + + +# func_dirname_and_basename FILE APPEND NONDIR_REPLACEMENT +# -------------------------------------------------------- +# Perform func_basename and func_dirname in a single function +# call: +# dirname: Compute the dirname of FILE. If nonempty, +# add APPEND to the result, otherwise set result +# to NONDIR_REPLACEMENT. +# value returned in "$func_dirname_result" +# basename: Compute filename of FILE. +# value retuned in "$func_basename_result" +# For efficiency, we do not delegate to the functions above but instead +# duplicate the functionality here. +eval 'func_dirname_and_basename () +{ + $debug_cmd + + '"$_b"' + '"$_d"' +}' + + +# func_echo ARG... +# ---------------- +# Echo program name prefixed message. +func_echo () +{ + $debug_cmd + + _G_message=$* + + func_echo_IFS=$IFS + IFS=$nl + for _G_line in $_G_message; do + IFS=$func_echo_IFS + $ECHO "$progname: $_G_line" + done + IFS=$func_echo_IFS +} + + +# func_echo_all ARG... +# -------------------- +# Invoke $ECHO with all args, space-separated. +func_echo_all () +{ + $ECHO "$*" +} + + +# func_echo_infix_1 INFIX ARG... +# ------------------------------ +# Echo program name, followed by INFIX on the first line, with any +# additional lines not showing INFIX. +func_echo_infix_1 () +{ + $debug_cmd + + $require_term_colors + + _G_infix=$1; shift + _G_indent=$_G_infix + _G_prefix="$progname: $_G_infix: " + _G_message=$* + + # Strip color escape sequences before counting printable length + for _G_tc in "$tc_reset" "$tc_bold" "$tc_standout" "$tc_red" "$tc_green" "$tc_blue" "$tc_cyan" + do + test -n "$_G_tc" && { + _G_esc_tc=`$ECHO "$_G_tc" | $SED "$sed_make_literal_regex"` + _G_indent=`$ECHO "$_G_indent" | $SED "s|$_G_esc_tc||g"` + } + done + _G_indent="$progname: "`echo "$_G_indent" | $SED 's|.| |g'`" " ## exclude from sc_prohibit_nested_quotes + + func_echo_infix_1_IFS=$IFS + IFS=$nl + for _G_line in $_G_message; do + IFS=$func_echo_infix_1_IFS + $ECHO "$_G_prefix$tc_bold$_G_line$tc_reset" >&2 + _G_prefix=$_G_indent + done + IFS=$func_echo_infix_1_IFS +} + + +# func_error ARG... 
+# ----------------- +# Echo program name prefixed message to standard error. +func_error () +{ + $debug_cmd + + $require_term_colors + + func_echo_infix_1 " $tc_standout${tc_red}error$tc_reset" "$*" >&2 +} + + +# func_fatal_error ARG... +# ----------------------- +# Echo program name prefixed message to standard error, and exit. +func_fatal_error () +{ + $debug_cmd + + func_error "$*" + exit $EXIT_FAILURE +} + + +# func_grep EXPRESSION FILENAME +# ----------------------------- +# Check whether EXPRESSION matches any line of FILENAME, without output. +func_grep () +{ + $debug_cmd + + $GREP "$1" "$2" >/dev/null 2>&1 +} + + +# func_len STRING +# --------------- +# Set func_len_result to the length of STRING. STRING may not +# start with a hyphen. + test -z "$_G_HAVE_XSI_OPS" \ + && (eval 'x=a/b/c; + test 5aa/bb/cc = "${#x}${x%%/*}${x%/*}${x#*/}${x##*/}"') 2>/dev/null \ + && _G_HAVE_XSI_OPS=yes + +if test yes = "$_G_HAVE_XSI_OPS"; then + eval 'func_len () + { + $debug_cmd + + func_len_result=${#1} + }' +else + func_len () + { + $debug_cmd + + func_len_result=`expr "$1" : ".*" 2>/dev/null || echo $max_cmd_len` + } +fi + + +# func_mkdir_p DIRECTORY-PATH +# --------------------------- +# Make sure the entire path to DIRECTORY-PATH is available. +func_mkdir_p () +{ + $debug_cmd + + _G_directory_path=$1 + _G_dir_list= + + if test -n "$_G_directory_path" && test : != "$opt_dry_run"; then + + # Protect directory names starting with '-' + case $_G_directory_path in + -*) _G_directory_path=./$_G_directory_path ;; + esac + + # While some portion of DIR does not yet exist... + while test ! -d "$_G_directory_path"; do + # ...make a list in topmost first order. Use a colon delimited + # list incase some portion of path contains whitespace. + _G_dir_list=$_G_directory_path:$_G_dir_list + + # If the last portion added has no slash in it, the list is done + case $_G_directory_path in */*) ;; *) break ;; esac + + # ...otherwise throw away the child directory and loop + _G_directory_path=`$ECHO "$_G_directory_path" | $SED -e "$sed_dirname"` + done + _G_dir_list=`$ECHO "$_G_dir_list" | $SED 's|:*$||'` + + func_mkdir_p_IFS=$IFS; IFS=: + for _G_dir in $_G_dir_list; do + IFS=$func_mkdir_p_IFS + # mkdir can fail with a 'File exist' error if two processes + # try to create one of the directories concurrently. Don't + # stop in that case! + $MKDIR "$_G_dir" 2>/dev/null || : + done + IFS=$func_mkdir_p_IFS + + # Bail out if we (or some other process) failed to create a directory. + test -d "$_G_directory_path" || \ + func_fatal_error "Failed to create '$1'" + fi +} + + +# func_mktempdir [BASENAME] +# ------------------------- +# Make a temporary directory that won't clash with other running +# libtool processes, and avoids race conditions if possible. If +# given, BASENAME is the basename for that directory. +func_mktempdir () +{ + $debug_cmd + + _G_template=${TMPDIR-/tmp}/${1-$progname} + + if test : = "$opt_dry_run"; then + # Return a directory name, but don't create it in dry-run mode + _G_tmpdir=$_G_template-$$ + else + + # If mktemp works, use that first and foremost + _G_tmpdir=`mktemp -d "$_G_template-XXXXXXXX" 2>/dev/null` + + if test ! 
-d "$_G_tmpdir"; then + # Failing that, at least try and use $RANDOM to avoid a race + _G_tmpdir=$_G_template-${RANDOM-0}$$ + + func_mktempdir_umask=`umask` + umask 0077 + $MKDIR "$_G_tmpdir" + umask $func_mktempdir_umask + fi + + # If we're not in dry-run mode, bomb out on failure + test -d "$_G_tmpdir" || \ + func_fatal_error "cannot create temporary directory '$_G_tmpdir'" + fi + + $ECHO "$_G_tmpdir" +} + + +# func_normal_abspath PATH +# ------------------------ +# Remove doubled-up and trailing slashes, "." path components, +# and cancel out any ".." path components in PATH after making +# it an absolute path. +func_normal_abspath () +{ + $debug_cmd + + # These SED scripts presuppose an absolute path with a trailing slash. + _G_pathcar='s|^/\([^/]*\).*$|\1|' + _G_pathcdr='s|^/[^/]*||' + _G_removedotparts=':dotsl + s|/\./|/|g + t dotsl + s|/\.$|/|' + _G_collapseslashes='s|/\{1,\}|/|g' + _G_finalslash='s|/*$|/|' + + # Start from root dir and reassemble the path. + func_normal_abspath_result= + func_normal_abspath_tpath=$1 + func_normal_abspath_altnamespace= + case $func_normal_abspath_tpath in + "") + # Empty path, that just means $cwd. + func_stripname '' '/' "`pwd`" + func_normal_abspath_result=$func_stripname_result + return + ;; + # The next three entries are used to spot a run of precisely + # two leading slashes without using negated character classes; + # we take advantage of case's first-match behaviour. + ///*) + # Unusual form of absolute path, do nothing. + ;; + //*) + # Not necessarily an ordinary path; POSIX reserves leading '//' + # and for example Cygwin uses it to access remote file shares + # over CIFS/SMB, so we conserve a leading double slash if found. + func_normal_abspath_altnamespace=/ + ;; + /*) + # Absolute path, do nothing. + ;; + *) + # Relative path, prepend $cwd. + func_normal_abspath_tpath=`pwd`/$func_normal_abspath_tpath + ;; + esac + + # Cancel out all the simple stuff to save iterations. We also want + # the path to end with a slash for ease of parsing, so make sure + # there is one (and only one) here. + func_normal_abspath_tpath=`$ECHO "$func_normal_abspath_tpath" | $SED \ + -e "$_G_removedotparts" -e "$_G_collapseslashes" -e "$_G_finalslash"` + while :; do + # Processed it all yet? + if test / = "$func_normal_abspath_tpath"; then + # If we ascended to the root using ".." the result may be empty now. + if test -z "$func_normal_abspath_result"; then + func_normal_abspath_result=/ + fi + break + fi + func_normal_abspath_tcomponent=`$ECHO "$func_normal_abspath_tpath" | $SED \ + -e "$_G_pathcar"` + func_normal_abspath_tpath=`$ECHO "$func_normal_abspath_tpath" | $SED \ + -e "$_G_pathcdr"` + # Figure out what to do with it + case $func_normal_abspath_tcomponent in + "") + # Trailing empty path component, ignore it. + ;; + ..) + # Parent dir; strip last assembled component from result. + func_dirname "$func_normal_abspath_result" + func_normal_abspath_result=$func_dirname_result + ;; + *) + # Actual path component, append it. + func_append func_normal_abspath_result "/$func_normal_abspath_tcomponent" + ;; + esac + done + # Restore leading double-slash if one was found on entry. + func_normal_abspath_result=$func_normal_abspath_altnamespace$func_normal_abspath_result +} + + +# func_notquiet ARG... +# -------------------- +# Echo program name prefixed message only when not in quiet mode. 
+func_notquiet () +{ + $debug_cmd + + $opt_quiet || func_echo ${1+"$@"} + + # A bug in bash halts the script if the last line of a function + # fails when set -e is in force, so we need another command to + # work around that: + : +} + + +# func_relative_path SRCDIR DSTDIR +# -------------------------------- +# Set func_relative_path_result to the relative path from SRCDIR to DSTDIR. +func_relative_path () +{ + $debug_cmd + + func_relative_path_result= + func_normal_abspath "$1" + func_relative_path_tlibdir=$func_normal_abspath_result + func_normal_abspath "$2" + func_relative_path_tbindir=$func_normal_abspath_result + + # Ascend the tree starting from libdir + while :; do + # check if we have found a prefix of bindir + case $func_relative_path_tbindir in + $func_relative_path_tlibdir) + # found an exact match + func_relative_path_tcancelled= + break + ;; + $func_relative_path_tlibdir*) + # found a matching prefix + func_stripname "$func_relative_path_tlibdir" '' "$func_relative_path_tbindir" + func_relative_path_tcancelled=$func_stripname_result + if test -z "$func_relative_path_result"; then + func_relative_path_result=. + fi + break + ;; + *) + func_dirname $func_relative_path_tlibdir + func_relative_path_tlibdir=$func_dirname_result + if test -z "$func_relative_path_tlibdir"; then + # Have to descend all the way to the root! + func_relative_path_result=../$func_relative_path_result + func_relative_path_tcancelled=$func_relative_path_tbindir + break + fi + func_relative_path_result=../$func_relative_path_result + ;; + esac + done + + # Now calculate path; take care to avoid doubling-up slashes. + func_stripname '' '/' "$func_relative_path_result" + func_relative_path_result=$func_stripname_result + func_stripname '/' '/' "$func_relative_path_tcancelled" + if test -n "$func_stripname_result"; then + func_append func_relative_path_result "/$func_stripname_result" + fi + + # Normalisation. If bindir is libdir, return '.' else relative path. + if test -n "$func_relative_path_result"; then + func_stripname './' '' "$func_relative_path_result" + func_relative_path_result=$func_stripname_result + fi + + test -n "$func_relative_path_result" || func_relative_path_result=. + + : +} + + +# func_quote_for_eval ARG... +# -------------------------- +# Aesthetically quote ARGs to be evaled later. +# This function returns two values: +# i) func_quote_for_eval_result +# double-quoted, suitable for a subsequent eval +# ii) func_quote_for_eval_unquoted_result +# has all characters that are still active within double +# quotes backslashified. +func_quote_for_eval () +{ + $debug_cmd + + func_quote_for_eval_unquoted_result= + func_quote_for_eval_result= + while test 0 -lt $#; do + case $1 in + *[\\\`\"\$]*) + _G_unquoted_arg=`printf '%s\n' "$1" |$SED "$sed_quote_subst"` ;; + *) + _G_unquoted_arg=$1 ;; + esac + if test -n "$func_quote_for_eval_unquoted_result"; then + func_append func_quote_for_eval_unquoted_result " $_G_unquoted_arg" + else + func_append func_quote_for_eval_unquoted_result "$_G_unquoted_arg" + fi + + case $_G_unquoted_arg in + # Double-quote args containing shell metacharacters to delay + # word splitting, command substitution and variable expansion + # for a subsequent eval. + # Many Bourne shells cannot handle close brackets correctly + # in scan sets, so we specify it separately. 
+ *[\[\~\#\^\&\*\(\)\{\}\|\;\<\>\?\'\ \ ]*|*]*|"") + _G_quoted_arg=\"$_G_unquoted_arg\" + ;; + *) + _G_quoted_arg=$_G_unquoted_arg + ;; + esac + + if test -n "$func_quote_for_eval_result"; then + func_append func_quote_for_eval_result " $_G_quoted_arg" + else + func_append func_quote_for_eval_result "$_G_quoted_arg" + fi + shift + done +} + + +# func_quote_for_expand ARG +# ------------------------- +# Aesthetically quote ARG to be evaled later; same as above, +# but do not quote variable references. +func_quote_for_expand () +{ + $debug_cmd + + case $1 in + *[\\\`\"]*) + _G_arg=`$ECHO "$1" | $SED \ + -e "$sed_double_quote_subst" -e "$sed_double_backslash"` ;; + *) + _G_arg=$1 ;; + esac + + case $_G_arg in + # Double-quote args containing shell metacharacters to delay + # word splitting and command substitution for a subsequent eval. + # Many Bourne shells cannot handle close brackets correctly + # in scan sets, so we specify it separately. + *[\[\~\#\^\&\*\(\)\{\}\|\;\<\>\?\'\ \ ]*|*]*|"") + _G_arg=\"$_G_arg\" + ;; + esac + + func_quote_for_expand_result=$_G_arg +} + + +# func_stripname PREFIX SUFFIX NAME +# --------------------------------- +# strip PREFIX and SUFFIX from NAME, and store in func_stripname_result. +# PREFIX and SUFFIX must not contain globbing or regex special +# characters, hashes, percent signs, but SUFFIX may contain a leading +# dot (in which case that matches only a dot). +if test yes = "$_G_HAVE_XSI_OPS"; then + eval 'func_stripname () + { + $debug_cmd + + # pdksh 5.2.14 does not do ${X%$Y} correctly if both X and Y are + # positional parameters, so assign one to ordinary variable first. + func_stripname_result=$3 + func_stripname_result=${func_stripname_result#"$1"} + func_stripname_result=${func_stripname_result%"$2"} + }' +else + func_stripname () + { + $debug_cmd + + case $2 in + .*) func_stripname_result=`$ECHO "$3" | $SED -e "s%^$1%%" -e "s%\\\\$2\$%%"`;; + *) func_stripname_result=`$ECHO "$3" | $SED -e "s%^$1%%" -e "s%$2\$%%"`;; + esac + } +fi + + +# func_show_eval CMD [FAIL_EXP] +# ----------------------------- +# Unless opt_quiet is true, then output CMD. Then, if opt_dryrun is +# not true, evaluate CMD. If the evaluation of CMD fails, and FAIL_EXP +# is given, then evaluate it. +func_show_eval () +{ + $debug_cmd + + _G_cmd=$1 + _G_fail_exp=${2-':'} + + func_quote_for_expand "$_G_cmd" + eval "func_notquiet $func_quote_for_expand_result" + + $opt_dry_run || { + eval "$_G_cmd" + _G_status=$? + if test 0 -ne "$_G_status"; then + eval "(exit $_G_status); $_G_fail_exp" + fi + } +} + + +# func_show_eval_locale CMD [FAIL_EXP] +# ------------------------------------ +# Unless opt_quiet is true, then output CMD. Then, if opt_dryrun is +# not true, evaluate CMD. If the evaluation of CMD fails, and FAIL_EXP +# is given, then evaluate it. Use the saved locale for evaluation. +func_show_eval_locale () +{ + $debug_cmd + + _G_cmd=$1 + _G_fail_exp=${2-':'} + + $opt_quiet || { + func_quote_for_expand "$_G_cmd" + eval "func_echo $func_quote_for_expand_result" + } + + $opt_dry_run || { + eval "$_G_user_locale + $_G_cmd" + _G_status=$? + eval "$_G_safe_locale" + if test 0 -ne "$_G_status"; then + eval "(exit $_G_status); $_G_fail_exp" + fi + } +} + + +# func_tr_sh +# ---------- +# Turn $1 into a string suitable for a shell variable name. +# Result is stored in $func_tr_sh_result. All characters +# not in the set a-zA-Z0-9_ are replaced with '_'. Further, +# if $1 begins with a digit, a '_' is prepended as well. 
+func_tr_sh () +{ + $debug_cmd + + case $1 in + [0-9]* | *[!a-zA-Z0-9_]*) + func_tr_sh_result=`$ECHO "$1" | $SED -e 's/^\([0-9]\)/_\1/' -e 's/[^a-zA-Z0-9_]/_/g'` + ;; + * ) + func_tr_sh_result=$1 + ;; + esac +} + + +# func_verbose ARG... +# ------------------- +# Echo program name prefixed message in verbose mode only. +func_verbose () +{ + $debug_cmd + + $opt_verbose && func_echo "$*" + + : +} + + +# func_warn_and_continue ARG... +# ----------------------------- +# Echo program name prefixed warning message to standard error. +func_warn_and_continue () +{ + $debug_cmd + + $require_term_colors + + func_echo_infix_1 "${tc_red}warning$tc_reset" "$*" >&2 +} + + +# func_warning CATEGORY ARG... +# ---------------------------- +# Echo program name prefixed warning message to standard error. Warning +# messages can be filtered according to CATEGORY, where this function +# elides messages where CATEGORY is not listed in the global variable +# 'opt_warning_types'. +func_warning () +{ + $debug_cmd + + # CATEGORY must be in the warning_categories list! + case " $warning_categories " in + *" $1 "*) ;; + *) func_internal_error "invalid warning category '$1'" ;; + esac + + _G_category=$1 + shift + + case " $opt_warning_types " in + *" $_G_category "*) $warning_func ${1+"$@"} ;; + esac +} + + +# func_sort_ver VER1 VER2 +# ----------------------- +# 'sort -V' is not generally available. +# Note this deviates from the version comparison in automake +# in that it treats 1.5 < 1.5.0, and treats 1.4.4a < 1.4-p3a +# but this should suffice as we won't be specifying old +# version formats or redundant trailing .0 in bootstrap.conf. +# If we did want full compatibility then we should probably +# use m4_version_compare from autoconf. +func_sort_ver () +{ + $debug_cmd + + printf '%s\n%s\n' "$1" "$2" \ + | sort -t. -k 1,1n -k 2,2n -k 3,3n -k 4,4n -k 5,5n -k 6,6n -k 7,7n -k 8,8n -k 9,9n +} + +# func_lt_ver PREV CURR +# --------------------- +# Return true if PREV and CURR are in the correct order according to +# func_sort_ver, otherwise false. Use it like this: +# +# func_lt_ver "$prev_ver" "$proposed_ver" || func_fatal_error "..." +func_lt_ver () +{ + $debug_cmd + + test "x$1" = x`func_sort_ver "$1" "$2" | $SED 1q` +} + + +# Local variables: +# mode: shell-script +# sh-indentation: 2 +# eval: (add-hook 'before-save-hook 'time-stamp) +# time-stamp-pattern: "10/scriptversion=%:y-%02m-%02d.%02H; # UTC" +# time-stamp-time-zone: "UTC" +# End: +#! /bin/sh + +# Set a version string for this script. +scriptversion=2014-01-07.03; # UTC + +# A portable, pluggable option parser for Bourne shell. +# Written by Gary V. Vaughan, 2010 + +# Copyright (C) 2010-2015 Free Software Foundation, Inc. +# This is free software; see the source for copying conditions. There is NO +# warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. + +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +# Please report bugs or propose patches to gary@gnu.org. 
+ + +## ------ ## +## Usage. ## +## ------ ## + +# This file is a library for parsing options in your shell scripts along +# with assorted other useful supporting features that you can make use +# of too. +# +# For the simplest scripts you might need only: +# +# #!/bin/sh +# . relative/path/to/funclib.sh +# . relative/path/to/options-parser +# scriptversion=1.0 +# func_options ${1+"$@"} +# eval set dummy "$func_options_result"; shift +# ...rest of your script... +# +# In order for the '--version' option to work, you will need to have a +# suitably formatted comment like the one at the top of this file +# starting with '# Written by ' and ending with '# warranty; '. +# +# For '-h' and '--help' to work, you will also need a one line +# description of your script's purpose in a comment directly above the +# '# Written by ' line, like the one at the top of this file. +# +# The default options also support '--debug', which will turn on shell +# execution tracing (see the comment above debug_cmd below for another +# use), and '--verbose' and the func_verbose function to allow your script +# to display verbose messages only when your user has specified +# '--verbose'. +# +# After sourcing this file, you can plug processing for additional +# options by amending the variables from the 'Configuration' section +# below, and following the instructions in the 'Option parsing' +# section further down. + +## -------------- ## +## Configuration. ## +## -------------- ## + +# You should override these variables in your script after sourcing this +# file so that they reflect the customisations you have added to the +# option parser. + +# The usage line for option parsing errors and the start of '-h' and +# '--help' output messages. You can embed shell variables for delayed +# expansion at the time the message is displayed, but you will need to +# quote other shell meta-characters carefully to prevent them being +# expanded when the contents are evaled. +usage='$progpath [OPTION]...' + +# Short help message in response to '-h' and '--help'. Add to this or +# override it after sourcing this library to reflect the full set of +# options your script accepts. +usage_message="\ + --debug enable verbose shell tracing + -W, --warnings=CATEGORY + report the warnings falling in CATEGORY [all] + -v, --verbose verbosely report processing + --version print version information and exit + -h, --help print short or long help message and exit +" + +# Additional text appended to 'usage_message' in response to '--help'. +long_help_message=" +Warning categories include: + 'all' show all warnings + 'none' turn off all the warnings + 'error' warnings are treated as fatal errors" + +# Help message printed before fatal option parsing errors. +fatal_help="Try '\$progname --help' for more information." + + + +## ------------------------- ## +## Hook function management. ## +## ------------------------- ## + +# This section contains functions for adding, removing, and running hooks +# to the main code. A hook is just a named list of of function, that can +# be run in order later on. + +# func_hookable FUNC_NAME +# ----------------------- +# Declare that FUNC_NAME will run hooks added with +# 'func_add_hook FUNC_NAME ...'. +func_hookable () +{ + $debug_cmd + + func_append hookable_fns " $1" +} + + +# func_add_hook FUNC_NAME HOOK_FUNC +# --------------------------------- +# Request that FUNC_NAME call HOOK_FUNC before it returns. FUNC_NAME must +# first have been declared "hookable" by a call to 'func_hookable'. 
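+#
+# For example (illustrative, mirroring the sample hooks shown in the
+# 'Option parsing' section below):
+#
+#    func_add_hook func_options_prep my_options_prep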
+func_add_hook () +{ + $debug_cmd + + case " $hookable_fns " in + *" $1 "*) ;; + *) func_fatal_error "'$1' does not accept hook functions." ;; + esac + + eval func_append ${1}_hooks '" $2"' +} + + +# func_remove_hook FUNC_NAME HOOK_FUNC +# ------------------------------------ +# Remove HOOK_FUNC from the list of functions called by FUNC_NAME. +func_remove_hook () +{ + $debug_cmd + + eval ${1}_hooks='`$ECHO "\$'$1'_hooks" |$SED "s| '$2'||"`' +} + + +# func_run_hooks FUNC_NAME [ARG]... +# --------------------------------- +# Run all hook functions registered to FUNC_NAME. +# It is assumed that the list of hook functions contains nothing more +# than a whitespace-delimited list of legal shell function names, and +# no effort is wasted trying to catch shell meta-characters or preserve +# whitespace. +func_run_hooks () +{ + $debug_cmd + + case " $hookable_fns " in + *" $1 "*) ;; + *) func_fatal_error "'$1' does not support hook funcions.n" ;; + esac + + eval _G_hook_fns=\$$1_hooks; shift + + for _G_hook in $_G_hook_fns; do + eval $_G_hook '"$@"' + + # store returned options list back into positional + # parameters for next 'cmd' execution. + eval _G_hook_result=\$${_G_hook}_result + eval set dummy "$_G_hook_result"; shift + done + + func_quote_for_eval ${1+"$@"} + func_run_hooks_result=$func_quote_for_eval_result +} + + + +## --------------- ## +## Option parsing. ## +## --------------- ## + +# In order to add your own option parsing hooks, you must accept the +# full positional parameter list in your hook function, remove any +# options that you action, and then pass back the remaining unprocessed +# options in '_result', escaped suitably for +# 'eval'. Like this: +# +# my_options_prep () +# { +# $debug_cmd +# +# # Extend the existing usage message. +# usage_message=$usage_message' +# -s, --silent don'\''t print informational messages +# ' +# +# func_quote_for_eval ${1+"$@"} +# my_options_prep_result=$func_quote_for_eval_result +# } +# func_add_hook func_options_prep my_options_prep +# +# +# my_silent_option () +# { +# $debug_cmd +# +# # Note that for efficiency, we parse as many options as we can +# # recognise in a loop before passing the remainder back to the +# # caller on the first unrecognised argument we encounter. +# while test $# -gt 0; do +# opt=$1; shift +# case $opt in +# --silent|-s) opt_silent=: ;; +# # Separate non-argument short options: +# -s*) func_split_short_opt "$_G_opt" +# set dummy "$func_split_short_opt_name" \ +# "-$func_split_short_opt_arg" ${1+"$@"} +# shift +# ;; +# *) set dummy "$_G_opt" "$*"; shift; break ;; +# esac +# done +# +# func_quote_for_eval ${1+"$@"} +# my_silent_option_result=$func_quote_for_eval_result +# } +# func_add_hook func_parse_options my_silent_option +# +# +# my_option_validation () +# { +# $debug_cmd +# +# $opt_silent && $opt_verbose && func_fatal_help "\ +# '--silent' and '--verbose' options are mutually exclusive." +# +# func_quote_for_eval ${1+"$@"} +# my_option_validation_result=$func_quote_for_eval_result +# } +# func_add_hook func_validate_options my_option_validation +# +# You'll alse need to manually amend $usage_message to reflect the extra +# options you parse. It's preferable to append if you can, so that +# multiple option parsing hooks can be added safely. + + +# func_options [ARG]... +# --------------------- +# All the functions called inside func_options are hookable. See the +# individual implementations for details. 
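+#
+# As the 'Usage' notes above show, a client script normally calls it
+# once and then restores the returned argument list (illustrative):
+#
+#    func_options ${1+"$@"}
+#    eval set dummy "$func_options_result"; shift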
+func_hookable func_options +func_options () +{ + $debug_cmd + + func_options_prep ${1+"$@"} + eval func_parse_options \ + ${func_options_prep_result+"$func_options_prep_result"} + eval func_validate_options \ + ${func_parse_options_result+"$func_parse_options_result"} + + eval func_run_hooks func_options \ + ${func_validate_options_result+"$func_validate_options_result"} + + # save modified positional parameters for caller + func_options_result=$func_run_hooks_result +} + + +# func_options_prep [ARG]... +# -------------------------- +# All initialisations required before starting the option parse loop. +# Note that when calling hook functions, we pass through the list of +# positional parameters. If a hook function modifies that list, and +# needs to propogate that back to rest of this script, then the complete +# modified list must be put in 'func_run_hooks_result' before +# returning. +func_hookable func_options_prep +func_options_prep () +{ + $debug_cmd + + # Option defaults: + opt_verbose=false + opt_warning_types= + + func_run_hooks func_options_prep ${1+"$@"} + + # save modified positional parameters for caller + func_options_prep_result=$func_run_hooks_result +} + + +# func_parse_options [ARG]... +# --------------------------- +# The main option parsing loop. +func_hookable func_parse_options +func_parse_options () +{ + $debug_cmd + + func_parse_options_result= + + # this just eases exit handling + while test $# -gt 0; do + # Defer to hook functions for initial option parsing, so they + # get priority in the event of reusing an option name. + func_run_hooks func_parse_options ${1+"$@"} + + # Adjust func_parse_options positional parameters to match + eval set dummy "$func_run_hooks_result"; shift + + # Break out of the loop if we already parsed every option. + test $# -gt 0 || break + + _G_opt=$1 + shift + case $_G_opt in + --debug|-x) debug_cmd='set -x' + func_echo "enabling shell trace mode" + $debug_cmd + ;; + + --no-warnings|--no-warning|--no-warn) + set dummy --warnings none ${1+"$@"} + shift + ;; + + --warnings|--warning|-W) + test $# = 0 && func_missing_arg $_G_opt && break + case " $warning_categories $1" in + *" $1 "*) + # trailing space prevents matching last $1 above + func_append_uniq opt_warning_types " $1" + ;; + *all) + opt_warning_types=$warning_categories + ;; + *none) + opt_warning_types=none + warning_func=: + ;; + *error) + opt_warning_types=$warning_categories + warning_func=func_fatal_error + ;; + *) + func_fatal_error \ + "unsupported warning category: '$1'" + ;; + esac + shift + ;; + + --verbose|-v) opt_verbose=: ;; + --version) func_version ;; + -\?|-h) func_usage ;; + --help) func_help ;; + + # Separate optargs to long options (plugins may need this): + --*=*) func_split_equals "$_G_opt" + set dummy "$func_split_equals_lhs" \ + "$func_split_equals_rhs" ${1+"$@"} + shift + ;; + + # Separate optargs to short options: + -W*) + func_split_short_opt "$_G_opt" + set dummy "$func_split_short_opt_name" \ + "$func_split_short_opt_arg" ${1+"$@"} + shift + ;; + + # Separate non-argument short options: + -\?*|-h*|-v*|-x*) + func_split_short_opt "$_G_opt" + set dummy "$func_split_short_opt_name" \ + "-$func_split_short_opt_arg" ${1+"$@"} + shift + ;; + + --) break ;; + -*) func_fatal_help "unrecognised option: '$_G_opt'" ;; + *) set dummy "$_G_opt" ${1+"$@"}; shift; break ;; + esac + done + + # save modified positional parameters for caller + func_quote_for_eval ${1+"$@"} + func_parse_options_result=$func_quote_for_eval_result +} + + +# func_validate_options [ARG]... 
+# ------------------------------ +# Perform any sanity checks on option settings and/or unconsumed +# arguments. +func_hookable func_validate_options +func_validate_options () +{ + $debug_cmd + + # Display all warnings if -W was not given. + test -n "$opt_warning_types" || opt_warning_types=" $warning_categories" + + func_run_hooks func_validate_options ${1+"$@"} + + # Bail if the options were screwed! + $exit_cmd $EXIT_FAILURE + + # save modified positional parameters for caller + func_validate_options_result=$func_run_hooks_result +} + + + +## ----------------- ## +## Helper functions. ## +## ----------------- ## + +# This section contains the helper functions used by the rest of the +# hookable option parser framework in ascii-betical order. + + +# func_fatal_help ARG... +# ---------------------- +# Echo program name prefixed message to standard error, followed by +# a help hint, and exit. +func_fatal_help () +{ + $debug_cmd + + eval \$ECHO \""Usage: $usage"\" + eval \$ECHO \""$fatal_help"\" + func_error ${1+"$@"} + exit $EXIT_FAILURE +} + + +# func_help +# --------- +# Echo long help message to standard output and exit. +func_help () +{ + $debug_cmd + + func_usage_message + $ECHO "$long_help_message" + exit 0 +} + + +# func_missing_arg ARGNAME +# ------------------------ +# Echo program name prefixed message to standard error and set global +# exit_cmd. +func_missing_arg () +{ + $debug_cmd + + func_error "Missing argument for '$1'." + exit_cmd=exit +} + + +# func_split_equals STRING +# ------------------------ +# Set func_split_equals_lhs and func_split_equals_rhs shell variables after +# splitting STRING at the '=' sign. +test -z "$_G_HAVE_XSI_OPS" \ + && (eval 'x=a/b/c; + test 5aa/bb/cc = "${#x}${x%%/*}${x%/*}${x#*/}${x##*/}"') 2>/dev/null \ + && _G_HAVE_XSI_OPS=yes + +if test yes = "$_G_HAVE_XSI_OPS" +then + # This is an XSI compatible shell, allowing a faster implementation... + eval 'func_split_equals () + { + $debug_cmd + + func_split_equals_lhs=${1%%=*} + func_split_equals_rhs=${1#*=} + test "x$func_split_equals_lhs" = "x$1" \ + && func_split_equals_rhs= + }' +else + # ...otherwise fall back to using expr, which is often a shell builtin. + func_split_equals () + { + $debug_cmd + + func_split_equals_lhs=`expr "x$1" : 'x\([^=]*\)'` + func_split_equals_rhs= + test "x$func_split_equals_lhs" = "x$1" \ + || func_split_equals_rhs=`expr "x$1" : 'x[^=]*=\(.*\)$'` + } +fi #func_split_equals + + +# func_split_short_opt SHORTOPT +# ----------------------------- +# Set func_split_short_opt_name and func_split_short_opt_arg shell +# variables after splitting SHORTOPT after the 2nd character. +if test yes = "$_G_HAVE_XSI_OPS" +then + # This is an XSI compatible shell, allowing a faster implementation... + eval 'func_split_short_opt () + { + $debug_cmd + + func_split_short_opt_arg=${1#??} + func_split_short_opt_name=${1%"$func_split_short_opt_arg"} + }' +else + # ...otherwise fall back to using expr, which is often a shell builtin. + func_split_short_opt () + { + $debug_cmd + + func_split_short_opt_name=`expr "x$1" : 'x-\(.\)'` + func_split_short_opt_arg=`expr "x$1" : 'x-.\(.*\)$'` + } +fi #func_split_short_opt + + +# func_usage +# ---------- +# Echo short help message to standard output and exit. +func_usage () +{ + $debug_cmd + + func_usage_message + $ECHO "Run '$progname --help |${PAGER-more}' for full usage" + exit 0 +} + + +# func_usage_message +# ------------------ +# Echo short help message to standard output. 
+func_usage_message () +{ + $debug_cmd + + eval \$ECHO \""Usage: $usage"\" + echo + $SED -n 's|^# || + /^Written by/{ + x;p;x + } + h + /^Written by/q' < "$progpath" + echo + eval \$ECHO \""$usage_message"\" +} + + +# func_version +# ------------ +# Echo version message to standard output and exit. +func_version () +{ + $debug_cmd + + printf '%s\n' "$progname $scriptversion" + $SED -n ' + /(C)/!b go + :more + /\./!{ + N + s|\n# | | + b more + } + :go + /^# Written by /,/# warranty; / { + s|^# || + s|^# *$|| + s|\((C)\)[ 0-9,-]*[ ,-]\([1-9][0-9]* \)|\1 \2| + p + } + /^# Written by / { + s|^# || + p + } + /^warranty; /q' < "$progpath" + + exit $? +} + + +# Local variables: +# mode: shell-script +# sh-indentation: 2 +# eval: (add-hook 'before-save-hook 'time-stamp) +# time-stamp-pattern: "10/scriptversion=%:y-%02m-%02d.%02H; # UTC" +# time-stamp-time-zone: "UTC" +# End: + +# Set a version string. +scriptversion='(GNU libtool) 2.4.6' + + +# func_echo ARG... +# ---------------- +# Libtool also displays the current mode in messages, so override +# funclib.sh func_echo with this custom definition. +func_echo () +{ + $debug_cmd + + _G_message=$* + + func_echo_IFS=$IFS + IFS=$nl + for _G_line in $_G_message; do + IFS=$func_echo_IFS + $ECHO "$progname${opt_mode+: $opt_mode}: $_G_line" + done + IFS=$func_echo_IFS +} + + +# func_warning ARG... +# ------------------- +# Libtool warnings are not categorized, so override funclib.sh +# func_warning with this simpler definition. +func_warning () +{ + $debug_cmd + + $warning_func ${1+"$@"} +} + + +## ---------------- ## +## Options parsing. ## +## ---------------- ## + +# Hook in the functions to make sure our own options are parsed during +# the option parsing loop. + +usage='$progpath [OPTION]... [MODE-ARG]...' + +# Short help message in response to '-h'. +usage_message="Options: + --config show all configuration variables + --debug enable verbose shell tracing + -n, --dry-run display commands without modifying any files + --features display basic configuration information and exit + --mode=MODE use operation mode MODE + --no-warnings equivalent to '-Wnone' + --preserve-dup-deps don't remove duplicate dependency libraries + --quiet, --silent don't print informational messages + --tag=TAG use configuration variables from tag TAG + -v, --verbose print more informational messages than default + --version print version information + -W, --warnings=CATEGORY report the warnings falling in CATEGORY [all] + -h, --help, --help-all print short, long, or detailed help message +" + +# Additional text appended to 'usage_message' in response to '--help'. +func_help () +{ + $debug_cmd + + func_usage_message + $ECHO "$long_help_message + +MODE must be one of the following: + + clean remove files from the build directory + compile compile a source file into a libtool object + execute automatically set library path, then run a program + finish complete the installation of libtool libraries + install install libraries or executables + link create a library or an executable + uninstall remove libraries from an installed directory + +MODE-ARGS vary depending on the MODE. When passed as first option, +'--mode=MODE' may be abbreviated as 'MODE' or a unique abbreviation of that. +Try '$progname --help --mode=MODE' for a more detailed description of MODE. + +When reporting a bug, please describe a test case to reproduce it and +include the following information: + + host-triplet: $host + shell: $SHELL + compiler: $LTCC + compiler flags: $LTCFLAGS + linker: $LD (gnu? 
$with_gnu_ld) + version: $progname (GNU libtool) 2.4.6 + automake: `($AUTOMAKE --version) 2>/dev/null |$SED 1q` + autoconf: `($AUTOCONF --version) 2>/dev/null |$SED 1q` + +Report bugs to . +GNU libtool home page: . +General help using GNU software: ." + exit 0 +} + + +# func_lo2o OBJECT-NAME +# --------------------- +# Transform OBJECT-NAME from a '.lo' suffix to the platform specific +# object suffix. + +lo2o=s/\\.lo\$/.$objext/ +o2lo=s/\\.$objext\$/.lo/ + +if test yes = "$_G_HAVE_XSI_OPS"; then + eval 'func_lo2o () + { + case $1 in + *.lo) func_lo2o_result=${1%.lo}.$objext ;; + * ) func_lo2o_result=$1 ;; + esac + }' + + # func_xform LIBOBJ-OR-SOURCE + # --------------------------- + # Transform LIBOBJ-OR-SOURCE from a '.o' or '.c' (or otherwise) + # suffix to a '.lo' libtool-object suffix. + eval 'func_xform () + { + func_xform_result=${1%.*}.lo + }' +else + # ...otherwise fall back to using sed. + func_lo2o () + { + func_lo2o_result=`$ECHO "$1" | $SED "$lo2o"` + } + + func_xform () + { + func_xform_result=`$ECHO "$1" | $SED 's|\.[^.]*$|.lo|'` + } +fi + + +# func_fatal_configuration ARG... +# ------------------------------- +# Echo program name prefixed message to standard error, followed by +# a configuration failure hint, and exit. +func_fatal_configuration () +{ + func__fatal_error ${1+"$@"} \ + "See the $PACKAGE documentation for more information." \ + "Fatal configuration error." +} + + +# func_config +# ----------- +# Display the configuration for all the tags in this script. +func_config () +{ + re_begincf='^# ### BEGIN LIBTOOL' + re_endcf='^# ### END LIBTOOL' + + # Default configuration. + $SED "1,/$re_begincf CONFIG/d;/$re_endcf CONFIG/,\$d" < "$progpath" + + # Now print the configurations for the tags. + for tagname in $taglist; do + $SED -n "/$re_begincf TAG CONFIG: $tagname\$/,/$re_endcf TAG CONFIG: $tagname\$/p" < "$progpath" + done + + exit $? +} + + +# func_features +# ------------- +# Display the features supported by this script. +func_features () +{ + echo "host: $host" + if test yes = "$build_libtool_libs"; then + echo "enable shared libraries" + else + echo "disable shared libraries" + fi + if test yes = "$build_old_libs"; then + echo "enable static libraries" + else + echo "disable static libraries" + fi + + exit $? +} + + +# func_enable_tag TAGNAME +# ----------------------- +# Verify that TAGNAME is valid, and either flag an error and exit, or +# enable the TAGNAME tag. We also add TAGNAME to the global $taglist +# variable here. +func_enable_tag () +{ + # Global variable: + tagname=$1 + + re_begincf="^# ### BEGIN LIBTOOL TAG CONFIG: $tagname\$" + re_endcf="^# ### END LIBTOOL TAG CONFIG: $tagname\$" + sed_extractcf=/$re_begincf/,/$re_endcf/p + + # Validate tagname. + case $tagname in + *[!-_A-Za-z0-9,/]*) + func_fatal_error "invalid tag name: $tagname" + ;; + esac + + # Don't test for the "default" C tag, as we know it's + # there but not specially marked. + case $tagname in + CC) ;; + *) + if $GREP "$re_begincf" "$progpath" >/dev/null 2>&1; then + taglist="$taglist $tagname" + + # Evaluate the configuration. 
Be careful to quote the path + # and the sed script, to avoid splitting on whitespace, but + # also don't use non-portable quotes within backquotes within + # quotes we have to do it in 2 steps: + extractedcf=`$SED -n -e "$sed_extractcf" < "$progpath"` + eval "$extractedcf" + else + func_error "ignoring unknown tag $tagname" + fi + ;; + esac +} + + +# func_check_version_match +# ------------------------ +# Ensure that we are using m4 macros, and libtool script from the same +# release of libtool. +func_check_version_match () +{ + if test "$package_revision" != "$macro_revision"; then + if test "$VERSION" != "$macro_version"; then + if test -z "$macro_version"; then + cat >&2 <<_LT_EOF +$progname: Version mismatch error. This is $PACKAGE $VERSION, but the +$progname: definition of this LT_INIT comes from an older release. +$progname: You should recreate aclocal.m4 with macros from $PACKAGE $VERSION +$progname: and run autoconf again. +_LT_EOF + else + cat >&2 <<_LT_EOF +$progname: Version mismatch error. This is $PACKAGE $VERSION, but the +$progname: definition of this LT_INIT comes from $PACKAGE $macro_version. +$progname: You should recreate aclocal.m4 with macros from $PACKAGE $VERSION +$progname: and run autoconf again. +_LT_EOF + fi + else + cat >&2 <<_LT_EOF +$progname: Version mismatch error. This is $PACKAGE $VERSION, revision $package_revision, +$progname: but the definition of this LT_INIT comes from revision $macro_revision. +$progname: You should recreate aclocal.m4 with macros from revision $package_revision +$progname: of $PACKAGE $VERSION and run autoconf again. +_LT_EOF + fi + + exit $EXIT_MISMATCH + fi +} + + +# libtool_options_prep [ARG]... +# ----------------------------- +# Preparation for options parsed by libtool. +libtool_options_prep () +{ + $debug_mode + + # Option defaults: + opt_config=false + opt_dlopen= + opt_dry_run=false + opt_help=false + opt_mode= + opt_preserve_dup_deps=false + opt_quiet=false + + nonopt= + preserve_args= + + # Shorthand for --mode=foo, only valid as the first argument + case $1 in + clean|clea|cle|cl) + shift; set dummy --mode clean ${1+"$@"}; shift + ;; + compile|compil|compi|comp|com|co|c) + shift; set dummy --mode compile ${1+"$@"}; shift + ;; + execute|execut|execu|exec|exe|ex|e) + shift; set dummy --mode execute ${1+"$@"}; shift + ;; + finish|finis|fini|fin|fi|f) + shift; set dummy --mode finish ${1+"$@"}; shift + ;; + install|instal|insta|inst|ins|in|i) + shift; set dummy --mode install ${1+"$@"}; shift + ;; + link|lin|li|l) + shift; set dummy --mode link ${1+"$@"}; shift + ;; + uninstall|uninstal|uninsta|uninst|unins|unin|uni|un|u) + shift; set dummy --mode uninstall ${1+"$@"}; shift + ;; + esac + + # Pass back the list of options. + func_quote_for_eval ${1+"$@"} + libtool_options_prep_result=$func_quote_for_eval_result +} +func_add_hook func_options_prep libtool_options_prep + + +# libtool_parse_options [ARG]... +# --------------------------------- +# Provide handling for libtool specific options. +libtool_parse_options () +{ + $debug_cmd + + # Perform our own loop to consume as many options as possible in + # each iteration. 
+ while test $# -gt 0; do + _G_opt=$1 + shift + case $_G_opt in + --dry-run|--dryrun|-n) + opt_dry_run=: + ;; + + --config) func_config ;; + + --dlopen|-dlopen) + opt_dlopen="${opt_dlopen+$opt_dlopen +}$1" + shift + ;; + + --preserve-dup-deps) + opt_preserve_dup_deps=: ;; + + --features) func_features ;; + + --finish) set dummy --mode finish ${1+"$@"}; shift ;; + + --help) opt_help=: ;; + + --help-all) opt_help=': help-all' ;; + + --mode) test $# = 0 && func_missing_arg $_G_opt && break + opt_mode=$1 + case $1 in + # Valid mode arguments: + clean|compile|execute|finish|install|link|relink|uninstall) ;; + + # Catch anything else as an error + *) func_error "invalid argument for $_G_opt" + exit_cmd=exit + break + ;; + esac + shift + ;; + + --no-silent|--no-quiet) + opt_quiet=false + func_append preserve_args " $_G_opt" + ;; + + --no-warnings|--no-warning|--no-warn) + opt_warning=false + func_append preserve_args " $_G_opt" + ;; + + --no-verbose) + opt_verbose=false + func_append preserve_args " $_G_opt" + ;; + + --silent|--quiet) + opt_quiet=: + opt_verbose=false + func_append preserve_args " $_G_opt" + ;; + + --tag) test $# = 0 && func_missing_arg $_G_opt && break + opt_tag=$1 + func_append preserve_args " $_G_opt $1" + func_enable_tag "$1" + shift + ;; + + --verbose|-v) opt_quiet=false + opt_verbose=: + func_append preserve_args " $_G_opt" + ;; + + # An option not handled by this hook function: + *) set dummy "$_G_opt" ${1+"$@"}; shift; break ;; + esac + done + + + # save modified positional parameters for caller + func_quote_for_eval ${1+"$@"} + libtool_parse_options_result=$func_quote_for_eval_result +} +func_add_hook func_parse_options libtool_parse_options + + + +# libtool_validate_options [ARG]... +# --------------------------------- +# Perform any sanity checks on option settings and/or unconsumed +# arguments. +libtool_validate_options () +{ + # save first non-option argument + if test 0 -lt $#; then + nonopt=$1 + shift + fi + + # preserve --debug + test : = "$debug_cmd" || func_append preserve_args " --debug" + + case $host in + # Solaris2 added to fix http://debbugs.gnu.org/cgi/bugreport.cgi?bug=16452 + # see also: http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59788 + *cygwin* | *mingw* | *pw32* | *cegcc* | *solaris2* | *os2*) + # don't eliminate duplications in $postdeps and $predeps + opt_duplicate_compiler_generated_deps=: + ;; + *) + opt_duplicate_compiler_generated_deps=$opt_preserve_dup_deps + ;; + esac + + $opt_help || { + # Sanity checks first: + func_check_version_match + + test yes != "$build_libtool_libs" \ + && test yes != "$build_old_libs" \ + && func_fatal_configuration "not configured to build any kind of library" + + # Darwin sucks + eval std_shrext=\"$shrext_cmds\" + + # Only execute mode is allowed to have -dlopen flags. + if test -n "$opt_dlopen" && test execute != "$opt_mode"; then + func_error "unrecognized option '-dlopen'" + $ECHO "$help" 1>&2 + exit $EXIT_FAILURE + fi + + # Change the help message to a mode-specific one. + generic_help=$help + help="Try '$progname --help --mode=$opt_mode' for more information." + } + + # Pass back the unparsed argument list + func_quote_for_eval ${1+"$@"} + libtool_validate_options_result=$func_quote_for_eval_result +} +func_add_hook func_validate_options libtool_validate_options + + +# Process options as early as possible so that --help and --version +# can return quickly. +func_options ${1+"$@"} +eval set dummy "$func_options_result"; shift + + + +## ----------- ## +## Main. 
## +## ----------- ## + +magic='%%%MAGIC variable%%%' +magic_exe='%%%MAGIC EXE variable%%%' + +# Global variables. +extracted_archives= +extracted_serial=0 + +# If this variable is set in any of the actions, the command in it +# will be execed at the end. This prevents here-documents from being +# left over by shells. +exec_cmd= + + +# A function that is used when there is no print builtin or printf. +func_fallback_echo () +{ + eval 'cat <<_LTECHO_EOF +$1 +_LTECHO_EOF' +} + +# func_generated_by_libtool +# True iff stdin has been generated by Libtool. This function is only +# a basic sanity check; it will hardly flush out determined imposters. +func_generated_by_libtool_p () +{ + $GREP "^# Generated by .*$PACKAGE" > /dev/null 2>&1 +} + +# func_lalib_p file +# True iff FILE is a libtool '.la' library or '.lo' object file. +# This function is only a basic sanity check; it will hardly flush out +# determined imposters. +func_lalib_p () +{ + test -f "$1" && + $SED -e 4q "$1" 2>/dev/null | func_generated_by_libtool_p +} + +# func_lalib_unsafe_p file +# True iff FILE is a libtool '.la' library or '.lo' object file. +# This function implements the same check as func_lalib_p without +# resorting to external programs. To this end, it redirects stdin and +# closes it afterwards, without saving the original file descriptor. +# As a safety measure, use it only where a negative result would be +# fatal anyway. Works if 'file' does not exist. +func_lalib_unsafe_p () +{ + lalib_p=no + if test -f "$1" && test -r "$1" && exec 5<&0 <"$1"; then + for lalib_p_l in 1 2 3 4 + do + read lalib_p_line + case $lalib_p_line in + \#\ Generated\ by\ *$PACKAGE* ) lalib_p=yes; break;; + esac + done + exec 0<&5 5<&- + fi + test yes = "$lalib_p" +} + +# func_ltwrapper_script_p file +# True iff FILE is a libtool wrapper script +# This function is only a basic sanity check; it will hardly flush out +# determined imposters. +func_ltwrapper_script_p () +{ + test -f "$1" && + $lt_truncate_bin < "$1" 2>/dev/null | func_generated_by_libtool_p +} + +# func_ltwrapper_executable_p file +# True iff FILE is a libtool wrapper executable +# This function is only a basic sanity check; it will hardly flush out +# determined imposters. +func_ltwrapper_executable_p () +{ + func_ltwrapper_exec_suffix= + case $1 in + *.exe) ;; + *) func_ltwrapper_exec_suffix=.exe ;; + esac + $GREP "$magic_exe" "$1$func_ltwrapper_exec_suffix" >/dev/null 2>&1 +} + +# func_ltwrapper_scriptname file +# Assumes file is an ltwrapper_executable +# uses $file to determine the appropriate filename for a +# temporary ltwrapper_script. +func_ltwrapper_scriptname () +{ + func_dirname_and_basename "$1" "" "." + func_stripname '' '.exe' "$func_basename_result" + func_ltwrapper_scriptname_result=$func_dirname_result/$objdir/${func_stripname_result}_ltshwrapper +} + +# func_ltwrapper_p file +# True iff FILE is a libtool wrapper script or wrapper executable +# This function is only a basic sanity check; it will hardly flush out +# determined imposters. +func_ltwrapper_p () +{ + func_ltwrapper_script_p "$1" || func_ltwrapper_executable_p "$1" +} + + +# func_execute_cmds commands fail_cmd +# Execute tilde-delimited COMMANDS. +# If FAIL_CMD is given, eval that upon failure. +# FAIL_CMD may read-access the current command in variable CMD! 
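+# For example (an illustrative call, not a quote from this script):
+#
+#    func_execute_cmds 'mkdir $dest~cp $file $dest' 'exit $?'
+#
+# runs the two '~'-separated commands in turn and exits with the
+# failing status if either one fails; 'dest' and 'file' are just
+# names used for this example.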
+func_execute_cmds () +{ + $debug_cmd + + save_ifs=$IFS; IFS='~' + for cmd in $1; do + IFS=$sp$nl + eval cmd=\"$cmd\" + IFS=$save_ifs + func_show_eval "$cmd" "${2-:}" + done + IFS=$save_ifs +} + + +# func_source file +# Source FILE, adding directory component if necessary. +# Note that it is not necessary on cygwin/mingw to append a dot to +# FILE even if both FILE and FILE.exe exist: automatic-append-.exe +# behavior happens only for exec(3), not for open(2)! Also, sourcing +# 'FILE.' does not work on cygwin managed mounts. +func_source () +{ + $debug_cmd + + case $1 in + */* | *\\*) . "$1" ;; + *) . "./$1" ;; + esac +} + + +# func_resolve_sysroot PATH +# Replace a leading = in PATH with a sysroot. Store the result into +# func_resolve_sysroot_result +func_resolve_sysroot () +{ + func_resolve_sysroot_result=$1 + case $func_resolve_sysroot_result in + =*) + func_stripname '=' '' "$func_resolve_sysroot_result" + func_resolve_sysroot_result=$lt_sysroot$func_stripname_result + ;; + esac +} + +# func_replace_sysroot PATH +# If PATH begins with the sysroot, replace it with = and +# store the result into func_replace_sysroot_result. +func_replace_sysroot () +{ + case $lt_sysroot:$1 in + ?*:"$lt_sysroot"*) + func_stripname "$lt_sysroot" '' "$1" + func_replace_sysroot_result='='$func_stripname_result + ;; + *) + # Including no sysroot. + func_replace_sysroot_result=$1 + ;; + esac +} + +# func_infer_tag arg +# Infer tagged configuration to use if any are available and +# if one wasn't chosen via the "--tag" command line option. +# Only attempt this if the compiler in the base compile +# command doesn't match the default compiler. +# arg is usually of the form 'gcc ...' +func_infer_tag () +{ + $debug_cmd + + if test -n "$available_tags" && test -z "$tagname"; then + CC_quoted= + for arg in $CC; do + func_append_quoted CC_quoted "$arg" + done + CC_expanded=`func_echo_all $CC` + CC_quoted_expanded=`func_echo_all $CC_quoted` + case $@ in + # Blanks in the command may have been stripped by the calling shell, + # but not from the CC environment variable when configure was run. + " $CC "* | "$CC "* | " $CC_expanded "* | "$CC_expanded "* | \ + " $CC_quoted"* | "$CC_quoted "* | " $CC_quoted_expanded "* | "$CC_quoted_expanded "*) ;; + # Blanks at the start of $base_compile will cause this to fail + # if we don't check for them as well. + *) + for z in $available_tags; do + if $GREP "^# ### BEGIN LIBTOOL TAG CONFIG: $z$" < "$progpath" > /dev/null; then + # Evaluate the configuration. + eval "`$SED -n -e '/^# ### BEGIN LIBTOOL TAG CONFIG: '$z'$/,/^# ### END LIBTOOL TAG CONFIG: '$z'$/p' < $progpath`" + CC_quoted= + for arg in $CC; do + # Double-quote args containing other shell metacharacters. + func_append_quoted CC_quoted "$arg" + done + CC_expanded=`func_echo_all $CC` + CC_quoted_expanded=`func_echo_all $CC_quoted` + case "$@ " in + " $CC "* | "$CC "* | " $CC_expanded "* | "$CC_expanded "* | \ + " $CC_quoted"* | "$CC_quoted "* | " $CC_quoted_expanded "* | "$CC_quoted_expanded "*) + # The compiler in the base compile command matches + # the one in the tagged configuration. + # Assume this is the tagged configuration we want. + tagname=$z + break + ;; + esac + fi + done + # If $tagname still isn't set, then no tagged configuration + # was found and let the user know that the "--tag" command + # line option must be used. 
+ if test -z "$tagname"; then + func_echo "unable to infer tagged configuration" + func_fatal_error "specify a tag with '--tag'" +# else +# func_verbose "using $tagname tagged configuration" + fi + ;; + esac + fi +} + + + +# func_write_libtool_object output_name pic_name nonpic_name +# Create a libtool object file (analogous to a ".la" file), +# but don't create it if we're doing a dry run. +func_write_libtool_object () +{ + write_libobj=$1 + if test yes = "$build_libtool_libs"; then + write_lobj=\'$2\' + else + write_lobj=none + fi + + if test yes = "$build_old_libs"; then + write_oldobj=\'$3\' + else + write_oldobj=none + fi + + $opt_dry_run || { + cat >${write_libobj}T </dev/null` + if test "$?" -eq 0 && test -n "$func_convert_core_file_wine_to_w32_tmp"; then + func_convert_core_file_wine_to_w32_result=`$ECHO "$func_convert_core_file_wine_to_w32_tmp" | + $SED -e "$sed_naive_backslashify"` + else + func_convert_core_file_wine_to_w32_result= + fi + fi +} +# end: func_convert_core_file_wine_to_w32 + + +# func_convert_core_path_wine_to_w32 ARG +# Helper function used by path conversion functions when $build is *nix, and +# $host is mingw, cygwin, or some other w32 environment. Relies on a correctly +# configured wine environment available, with the winepath program in $build's +# $PATH. Assumes ARG has no leading or trailing path separator characters. +# +# ARG is path to be converted from $build format to win32. +# Result is available in $func_convert_core_path_wine_to_w32_result. +# Unconvertible file (directory) names in ARG are skipped; if no directory names +# are convertible, then the result may be empty. +func_convert_core_path_wine_to_w32 () +{ + $debug_cmd + + # unfortunately, winepath doesn't convert paths, only file names + func_convert_core_path_wine_to_w32_result= + if test -n "$1"; then + oldIFS=$IFS + IFS=: + for func_convert_core_path_wine_to_w32_f in $1; do + IFS=$oldIFS + func_convert_core_file_wine_to_w32 "$func_convert_core_path_wine_to_w32_f" + if test -n "$func_convert_core_file_wine_to_w32_result"; then + if test -z "$func_convert_core_path_wine_to_w32_result"; then + func_convert_core_path_wine_to_w32_result=$func_convert_core_file_wine_to_w32_result + else + func_append func_convert_core_path_wine_to_w32_result ";$func_convert_core_file_wine_to_w32_result" + fi + fi + done + IFS=$oldIFS + fi +} +# end: func_convert_core_path_wine_to_w32 + + +# func_cygpath ARGS... +# Wrapper around calling the cygpath program via LT_CYGPATH. This is used when +# when (1) $build is *nix and Cygwin is hosted via a wine environment; or (2) +# $build is MSYS and $host is Cygwin, or (3) $build is Cygwin. In case (1) or +# (2), returns the Cygwin file name or path in func_cygpath_result (input +# file name or path is assumed to be in w32 format, as previously converted +# from $build's *nix or MSYS format). In case (3), returns the w32 file name +# or path in func_cygpath_result (input file name or path is assumed to be in +# Cygwin format). Returns an empty string on error. +# +# ARGS are passed to cygpath, with the last one being the file name or path to +# be converted. +# +# Specify the absolute *nix (or w32) name to cygpath in the LT_CYGPATH +# environment variable; do not put it in $PATH. +func_cygpath () +{ + $debug_cmd + + if test -n "$LT_CYGPATH" && test -f "$LT_CYGPATH"; then + func_cygpath_result=`$LT_CYGPATH "$@" 2>/dev/null` + if test "$?" 
-ne 0; then + # on failure, ensure result is empty + func_cygpath_result= + fi + else + func_cygpath_result= + func_error "LT_CYGPATH is empty or specifies non-existent file: '$LT_CYGPATH'" + fi +} +#end: func_cygpath + + +# func_convert_core_msys_to_w32 ARG +# Convert file name or path ARG from MSYS format to w32 format. Return +# result in func_convert_core_msys_to_w32_result. +func_convert_core_msys_to_w32 () +{ + $debug_cmd + + # awkward: cmd appends spaces to result + func_convert_core_msys_to_w32_result=`( cmd //c echo "$1" ) 2>/dev/null | + $SED -e 's/[ ]*$//' -e "$sed_naive_backslashify"` +} +#end: func_convert_core_msys_to_w32 + + +# func_convert_file_check ARG1 ARG2 +# Verify that ARG1 (a file name in $build format) was converted to $host +# format in ARG2. Otherwise, emit an error message, but continue (resetting +# func_to_host_file_result to ARG1). +func_convert_file_check () +{ + $debug_cmd + + if test -z "$2" && test -n "$1"; then + func_error "Could not determine host file name corresponding to" + func_error " '$1'" + func_error "Continuing, but uninstalled executables may not work." + # Fallback: + func_to_host_file_result=$1 + fi +} +# end func_convert_file_check + + +# func_convert_path_check FROM_PATHSEP TO_PATHSEP FROM_PATH TO_PATH +# Verify that FROM_PATH (a path in $build format) was converted to $host +# format in TO_PATH. Otherwise, emit an error message, but continue, resetting +# func_to_host_file_result to a simplistic fallback value (see below). +func_convert_path_check () +{ + $debug_cmd + + if test -z "$4" && test -n "$3"; then + func_error "Could not determine the host path corresponding to" + func_error " '$3'" + func_error "Continuing, but uninstalled executables may not work." + # Fallback. This is a deliberately simplistic "conversion" and + # should not be "improved". See libtool.info. + if test "x$1" != "x$2"; then + lt_replace_pathsep_chars="s|$1|$2|g" + func_to_host_path_result=`echo "$3" | + $SED -e "$lt_replace_pathsep_chars"` + else + func_to_host_path_result=$3 + fi + fi +} +# end func_convert_path_check + + +# func_convert_path_front_back_pathsep FRONTPAT BACKPAT REPL ORIG +# Modifies func_to_host_path_result by prepending REPL if ORIG matches FRONTPAT +# and appending REPL if ORIG matches BACKPAT. +func_convert_path_front_back_pathsep () +{ + $debug_cmd + + case $4 in + $1 ) func_to_host_path_result=$3$func_to_host_path_result + ;; + esac + case $4 in + $2 ) func_append func_to_host_path_result "$3" + ;; + esac +} +# end func_convert_path_front_back_pathsep + + +################################################## +# $build to $host FILE NAME CONVERSION FUNCTIONS # +################################################## +# invoked via '$to_host_file_cmd ARG' +# +# In each case, ARG is the path to be converted from $build to $host format. +# Result will be available in $func_to_host_file_result. + + +# func_to_host_file ARG +# Converts the file name ARG from $build format to $host format. Return result +# in func_to_host_file_result. +func_to_host_file () +{ + $debug_cmd + + $to_host_file_cmd "$1" +} +# end func_to_host_file + + +# func_to_tool_file ARG LAZY +# converts the file name ARG from $build format to toolchain format. Return +# result in func_to_tool_file_result. If the conversion in use is listed +# in (the comma separated) LAZY, no conversion takes place. 
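+# Illustrative example only: the compile code below calls
+#   func_to_tool_file "$srcfile" func_convert_file_msys_to_w32
+# so when $to_tool_file_cmd is func_convert_file_msys_to_w32 the name is
+# passed through unchanged; otherwise $to_tool_file_cmd is run on ARG and
+# its func_to_host_file_result is reused as func_to_tool_file_result.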
+func_to_tool_file () +{ + $debug_cmd + + case ,$2, in + *,"$to_tool_file_cmd",*) + func_to_tool_file_result=$1 + ;; + *) + $to_tool_file_cmd "$1" + func_to_tool_file_result=$func_to_host_file_result + ;; + esac +} +# end func_to_tool_file + + +# func_convert_file_noop ARG +# Copy ARG to func_to_host_file_result. +func_convert_file_noop () +{ + func_to_host_file_result=$1 +} +# end func_convert_file_noop + + +# func_convert_file_msys_to_w32 ARG +# Convert file name ARG from (mingw) MSYS to (mingw) w32 format; automatic +# conversion to w32 is not available inside the cwrapper. Returns result in +# func_to_host_file_result. +func_convert_file_msys_to_w32 () +{ + $debug_cmd + + func_to_host_file_result=$1 + if test -n "$1"; then + func_convert_core_msys_to_w32 "$1" + func_to_host_file_result=$func_convert_core_msys_to_w32_result + fi + func_convert_file_check "$1" "$func_to_host_file_result" +} +# end func_convert_file_msys_to_w32 + + +# func_convert_file_cygwin_to_w32 ARG +# Convert file name ARG from Cygwin to w32 format. Returns result in +# func_to_host_file_result. +func_convert_file_cygwin_to_w32 () +{ + $debug_cmd + + func_to_host_file_result=$1 + if test -n "$1"; then + # because $build is cygwin, we call "the" cygpath in $PATH; no need to use + # LT_CYGPATH in this case. + func_to_host_file_result=`cygpath -m "$1"` + fi + func_convert_file_check "$1" "$func_to_host_file_result" +} +# end func_convert_file_cygwin_to_w32 + + +# func_convert_file_nix_to_w32 ARG +# Convert file name ARG from *nix to w32 format. Requires a wine environment +# and a working winepath. Returns result in func_to_host_file_result. +func_convert_file_nix_to_w32 () +{ + $debug_cmd + + func_to_host_file_result=$1 + if test -n "$1"; then + func_convert_core_file_wine_to_w32 "$1" + func_to_host_file_result=$func_convert_core_file_wine_to_w32_result + fi + func_convert_file_check "$1" "$func_to_host_file_result" +} +# end func_convert_file_nix_to_w32 + + +# func_convert_file_msys_to_cygwin ARG +# Convert file name ARG from MSYS to Cygwin format. Requires LT_CYGPATH set. +# Returns result in func_to_host_file_result. +func_convert_file_msys_to_cygwin () +{ + $debug_cmd + + func_to_host_file_result=$1 + if test -n "$1"; then + func_convert_core_msys_to_w32 "$1" + func_cygpath -u "$func_convert_core_msys_to_w32_result" + func_to_host_file_result=$func_cygpath_result + fi + func_convert_file_check "$1" "$func_to_host_file_result" +} +# end func_convert_file_msys_to_cygwin + + +# func_convert_file_nix_to_cygwin ARG +# Convert file name ARG from *nix to Cygwin format. Requires Cygwin installed +# in a wine environment, working winepath, and LT_CYGPATH set. Returns result +# in func_to_host_file_result. +func_convert_file_nix_to_cygwin () +{ + $debug_cmd + + func_to_host_file_result=$1 + if test -n "$1"; then + # convert from *nix to w32, then use cygpath to convert from w32 to cygwin. + func_convert_core_file_wine_to_w32 "$1" + func_cygpath -u "$func_convert_core_file_wine_to_w32_result" + func_to_host_file_result=$func_cygpath_result + fi + func_convert_file_check "$1" "$func_to_host_file_result" +} +# end func_convert_file_nix_to_cygwin + + +############################################# +# $build to $host PATH CONVERSION FUNCTIONS # +############################################# +# invoked via '$to_host_path_cmd ARG' +# +# In each case, ARG is the path to be converted from $build to $host format. +# The result will be available in $func_to_host_path_result. 
+# +# Path separators are also converted from $build format to $host format. If +# ARG begins or ends with a path separator character, it is preserved (but +# converted to $host format) on output. +# +# All path conversion functions are named using the following convention: +# file name conversion function : func_convert_file_X_to_Y () +# path conversion function : func_convert_path_X_to_Y () +# where, for any given $build/$host combination the 'X_to_Y' value is the +# same. If conversion functions are added for new $build/$host combinations, +# the two new functions must follow this pattern, or func_init_to_host_path_cmd +# will break. + + +# func_init_to_host_path_cmd +# Ensures that function "pointer" variable $to_host_path_cmd is set to the +# appropriate value, based on the value of $to_host_file_cmd. +to_host_path_cmd= +func_init_to_host_path_cmd () +{ + $debug_cmd + + if test -z "$to_host_path_cmd"; then + func_stripname 'func_convert_file_' '' "$to_host_file_cmd" + to_host_path_cmd=func_convert_path_$func_stripname_result + fi +} + + +# func_to_host_path ARG +# Converts the path ARG from $build format to $host format. Return result +# in func_to_host_path_result. +func_to_host_path () +{ + $debug_cmd + + func_init_to_host_path_cmd + $to_host_path_cmd "$1" +} +# end func_to_host_path + + +# func_convert_path_noop ARG +# Copy ARG to func_to_host_path_result. +func_convert_path_noop () +{ + func_to_host_path_result=$1 +} +# end func_convert_path_noop + + +# func_convert_path_msys_to_w32 ARG +# Convert path ARG from (mingw) MSYS to (mingw) w32 format; automatic +# conversion to w32 is not available inside the cwrapper. Returns result in +# func_to_host_path_result. +func_convert_path_msys_to_w32 () +{ + $debug_cmd + + func_to_host_path_result=$1 + if test -n "$1"; then + # Remove leading and trailing path separator characters from ARG. MSYS + # behavior is inconsistent here; cygpath turns them into '.;' and ';.'; + # and winepath ignores them completely. + func_stripname : : "$1" + func_to_host_path_tmp1=$func_stripname_result + func_convert_core_msys_to_w32 "$func_to_host_path_tmp1" + func_to_host_path_result=$func_convert_core_msys_to_w32_result + func_convert_path_check : ";" \ + "$func_to_host_path_tmp1" "$func_to_host_path_result" + func_convert_path_front_back_pathsep ":*" "*:" ";" "$1" + fi +} +# end func_convert_path_msys_to_w32 + + +# func_convert_path_cygwin_to_w32 ARG +# Convert path ARG from Cygwin to w32 format. Returns result in +# func_to_host_file_result. +func_convert_path_cygwin_to_w32 () +{ + $debug_cmd + + func_to_host_path_result=$1 + if test -n "$1"; then + # See func_convert_path_msys_to_w32: + func_stripname : : "$1" + func_to_host_path_tmp1=$func_stripname_result + func_to_host_path_result=`cygpath -m -p "$func_to_host_path_tmp1"` + func_convert_path_check : ";" \ + "$func_to_host_path_tmp1" "$func_to_host_path_result" + func_convert_path_front_back_pathsep ":*" "*:" ";" "$1" + fi +} +# end func_convert_path_cygwin_to_w32 + + +# func_convert_path_nix_to_w32 ARG +# Convert path ARG from *nix to w32 format. Requires a wine environment and +# a working winepath. Returns result in func_to_host_file_result. 
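+# Illustrative example only (the drive letter depends on the local wine
+# configuration): converting "/usr/lib:/usr/local/lib" might yield something
+# like "Z:\usr\lib;Z:\usr\local\lib" in func_to_host_path_result.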
+func_convert_path_nix_to_w32 () +{ + $debug_cmd + + func_to_host_path_result=$1 + if test -n "$1"; then + # See func_convert_path_msys_to_w32: + func_stripname : : "$1" + func_to_host_path_tmp1=$func_stripname_result + func_convert_core_path_wine_to_w32 "$func_to_host_path_tmp1" + func_to_host_path_result=$func_convert_core_path_wine_to_w32_result + func_convert_path_check : ";" \ + "$func_to_host_path_tmp1" "$func_to_host_path_result" + func_convert_path_front_back_pathsep ":*" "*:" ";" "$1" + fi +} +# end func_convert_path_nix_to_w32 + + +# func_convert_path_msys_to_cygwin ARG +# Convert path ARG from MSYS to Cygwin format. Requires LT_CYGPATH set. +# Returns result in func_to_host_file_result. +func_convert_path_msys_to_cygwin () +{ + $debug_cmd + + func_to_host_path_result=$1 + if test -n "$1"; then + # See func_convert_path_msys_to_w32: + func_stripname : : "$1" + func_to_host_path_tmp1=$func_stripname_result + func_convert_core_msys_to_w32 "$func_to_host_path_tmp1" + func_cygpath -u -p "$func_convert_core_msys_to_w32_result" + func_to_host_path_result=$func_cygpath_result + func_convert_path_check : : \ + "$func_to_host_path_tmp1" "$func_to_host_path_result" + func_convert_path_front_back_pathsep ":*" "*:" : "$1" + fi +} +# end func_convert_path_msys_to_cygwin + + +# func_convert_path_nix_to_cygwin ARG +# Convert path ARG from *nix to Cygwin format. Requires Cygwin installed in a +# a wine environment, working winepath, and LT_CYGPATH set. Returns result in +# func_to_host_file_result. +func_convert_path_nix_to_cygwin () +{ + $debug_cmd + + func_to_host_path_result=$1 + if test -n "$1"; then + # Remove leading and trailing path separator characters from + # ARG. msys behavior is inconsistent here, cygpath turns them + # into '.;' and ';.', and winepath ignores them completely. + func_stripname : : "$1" + func_to_host_path_tmp1=$func_stripname_result + func_convert_core_path_wine_to_w32 "$func_to_host_path_tmp1" + func_cygpath -u -p "$func_convert_core_path_wine_to_w32_result" + func_to_host_path_result=$func_cygpath_result + func_convert_path_check : : \ + "$func_to_host_path_tmp1" "$func_to_host_path_result" + func_convert_path_front_back_pathsep ":*" "*:" : "$1" + fi +} +# end func_convert_path_nix_to_cygwin + + +# func_dll_def_p FILE +# True iff FILE is a Windows DLL '.def' file. +# Keep in sync with _LT_DLL_DEF_P in libtool.m4 +func_dll_def_p () +{ + $debug_cmd + + func_dll_def_p_tmp=`$SED -n \ + -e 's/^[ ]*//' \ + -e '/^\(;.*\)*$/d' \ + -e 's/^\(EXPORTS\|LIBRARY\)\([ ].*\)*$/DEF/p' \ + -e q \ + "$1"` + test DEF = "$func_dll_def_p_tmp" +} + + +# func_mode_compile arg... +func_mode_compile () +{ + $debug_cmd + + # Get the compilation command and the source file. + base_compile= + srcfile=$nonopt # always keep a non-empty value in "srcfile" + suppress_opt=yes + suppress_output= + arg_mode=normal + libobj= + later= + pie_flag= + + for arg + do + case $arg_mode in + arg ) + # do not "continue". Instead, add this to base_compile + lastarg=$arg + arg_mode=normal + ;; + + target ) + libobj=$arg + arg_mode=normal + continue + ;; + + normal ) + # Accept any command-line options. 
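+      # Illustrative example only: for an invocation such as
+      #   libtool --mode=compile gcc -c foo.c
+      # the option handling below sees "-c" and then "foo.c"; "foo.c" ends up
+      # in $srcfile, "gcc -c" in $base_compile, and the output object becomes
+      # foo.lo.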
+ case $arg in + -o) + test -n "$libobj" && \ + func_fatal_error "you cannot specify '-o' more than once" + arg_mode=target + continue + ;; + + -pie | -fpie | -fPIE) + func_append pie_flag " $arg" + continue + ;; + + -shared | -static | -prefer-pic | -prefer-non-pic) + func_append later " $arg" + continue + ;; + + -no-suppress) + suppress_opt=no + continue + ;; + + -Xcompiler) + arg_mode=arg # the next one goes into the "base_compile" arg list + continue # The current "srcfile" will either be retained or + ;; # replaced later. I would guess that would be a bug. + + -Wc,*) + func_stripname '-Wc,' '' "$arg" + args=$func_stripname_result + lastarg= + save_ifs=$IFS; IFS=, + for arg in $args; do + IFS=$save_ifs + func_append_quoted lastarg "$arg" + done + IFS=$save_ifs + func_stripname ' ' '' "$lastarg" + lastarg=$func_stripname_result + + # Add the arguments to base_compile. + func_append base_compile " $lastarg" + continue + ;; + + *) + # Accept the current argument as the source file. + # The previous "srcfile" becomes the current argument. + # + lastarg=$srcfile + srcfile=$arg + ;; + esac # case $arg + ;; + esac # case $arg_mode + + # Aesthetically quote the previous argument. + func_append_quoted base_compile "$lastarg" + done # for arg + + case $arg_mode in + arg) + func_fatal_error "you must specify an argument for -Xcompile" + ;; + target) + func_fatal_error "you must specify a target with '-o'" + ;; + *) + # Get the name of the library object. + test -z "$libobj" && { + func_basename "$srcfile" + libobj=$func_basename_result + } + ;; + esac + + # Recognize several different file suffixes. + # If the user specifies -o file.o, it is replaced with file.lo + case $libobj in + *.[cCFSifmso] | \ + *.ada | *.adb | *.ads | *.asm | \ + *.c++ | *.cc | *.ii | *.class | *.cpp | *.cxx | \ + *.[fF][09]? | *.for | *.java | *.go | *.obj | *.sx | *.cu | *.cup) + func_xform "$libobj" + libobj=$func_xform_result + ;; + esac + + case $libobj in + *.lo) func_lo2o "$libobj"; obj=$func_lo2o_result ;; + *) + func_fatal_error "cannot determine name of library object from '$libobj'" + ;; + esac + + func_infer_tag $base_compile + + for arg in $later; do + case $arg in + -shared) + test yes = "$build_libtool_libs" \ + || func_fatal_configuration "cannot build a shared library" + build_old_libs=no + continue + ;; + + -static) + build_libtool_libs=no + build_old_libs=yes + continue + ;; + + -prefer-pic) + pic_mode=yes + continue + ;; + + -prefer-non-pic) + pic_mode=no + continue + ;; + esac + done + + func_quote_for_eval "$libobj" + test "X$libobj" != "X$func_quote_for_eval_result" \ + && $ECHO "X$libobj" | $GREP '[]~#^*{};<>?"'"'"' &()|`$[]' \ + && func_warning "libobj name '$libobj' may not contain shell special characters." + func_dirname_and_basename "$obj" "/" "" + objname=$func_basename_result + xdir=$func_dirname_result + lobj=$xdir$objdir/$objname + + test -z "$base_compile" && \ + func_fatal_help "you must specify a compilation command" + + # Delete any leftover library objects. 
+ if test yes = "$build_old_libs"; then + removelist="$obj $lobj $libobj ${libobj}T" + else + removelist="$lobj $libobj ${libobj}T" + fi + + # On Cygwin there's no "real" PIC flag so we must build both object types + case $host_os in + cygwin* | mingw* | pw32* | os2* | cegcc*) + pic_mode=default + ;; + esac + if test no = "$pic_mode" && test pass_all != "$deplibs_check_method"; then + # non-PIC code in shared libraries is not supported + pic_mode=default + fi + + # Calculate the filename of the output object if compiler does + # not support -o with -c + if test no = "$compiler_c_o"; then + output_obj=`$ECHO "$srcfile" | $SED 's%^.*/%%; s%\.[^.]*$%%'`.$objext + lockfile=$output_obj.lock + else + output_obj= + need_locks=no + lockfile= + fi + + # Lock this critical section if it is needed + # We use this script file to make the link, it avoids creating a new file + if test yes = "$need_locks"; then + until $opt_dry_run || ln "$progpath" "$lockfile" 2>/dev/null; do + func_echo "Waiting for $lockfile to be removed" + sleep 2 + done + elif test warn = "$need_locks"; then + if test -f "$lockfile"; then + $ECHO "\ +*** ERROR, $lockfile exists and contains: +`cat $lockfile 2>/dev/null` + +This indicates that another process is trying to use the same +temporary object file, and libtool could not work around it because +your compiler does not support '-c' and '-o' together. If you +repeat this compilation, it may succeed, by chance, but you had better +avoid parallel builds (make -j) in this platform, or get a better +compiler." + + $opt_dry_run || $RM $removelist + exit $EXIT_FAILURE + fi + func_append removelist " $output_obj" + $ECHO "$srcfile" > "$lockfile" + fi + + $opt_dry_run || $RM $removelist + func_append removelist " $lockfile" + trap '$opt_dry_run || $RM $removelist; exit $EXIT_FAILURE' 1 2 15 + + func_to_tool_file "$srcfile" func_convert_file_msys_to_w32 + srcfile=$func_to_tool_file_result + func_quote_for_eval "$srcfile" + qsrcfile=$func_quote_for_eval_result + + # Only build a PIC object if we are building libtool libraries. + if test yes = "$build_libtool_libs"; then + # Without this assignment, base_compile gets emptied. + fbsd_hideous_sh_bug=$base_compile + + if test no != "$pic_mode"; then + command="$base_compile $qsrcfile $pic_flag" + else + # Don't build PIC code + command="$base_compile $qsrcfile" + fi + + func_mkdir_p "$xdir$objdir" + + if test -z "$output_obj"; then + # Place PIC objects in $objdir + func_append command " -o $lobj" + fi + + func_show_eval_locale "$command" \ + 'test -n "$output_obj" && $RM $removelist; exit $EXIT_FAILURE' + + if test warn = "$need_locks" && + test "X`cat $lockfile 2>/dev/null`" != "X$srcfile"; then + $ECHO "\ +*** ERROR, $lockfile contains: +`cat $lockfile 2>/dev/null` + +but it should contain: +$srcfile + +This indicates that another process is trying to use the same +temporary object file, and libtool could not work around it because +your compiler does not support '-c' and '-o' together. If you +repeat this compilation, it may succeed, by chance, but you had better +avoid parallel builds (make -j) in this platform, or get a better +compiler." + + $opt_dry_run || $RM $removelist + exit $EXIT_FAILURE + fi + + # Just move the object if needed, then go on to compile the next one + if test -n "$output_obj" && test "X$output_obj" != "X$lobj"; then + func_show_eval '$MV "$output_obj" "$lobj"' \ + 'error=$?; $opt_dry_run || $RM $removelist; exit $error' + fi + + # Allow error messages only from the first compilation. 
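+    # Descriptive note (added for clarity): unless -no-suppress was given,
+    # $suppress_output is set to a redirection that is appended to the second,
+    # non-PIC compile command further below, so diagnostics are reported only
+    # by the PIC compilation.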
+ if test yes = "$suppress_opt"; then + suppress_output=' >/dev/null 2>&1' + fi + fi + + # Only build a position-dependent object if we build old libraries. + if test yes = "$build_old_libs"; then + if test yes != "$pic_mode"; then + # Don't build PIC code + command="$base_compile $qsrcfile$pie_flag" + else + command="$base_compile $qsrcfile $pic_flag" + fi + if test yes = "$compiler_c_o"; then + func_append command " -o $obj" + fi + + # Suppress compiler output if we already did a PIC compilation. + func_append command "$suppress_output" + func_show_eval_locale "$command" \ + '$opt_dry_run || $RM $removelist; exit $EXIT_FAILURE' + + if test warn = "$need_locks" && + test "X`cat $lockfile 2>/dev/null`" != "X$srcfile"; then + $ECHO "\ +*** ERROR, $lockfile contains: +`cat $lockfile 2>/dev/null` + +but it should contain: +$srcfile + +This indicates that another process is trying to use the same +temporary object file, and libtool could not work around it because +your compiler does not support '-c' and '-o' together. If you +repeat this compilation, it may succeed, by chance, but you had better +avoid parallel builds (make -j) in this platform, or get a better +compiler." + + $opt_dry_run || $RM $removelist + exit $EXIT_FAILURE + fi + + # Just move the object if needed + if test -n "$output_obj" && test "X$output_obj" != "X$obj"; then + func_show_eval '$MV "$output_obj" "$obj"' \ + 'error=$?; $opt_dry_run || $RM $removelist; exit $error' + fi + fi + + $opt_dry_run || { + func_write_libtool_object "$libobj" "$objdir/$objname" "$objname" + + # Unlock the critical section if it was locked + if test no != "$need_locks"; then + removelist=$lockfile + $RM "$lockfile" + fi + } + + exit $EXIT_SUCCESS +} + +$opt_help || { + test compile = "$opt_mode" && func_mode_compile ${1+"$@"} +} + +func_mode_help () +{ + # We need to display help for each of the modes. + case $opt_mode in + "") + # Generic help is extracted from the usage comments + # at the start of this file. + func_help + ;; + + clean) + $ECHO \ +"Usage: $progname [OPTION]... --mode=clean RM [RM-OPTION]... FILE... + +Remove files from the build directory. + +RM is the name of the program to use to delete files associated with each FILE +(typically '/bin/rm'). RM-OPTIONS are options (such as '-f') to be passed +to RM. + +If FILE is a libtool library, object or program, all the files associated +with it are deleted. Otherwise, only FILE itself is deleted using RM." + ;; + + compile) + $ECHO \ +"Usage: $progname [OPTION]... --mode=compile COMPILE-COMMAND... SOURCEFILE + +Compile a source file into a libtool library object. + +This mode accepts the following additional options: + + -o OUTPUT-FILE set the output file name to OUTPUT-FILE + -no-suppress do not suppress compiler output for multiple passes + -prefer-pic try to build PIC objects only + -prefer-non-pic try to build non-PIC objects only + -shared do not build a '.o' file suitable for static linking + -static only build a '.o' file suitable for static linking + -Wc,FLAG pass FLAG directly to the compiler + +COMPILE-COMMAND is a command to be used in creating a 'standard' object file +from the given SOURCEFILE. + +The output file name is determined by removing the directory component from +SOURCEFILE, then substituting the C source code suffix '.c' with the +library object suffix, '.lo'." + ;; + + execute) + $ECHO \ +"Usage: $progname [OPTION]... --mode=execute COMMAND [ARGS]... + +Automatically set library path, then run a program. 
+ +This mode accepts the following additional options: + + -dlopen FILE add the directory containing FILE to the library path + +This mode sets the library path environment variable according to '-dlopen' +flags. + +If any of the ARGS are libtool executable wrappers, then they are translated +into their corresponding uninstalled binary, and any of their required library +directories are added to the library path. + +Then, COMMAND is executed, with ARGS as arguments." + ;; + + finish) + $ECHO \ +"Usage: $progname [OPTION]... --mode=finish [LIBDIR]... + +Complete the installation of libtool libraries. + +Each LIBDIR is a directory that contains libtool libraries. + +The commands that this mode executes may require superuser privileges. Use +the '--dry-run' option if you just want to see what would be executed." + ;; + + install) + $ECHO \ +"Usage: $progname [OPTION]... --mode=install INSTALL-COMMAND... + +Install executables or libraries. + +INSTALL-COMMAND is the installation command. The first component should be +either the 'install' or 'cp' program. + +The following components of INSTALL-COMMAND are treated specially: + + -inst-prefix-dir PREFIX-DIR Use PREFIX-DIR as a staging area for installation + +The rest of the components are interpreted as arguments to that command (only +BSD-compatible install options are recognized)." + ;; + + link) + $ECHO \ +"Usage: $progname [OPTION]... --mode=link LINK-COMMAND... + +Link object files or libraries together to form another library, or to +create an executable program. + +LINK-COMMAND is a command using the C compiler that you would use to create +a program from several object files. + +The following components of LINK-COMMAND are treated specially: + + -all-static do not do any dynamic linking at all + -avoid-version do not add a version suffix if possible + -bindir BINDIR specify path to binaries directory (for systems where + libraries must be found in the PATH setting at runtime) + -dlopen FILE '-dlpreopen' FILE if it cannot be dlopened at runtime + -dlpreopen FILE link in FILE and add its symbols to lt_preloaded_symbols + -export-dynamic allow symbols from OUTPUT-FILE to be resolved with dlsym(3) + -export-symbols SYMFILE + try to export only the symbols listed in SYMFILE + -export-symbols-regex REGEX + try to export only the symbols matching REGEX + -LLIBDIR search LIBDIR for required installed libraries + -lNAME OUTPUT-FILE requires the installed library libNAME + -module build a library that can dlopened + -no-fast-install disable the fast-install mode + -no-install link a not-installable executable + -no-undefined declare that a library does not refer to external symbols + -o OUTPUT-FILE create OUTPUT-FILE from the specified objects + -objectlist FILE use a list of object files found in FILE to specify objects + -os2dllname NAME force a short DLL name on OS/2 (no effect on other OSes) + -precious-files-regex REGEX + don't remove output files matching REGEX + -release RELEASE specify package release information + -rpath LIBDIR the created library will eventually be installed in LIBDIR + -R[ ]LIBDIR add LIBDIR to the runtime path of programs and libraries + -shared only do dynamic linking of libtool libraries + -shrext SUFFIX override the standard shared library file extension + -static do not do any dynamic linking of uninstalled libtool libraries + -static-libtool-libs + do not do any dynamic linking of libtool libraries + -version-info CURRENT[:REVISION[:AGE]] + specify library version info [each variable defaults to 0] + -weak 
LIBNAME declare that the target provides the LIBNAME interface + -Wc,FLAG + -Xcompiler FLAG pass linker-specific FLAG directly to the compiler + -Wl,FLAG + -Xlinker FLAG pass linker-specific FLAG directly to the linker + -XCClinker FLAG pass link-specific FLAG to the compiler driver (CC) + +All other options (arguments beginning with '-') are ignored. + +Every other argument is treated as a filename. Files ending in '.la' are +treated as uninstalled libtool libraries, other files are standard or library +object files. + +If the OUTPUT-FILE ends in '.la', then a libtool library is created, +only library objects ('.lo' files) may be specified, and '-rpath' is +required, except when creating a convenience library. + +If OUTPUT-FILE ends in '.a' or '.lib', then a standard library is created +using 'ar' and 'ranlib', or on Windows using 'lib'. + +If OUTPUT-FILE ends in '.lo' or '.$objext', then a reloadable object file +is created, otherwise an executable program is created." + ;; + + uninstall) + $ECHO \ +"Usage: $progname [OPTION]... --mode=uninstall RM [RM-OPTION]... FILE... + +Remove libraries from an installation directory. + +RM is the name of the program to use to delete files associated with each FILE +(typically '/bin/rm'). RM-OPTIONS are options (such as '-f') to be passed +to RM. + +If FILE is a libtool library, all the files associated with it are deleted. +Otherwise, only FILE itself is deleted using RM." + ;; + + *) + func_fatal_help "invalid operation mode '$opt_mode'" + ;; + esac + + echo + $ECHO "Try '$progname --help' for more information about other modes." +} + +# Now that we've collected a possible --mode arg, show help if necessary +if $opt_help; then + if test : = "$opt_help"; then + func_mode_help + else + { + func_help noexit + for opt_mode in compile link execute install finish uninstall clean; do + func_mode_help + done + } | $SED -n '1p; 2,$s/^Usage:/ or: /p' + { + func_help noexit + for opt_mode in compile link execute install finish uninstall clean; do + echo + func_mode_help + done + } | + $SED '1d + /^When reporting/,/^Report/{ + H + d + } + $x + /information about other modes/d + /more detailed .*MODE/d + s/^Usage:.*--mode=\([^ ]*\) .*/Description of \1 mode:/' + fi + exit $? +fi + + +# func_mode_execute arg... +func_mode_execute () +{ + $debug_cmd + + # The first argument is the command name. + cmd=$nonopt + test -z "$cmd" && \ + func_fatal_help "you must specify a COMMAND" + + # Handle -dlopen flags immediately. + for file in $opt_dlopen; do + test -f "$file" \ + || func_fatal_help "'$file' is not a file" + + dir= + case $file in + *.la) + func_resolve_sysroot "$file" + file=$func_resolve_sysroot_result + + # Check to see that this really is a libtool archive. + func_lalib_unsafe_p "$file" \ + || func_fatal_help "'$lib' is not a valid libtool archive" + + # Read the libtool library. + dlname= + library_names= + func_source "$file" + + # Skip this library if it cannot be dlopened. + if test -z "$dlname"; then + # Warn if it was a shared library. + test -n "$library_names" && \ + func_warning "'$file' was not linked with '-export-dynamic'" + continue + fi + + func_dirname "$file" "" "." + dir=$func_dirname_result + + if test -f "$dir/$objdir/$dlname"; then + func_append dir "/$objdir" + else + if test ! -f "$dir/$dlname"; then + func_fatal_error "cannot find '$dlname' in '$dir' or '$dir/$objdir'" + fi + fi + ;; + + *.lo) + # Just add the directory containing the .lo file. + func_dirname "$file" "" "." 
+ dir=$func_dirname_result + ;; + + *) + func_warning "'-dlopen' is ignored for non-libtool libraries and objects" + continue + ;; + esac + + # Get the absolute pathname. + absdir=`cd "$dir" && pwd` + test -n "$absdir" && dir=$absdir + + # Now add the directory to shlibpath_var. + if eval "test -z \"\$$shlibpath_var\""; then + eval "$shlibpath_var=\"\$dir\"" + else + eval "$shlibpath_var=\"\$dir:\$$shlibpath_var\"" + fi + done + + # This variable tells wrapper scripts just to set shlibpath_var + # rather than running their programs. + libtool_execute_magic=$magic + + # Check if any of the arguments is a wrapper script. + args= + for file + do + case $file in + -* | *.la | *.lo ) ;; + *) + # Do a test to see if this is really a libtool program. + if func_ltwrapper_script_p "$file"; then + func_source "$file" + # Transform arg to wrapped name. + file=$progdir/$program + elif func_ltwrapper_executable_p "$file"; then + func_ltwrapper_scriptname "$file" + func_source "$func_ltwrapper_scriptname_result" + # Transform arg to wrapped name. + file=$progdir/$program + fi + ;; + esac + # Quote arguments (to preserve shell metacharacters). + func_append_quoted args "$file" + done + + if $opt_dry_run; then + # Display what would be done. + if test -n "$shlibpath_var"; then + eval "\$ECHO \"\$shlibpath_var=\$$shlibpath_var\"" + echo "export $shlibpath_var" + fi + $ECHO "$cmd$args" + exit $EXIT_SUCCESS + else + if test -n "$shlibpath_var"; then + # Export the shlibpath_var. + eval "export $shlibpath_var" + fi + + # Restore saved environment variables + for lt_var in LANG LANGUAGE LC_ALL LC_CTYPE LC_COLLATE LC_MESSAGES + do + eval "if test \"\${save_$lt_var+set}\" = set; then + $lt_var=\$save_$lt_var; export $lt_var + else + $lt_unset $lt_var + fi" + done + + # Now prepare to actually exec the command. + exec_cmd=\$cmd$args + fi +} + +test execute = "$opt_mode" && func_mode_execute ${1+"$@"} + + +# func_mode_finish arg... +func_mode_finish () +{ + $debug_cmd + + libs= + libdirs= + admincmds= + + for opt in "$nonopt" ${1+"$@"} + do + if test -d "$opt"; then + func_append libdirs " $opt" + + elif test -f "$opt"; then + if func_lalib_unsafe_p "$opt"; then + func_append libs " $opt" + else + func_warning "'$opt' is not a valid libtool archive" + fi + + else + func_fatal_error "invalid argument '$opt'" + fi + done + + if test -n "$libs"; then + if test -n "$lt_sysroot"; then + sysroot_regex=`$ECHO "$lt_sysroot" | $SED "$sed_make_literal_regex"` + sysroot_cmd="s/\([ ']\)$sysroot_regex/\1/g;" + else + sysroot_cmd= + fi + + # Remove sysroot references + if $opt_dry_run; then + for lib in $libs; do + echo "removing references to $lt_sysroot and '=' prefixes from $lib" + done + else + tmpdir=`func_mktempdir` + for lib in $libs; do + $SED -e "$sysroot_cmd s/\([ ']-[LR]\)=/\1/g; s/\([ ']\)=/\1/g" $lib \ + > $tmpdir/tmp-la + mv -f $tmpdir/tmp-la $lib + done + ${RM}r "$tmpdir" + fi + fi + + if test -n "$finish_cmds$finish_eval" && test -n "$libdirs"; then + for libdir in $libdirs; do + if test -n "$finish_cmds"; then + # Do each command in the finish commands. + func_execute_cmds "$finish_cmds" 'admincmds="$admincmds +'"$cmd"'"' + fi + if test -n "$finish_eval"; then + # Do the single finish_eval. + eval cmds=\"$finish_eval\" + $opt_dry_run || eval "$cmds" || func_append admincmds " + $cmds" + fi + done + fi + + # Exit here if they wanted silent mode. 
+ $opt_quiet && exit $EXIT_SUCCESS + + if test -n "$finish_cmds$finish_eval" && test -n "$libdirs"; then + echo "----------------------------------------------------------------------" + echo "Libraries have been installed in:" + for libdir in $libdirs; do + $ECHO " $libdir" + done + echo + echo "If you ever happen to want to link against installed libraries" + echo "in a given directory, LIBDIR, you must either use libtool, and" + echo "specify the full pathname of the library, or use the '-LLIBDIR'" + echo "flag during linking and do at least one of the following:" + if test -n "$shlibpath_var"; then + echo " - add LIBDIR to the '$shlibpath_var' environment variable" + echo " during execution" + fi + if test -n "$runpath_var"; then + echo " - add LIBDIR to the '$runpath_var' environment variable" + echo " during linking" + fi + if test -n "$hardcode_libdir_flag_spec"; then + libdir=LIBDIR + eval flag=\"$hardcode_libdir_flag_spec\" + + $ECHO " - use the '$flag' linker flag" + fi + if test -n "$admincmds"; then + $ECHO " - have your system administrator run these commands:$admincmds" + fi + if test -f /etc/ld.so.conf; then + echo " - have your system administrator add LIBDIR to '/etc/ld.so.conf'" + fi + echo + + echo "See any operating system documentation about shared libraries for" + case $host in + solaris2.[6789]|solaris2.1[0-9]) + echo "more information, such as the ld(1), crle(1) and ld.so(8) manual" + echo "pages." + ;; + *) + echo "more information, such as the ld(1) and ld.so(8) manual pages." + ;; + esac + echo "----------------------------------------------------------------------" + fi + exit $EXIT_SUCCESS +} + +test finish = "$opt_mode" && func_mode_finish ${1+"$@"} + + +# func_mode_install arg... +func_mode_install () +{ + $debug_cmd + + # There may be an optional sh(1) argument at the beginning of + # install_prog (especially on Windows NT). + if test "$SHELL" = "$nonopt" || test /bin/sh = "$nonopt" || + # Allow the use of GNU shtool's install command. + case $nonopt in *shtool*) :;; *) false;; esac + then + # Aesthetically quote it. + func_quote_for_eval "$nonopt" + install_prog="$func_quote_for_eval_result " + arg=$1 + shift + else + install_prog= + arg=$nonopt + fi + + # The real first argument should be the name of the installation program. + # Aesthetically quote it. + func_quote_for_eval "$arg" + func_append install_prog "$func_quote_for_eval_result" + install_shared_prog=$install_prog + case " $install_prog " in + *[\\\ /]cp\ *) install_cp=: ;; + *) install_cp=false ;; + esac + + # We need to accept at least all the BSD install flags. + dest= + files= + opts= + prev= + install_type= + isdir=false + stripme= + no_mode=: + for arg + do + arg2= + if test -n "$dest"; then + func_append files " $dest" + dest=$arg + continue + fi + + case $arg in + -d) isdir=: ;; + -f) + if $install_cp; then :; else + prev=$arg + fi + ;; + -g | -m | -o) + prev=$arg + ;; + -s) + stripme=" -s" + continue + ;; + -*) + ;; + *) + # If the previous option needed an argument, then skip it. + if test -n "$prev"; then + if test X-m = "X$prev" && test -n "$install_override_mode"; then + arg2=$install_override_mode + no_mode=false + fi + prev= + else + dest=$arg + continue + fi + ;; + esac + + # Aesthetically quote the argument. 
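+      # Descriptive note (added for clarity): $install_prog always receives
+      # the argument as given, while $install_shared_prog receives
+      # $install_override_mode in place of a user-supplied '-m MODE' value
+      # whenever an override mode is configured.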
+ func_quote_for_eval "$arg" + func_append install_prog " $func_quote_for_eval_result" + if test -n "$arg2"; then + func_quote_for_eval "$arg2" + fi + func_append install_shared_prog " $func_quote_for_eval_result" + done + + test -z "$install_prog" && \ + func_fatal_help "you must specify an install program" + + test -n "$prev" && \ + func_fatal_help "the '$prev' option requires an argument" + + if test -n "$install_override_mode" && $no_mode; then + if $install_cp; then :; else + func_quote_for_eval "$install_override_mode" + func_append install_shared_prog " -m $func_quote_for_eval_result" + fi + fi + + if test -z "$files"; then + if test -z "$dest"; then + func_fatal_help "no file or destination specified" + else + func_fatal_help "you must specify a destination" + fi + fi + + # Strip any trailing slash from the destination. + func_stripname '' '/' "$dest" + dest=$func_stripname_result + + # Check to see that the destination is a directory. + test -d "$dest" && isdir=: + if $isdir; then + destdir=$dest + destname= + else + func_dirname_and_basename "$dest" "" "." + destdir=$func_dirname_result + destname=$func_basename_result + + # Not a directory, so check to see that there is only one file specified. + set dummy $files; shift + test "$#" -gt 1 && \ + func_fatal_help "'$dest' is not a directory" + fi + case $destdir in + [\\/]* | [A-Za-z]:[\\/]*) ;; + *) + for file in $files; do + case $file in + *.lo) ;; + *) + func_fatal_help "'$destdir' must be an absolute directory name" + ;; + esac + done + ;; + esac + + # This variable tells wrapper scripts just to set variables rather + # than running their programs. + libtool_install_magic=$magic + + staticlibs= + future_libdirs= + current_libdirs= + for file in $files; do + + # Do each installation. + case $file in + *.$libext) + # Do the static libraries later. + func_append staticlibs " $file" + ;; + + *.la) + func_resolve_sysroot "$file" + file=$func_resolve_sysroot_result + + # Check to see that this really is a libtool archive. + func_lalib_unsafe_p "$file" \ + || func_fatal_help "'$file' is not a valid libtool archive" + + library_names= + old_library= + relink_command= + func_source "$file" + + # Add the libdir to current_libdirs if it is the destination. + if test "X$destdir" = "X$libdir"; then + case "$current_libdirs " in + *" $libdir "*) ;; + *) func_append current_libdirs " $libdir" ;; + esac + else + # Note the libdir as a future libdir. + case "$future_libdirs " in + *" $libdir "*) ;; + *) func_append future_libdirs " $libdir" ;; + esac + fi + + func_dirname "$file" "/" "" + dir=$func_dirname_result + func_append dir "$objdir" + + if test -n "$relink_command"; then + # Determine the prefix the user has applied to our future dir. + inst_prefix_dir=`$ECHO "$destdir" | $SED -e "s%$libdir\$%%"` + + # Don't allow the user to place us outside of our expected + # location b/c this prevents finding dependent libraries that + # are installed to the same prefix. + # At present, this check doesn't affect windows .dll's that + # are installed into $libdir/../bin (currently, that works fine) + # but it's something to keep an eye on. + test "$inst_prefix_dir" = "$destdir" && \ + func_fatal_error "error: cannot install '$file' to a directory not ending in $libdir" + + if test -n "$inst_prefix_dir"; then + # Stick the inst_prefix_dir data into the link command. 
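+          # Descriptive note (added for clarity; the staging path is a made-up
+          # example): the @inst_prefix_dir@ placeholder in the stored relink
+          # command is rewritten here, e.g. for inst_prefix_dir=/var/tmp/stage
+          # it becomes "-inst-prefix-dir /var/tmp/stage".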
+ relink_command=`$ECHO "$relink_command" | $SED "s%@inst_prefix_dir@%-inst-prefix-dir $inst_prefix_dir%"` + else + relink_command=`$ECHO "$relink_command" | $SED "s%@inst_prefix_dir@%%"` + fi + + func_warning "relinking '$file'" + func_show_eval "$relink_command" \ + 'func_fatal_error "error: relink '\''$file'\'' with the above command before installing it"' + fi + + # See the names of the shared library. + set dummy $library_names; shift + if test -n "$1"; then + realname=$1 + shift + + srcname=$realname + test -n "$relink_command" && srcname=${realname}T + + # Install the shared library and build the symlinks. + func_show_eval "$install_shared_prog $dir/$srcname $destdir/$realname" \ + 'exit $?' + tstripme=$stripme + case $host_os in + cygwin* | mingw* | pw32* | cegcc*) + case $realname in + *.dll.a) + tstripme= + ;; + esac + ;; + os2*) + case $realname in + *_dll.a) + tstripme= + ;; + esac + ;; + esac + if test -n "$tstripme" && test -n "$striplib"; then + func_show_eval "$striplib $destdir/$realname" 'exit $?' + fi + + if test "$#" -gt 0; then + # Delete the old symlinks, and create new ones. + # Try 'ln -sf' first, because the 'ln' binary might depend on + # the symlink we replace! Solaris /bin/ln does not understand -f, + # so we also need to try rm && ln -s. + for linkname + do + test "$linkname" != "$realname" \ + && func_show_eval "(cd $destdir && { $LN_S -f $realname $linkname || { $RM $linkname && $LN_S $realname $linkname; }; })" + done + fi + + # Do each command in the postinstall commands. + lib=$destdir/$realname + func_execute_cmds "$postinstall_cmds" 'exit $?' + fi + + # Install the pseudo-library for information purposes. + func_basename "$file" + name=$func_basename_result + instname=$dir/${name}i + func_show_eval "$install_prog $instname $destdir/$name" 'exit $?' + + # Maybe install the static library, too. + test -n "$old_library" && func_append staticlibs " $dir/$old_library" + ;; + + *.lo) + # Install (i.e. copy) a libtool object. + + # Figure out destination file name, if it wasn't already specified. + if test -n "$destname"; then + destfile=$destdir/$destname + else + func_basename "$file" + destfile=$func_basename_result + destfile=$destdir/$destfile + fi + + # Deduce the name of the destination old-style object file. + case $destfile in + *.lo) + func_lo2o "$destfile" + staticdest=$func_lo2o_result + ;; + *.$objext) + staticdest=$destfile + destfile= + ;; + *) + func_fatal_help "cannot copy a libtool object to '$destfile'" + ;; + esac + + # Install the libtool object if requested. + test -n "$destfile" && \ + func_show_eval "$install_prog $file $destfile" 'exit $?' + + # Install the old object if enabled. + if test yes = "$build_old_libs"; then + # Deduce the name of the old-style object file. + func_lo2o "$file" + staticobj=$func_lo2o_result + func_show_eval "$install_prog \$staticobj \$staticdest" 'exit $?' + fi + exit $EXIT_SUCCESS + ;; + + *) + # Figure out destination file name, if it wasn't already specified. + if test -n "$destname"; then + destfile=$destdir/$destname + else + func_basename "$file" + destfile=$func_basename_result + destfile=$destdir/$destfile + fi + + # If the file is missing, and there is a .exe on the end, strip it + # because it is most likely a libtool script we actually want to + # install + stripped_ext= + case $file in + *.exe) + if test ! -f "$file"; then + func_stripname '' '.exe' "$file" + file=$func_stripname_result + stripped_ext=.exe + fi + ;; + esac + + # Do a test to see if this is really a libtool program. 
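+      # Descriptive note (added for clarity): on cygwin/mingw hosts the file
+      # to inspect is the libtool wrapper script that accompanies the .exe
+      # (or the name with .exe stripped); on other hosts the file itself is
+      # tested with func_ltwrapper_script_p below.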
+ case $host in + *cygwin* | *mingw*) + if func_ltwrapper_executable_p "$file"; then + func_ltwrapper_scriptname "$file" + wrapper=$func_ltwrapper_scriptname_result + else + func_stripname '' '.exe' "$file" + wrapper=$func_stripname_result + fi + ;; + *) + wrapper=$file + ;; + esac + if func_ltwrapper_script_p "$wrapper"; then + notinst_deplibs= + relink_command= + + func_source "$wrapper" + + # Check the variables that should have been set. + test -z "$generated_by_libtool_version" && \ + func_fatal_error "invalid libtool wrapper script '$wrapper'" + + finalize=: + for lib in $notinst_deplibs; do + # Check to see that each library is installed. + libdir= + if test -f "$lib"; then + func_source "$lib" + fi + libfile=$libdir/`$ECHO "$lib" | $SED 's%^.*/%%g'` + if test -n "$libdir" && test ! -f "$libfile"; then + func_warning "'$lib' has not been installed in '$libdir'" + finalize=false + fi + done + + relink_command= + func_source "$wrapper" + + outputname= + if test no = "$fast_install" && test -n "$relink_command"; then + $opt_dry_run || { + if $finalize; then + tmpdir=`func_mktempdir` + func_basename "$file$stripped_ext" + file=$func_basename_result + outputname=$tmpdir/$file + # Replace the output file specification. + relink_command=`$ECHO "$relink_command" | $SED 's%@OUTPUT@%'"$outputname"'%g'` + + $opt_quiet || { + func_quote_for_expand "$relink_command" + eval "func_echo $func_quote_for_expand_result" + } + if eval "$relink_command"; then : + else + func_error "error: relink '$file' with the above command before installing it" + $opt_dry_run || ${RM}r "$tmpdir" + continue + fi + file=$outputname + else + func_warning "cannot relink '$file'" + fi + } + else + # Install the binary that we compiled earlier. + file=`$ECHO "$file$stripped_ext" | $SED "s%\([^/]*\)$%$objdir/\1%"` + fi + fi + + # remove .exe since cygwin /usr/bin/install will append another + # one anyway + case $install_prog,$host in + */usr/bin/install*,*cygwin*) + case $file:$destfile in + *.exe:*.exe) + # this is ok + ;; + *.exe:*) + destfile=$destfile.exe + ;; + *:*.exe) + func_stripname '' '.exe' "$destfile" + destfile=$func_stripname_result + ;; + esac + ;; + esac + func_show_eval "$install_prog\$stripme \$file \$destfile" 'exit $?' + $opt_dry_run || if test -n "$outputname"; then + ${RM}r "$tmpdir" + fi + ;; + esac + done + + for file in $staticlibs; do + func_basename "$file" + name=$func_basename_result + + # Set up the ranlib parameters. + oldlib=$destdir/$name + func_to_tool_file "$oldlib" func_convert_file_msys_to_w32 + tool_oldlib=$func_to_tool_file_result + + func_show_eval "$install_prog \$file \$oldlib" 'exit $?' + + if test -n "$stripme" && test -n "$old_striplib"; then + func_show_eval "$old_striplib $tool_oldlib" 'exit $?' + fi + + # Do each command in the postinstall commands. + func_execute_cmds "$old_postinstall_cmds" 'exit $?' + done + + test -n "$future_libdirs" && \ + func_warning "remember to run '$progname --finish$future_libdirs'" + + if test -n "$current_libdirs"; then + # Maybe just do a dry run. + $opt_dry_run && current_libdirs=" -n$current_libdirs" + exec_cmd='$SHELL "$progpath" $preserve_args --finish$current_libdirs' + else + exit $EXIT_SUCCESS + fi +} + +test install = "$opt_mode" && func_mode_install ${1+"$@"} + + +# func_generate_dlsyms outputname originator pic_p +# Extract symbols from dlprefiles and create ${outputname}S.o with +# a dlpreopen symbol table. 
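+# Illustrative example only (the argument values are hypothetical): a call
+# such as
+#   func_generate_dlsyms "myprog" "@PROGRAM@" false
+# writes $output_objdir/myprogS.c with an lt__PROGRAM__LTX_preloaded_symbols
+# table and compiles it to myprogS.$objext, provided dlpreopened files are
+# present (or $dlself is enabled) and $NM/$global_symbol_pipe are configured.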
+func_generate_dlsyms () +{ + $debug_cmd + + my_outputname=$1 + my_originator=$2 + my_pic_p=${3-false} + my_prefix=`$ECHO "$my_originator" | $SED 's%[^a-zA-Z0-9]%_%g'` + my_dlsyms= + + if test -n "$dlfiles$dlprefiles" || test no != "$dlself"; then + if test -n "$NM" && test -n "$global_symbol_pipe"; then + my_dlsyms=${my_outputname}S.c + else + func_error "not configured to extract global symbols from dlpreopened files" + fi + fi + + if test -n "$my_dlsyms"; then + case $my_dlsyms in + "") ;; + *.c) + # Discover the nlist of each of the dlfiles. + nlist=$output_objdir/$my_outputname.nm + + func_show_eval "$RM $nlist ${nlist}S ${nlist}T" + + # Parse the name list into a source file. + func_verbose "creating $output_objdir/$my_dlsyms" + + $opt_dry_run || $ECHO > "$output_objdir/$my_dlsyms" "\ +/* $my_dlsyms - symbol resolution table for '$my_outputname' dlsym emulation. */ +/* Generated by $PROGRAM (GNU $PACKAGE) $VERSION */ + +#ifdef __cplusplus +extern \"C\" { +#endif + +#if defined __GNUC__ && (((__GNUC__ == 4) && (__GNUC_MINOR__ >= 4)) || (__GNUC__ > 4)) +#pragma GCC diagnostic ignored \"-Wstrict-prototypes\" +#endif + +/* Keep this code in sync between libtool.m4, ltmain, lt_system.h, and tests. */ +#if defined _WIN32 || defined __CYGWIN__ || defined _WIN32_WCE +/* DATA imports from DLLs on WIN32 can't be const, because runtime + relocations are performed -- see ld's documentation on pseudo-relocs. */ +# define LT_DLSYM_CONST +#elif defined __osf__ +/* This system does not cope well with relocations in const data. */ +# define LT_DLSYM_CONST +#else +# define LT_DLSYM_CONST const +#endif + +#define STREQ(s1, s2) (strcmp ((s1), (s2)) == 0) + +/* External symbol declarations for the compiler. */\ +" + + if test yes = "$dlself"; then + func_verbose "generating symbol list for '$output'" + + $opt_dry_run || echo ': @PROGRAM@ ' > "$nlist" + + # Add our own program objects to the symbol list. 
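+          # Descriptive note (added for clarity): $objs and $old_deplibs are
+          # mapped from .lo to .o names via $lo2o, then each object's symbols
+          # are piped through $NM and $global_symbol_pipe into $nlist.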
+ progfiles=`$ECHO "$objs$old_deplibs" | $SP2NL | $SED "$lo2o" | $NL2SP` + for progfile in $progfiles; do + func_to_tool_file "$progfile" func_convert_file_msys_to_w32 + func_verbose "extracting global C symbols from '$func_to_tool_file_result'" + $opt_dry_run || eval "$NM $func_to_tool_file_result | $global_symbol_pipe >> '$nlist'" + done + + if test -n "$exclude_expsyms"; then + $opt_dry_run || { + eval '$EGREP -v " ($exclude_expsyms)$" "$nlist" > "$nlist"T' + eval '$MV "$nlist"T "$nlist"' + } + fi + + if test -n "$export_symbols_regex"; then + $opt_dry_run || { + eval '$EGREP -e "$export_symbols_regex" "$nlist" > "$nlist"T' + eval '$MV "$nlist"T "$nlist"' + } + fi + + # Prepare the list of exported symbols + if test -z "$export_symbols"; then + export_symbols=$output_objdir/$outputname.exp + $opt_dry_run || { + $RM $export_symbols + eval "$SED -n -e '/^: @PROGRAM@ $/d' -e 's/^.* \(.*\)$/\1/p' "'< "$nlist" > "$export_symbols"' + case $host in + *cygwin* | *mingw* | *cegcc* ) + eval "echo EXPORTS "'> "$output_objdir/$outputname.def"' + eval 'cat "$export_symbols" >> "$output_objdir/$outputname.def"' + ;; + esac + } + else + $opt_dry_run || { + eval "$SED -e 's/\([].[*^$]\)/\\\\\1/g' -e 's/^/ /' -e 's/$/$/'"' < "$export_symbols" > "$output_objdir/$outputname.exp"' + eval '$GREP -f "$output_objdir/$outputname.exp" < "$nlist" > "$nlist"T' + eval '$MV "$nlist"T "$nlist"' + case $host in + *cygwin* | *mingw* | *cegcc* ) + eval "echo EXPORTS "'> "$output_objdir/$outputname.def"' + eval 'cat "$nlist" >> "$output_objdir/$outputname.def"' + ;; + esac + } + fi + fi + + for dlprefile in $dlprefiles; do + func_verbose "extracting global C symbols from '$dlprefile'" + func_basename "$dlprefile" + name=$func_basename_result + case $host in + *cygwin* | *mingw* | *cegcc* ) + # if an import library, we need to obtain dlname + if func_win32_import_lib_p "$dlprefile"; then + func_tr_sh "$dlprefile" + eval "curr_lafile=\$libfile_$func_tr_sh_result" + dlprefile_dlbasename= + if test -n "$curr_lafile" && func_lalib_p "$curr_lafile"; then + # Use subshell, to avoid clobbering current variable values + dlprefile_dlname=`source "$curr_lafile" && echo "$dlname"` + if test -n "$dlprefile_dlname"; then + func_basename "$dlprefile_dlname" + dlprefile_dlbasename=$func_basename_result + else + # no lafile. user explicitly requested -dlpreopen . + $sharedlib_from_linklib_cmd "$dlprefile" + dlprefile_dlbasename=$sharedlib_from_linklib_result + fi + fi + $opt_dry_run || { + if test -n "$dlprefile_dlbasename"; then + eval '$ECHO ": $dlprefile_dlbasename" >> "$nlist"' + else + func_warning "Could not compute DLL name from $name" + eval '$ECHO ": $name " >> "$nlist"' + fi + func_to_tool_file "$dlprefile" func_convert_file_msys_to_w32 + eval "$NM \"$func_to_tool_file_result\" 2>/dev/null | $global_symbol_pipe | + $SED -e '/I __imp/d' -e 's/I __nm_/D /;s/_nm__//' >> '$nlist'" + } + else # not an import lib + $opt_dry_run || { + eval '$ECHO ": $name " >> "$nlist"' + func_to_tool_file "$dlprefile" func_convert_file_msys_to_w32 + eval "$NM \"$func_to_tool_file_result\" 2>/dev/null | $global_symbol_pipe >> '$nlist'" + } + fi + ;; + *) + $opt_dry_run || { + eval '$ECHO ": $name " >> "$nlist"' + func_to_tool_file "$dlprefile" func_convert_file_msys_to_w32 + eval "$NM \"$func_to_tool_file_result\" 2>/dev/null | $global_symbol_pipe >> '$nlist'" + } + ;; + esac + done + + $opt_dry_run || { + # Make sure we have at least an empty file. 
+ test -f "$nlist" || : > "$nlist" + + if test -n "$exclude_expsyms"; then + $EGREP -v " ($exclude_expsyms)$" "$nlist" > "$nlist"T + $MV "$nlist"T "$nlist" + fi + + # Try sorting and uniquifying the output. + if $GREP -v "^: " < "$nlist" | + if sort -k 3 /dev/null 2>&1; then + sort -k 3 + else + sort +2 + fi | + uniq > "$nlist"S; then + : + else + $GREP -v "^: " < "$nlist" > "$nlist"S + fi + + if test -f "$nlist"S; then + eval "$global_symbol_to_cdecl"' < "$nlist"S >> "$output_objdir/$my_dlsyms"' + else + echo '/* NONE */' >> "$output_objdir/$my_dlsyms" + fi + + func_show_eval '$RM "${nlist}I"' + if test -n "$global_symbol_to_import"; then + eval "$global_symbol_to_import"' < "$nlist"S > "$nlist"I' + fi + + echo >> "$output_objdir/$my_dlsyms" "\ + +/* The mapping between symbol names and symbols. */ +typedef struct { + const char *name; + void *address; +} lt_dlsymlist; +extern LT_DLSYM_CONST lt_dlsymlist +lt_${my_prefix}_LTX_preloaded_symbols[];\ +" + + if test -s "$nlist"I; then + echo >> "$output_objdir/$my_dlsyms" "\ +static void lt_syminit(void) +{ + LT_DLSYM_CONST lt_dlsymlist *symbol = lt_${my_prefix}_LTX_preloaded_symbols; + for (; symbol->name; ++symbol) + {" + $SED 's/.*/ if (STREQ (symbol->name, \"&\")) symbol->address = (void *) \&&;/' < "$nlist"I >> "$output_objdir/$my_dlsyms" + echo >> "$output_objdir/$my_dlsyms" "\ + } +}" + fi + echo >> "$output_objdir/$my_dlsyms" "\ +LT_DLSYM_CONST lt_dlsymlist +lt_${my_prefix}_LTX_preloaded_symbols[] = +{ {\"$my_originator\", (void *) 0}," + + if test -s "$nlist"I; then + echo >> "$output_objdir/$my_dlsyms" "\ + {\"@INIT@\", (void *) <_syminit}," + fi + + case $need_lib_prefix in + no) + eval "$global_symbol_to_c_name_address" < "$nlist" >> "$output_objdir/$my_dlsyms" + ;; + *) + eval "$global_symbol_to_c_name_address_lib_prefix" < "$nlist" >> "$output_objdir/$my_dlsyms" + ;; + esac + echo >> "$output_objdir/$my_dlsyms" "\ + {0, (void *) 0} +}; + +/* This works around a problem in FreeBSD linker */ +#ifdef FREEBSD_WORKAROUND +static const void *lt_preloaded_setup() { + return lt_${my_prefix}_LTX_preloaded_symbols; +} +#endif + +#ifdef __cplusplus +} +#endif\ +" + } # !$opt_dry_run + + pic_flag_for_symtable= + case "$compile_command " in + *" -static "*) ;; + *) + case $host in + # compiling the symbol table file with pic_flag works around + # a FreeBSD bug that causes programs to crash when -lm is + # linked before any other PIC object. But we must not use + # pic_flag when linking with -static. The problem exists in + # FreeBSD 2.2.6 and is fixed in FreeBSD 3.1. + *-*-freebsd2.*|*-*-freebsd3.0*|*-*-freebsdelf3.0*) + pic_flag_for_symtable=" $pic_flag -DFREEBSD_WORKAROUND" ;; + *-*-hpux*) + pic_flag_for_symtable=" $pic_flag" ;; + *) + $my_pic_p && pic_flag_for_symtable=" $pic_flag" + ;; + esac + ;; + esac + symtab_cflags= + for arg in $LTCFLAGS; do + case $arg in + -pie | -fpie | -fPIE) ;; + *) func_append symtab_cflags " $arg" ;; + esac + done + + # Now compile the dynamic symbol file. + func_show_eval '(cd $output_objdir && $LTCC$symtab_cflags -c$no_builtin_flag$pic_flag_for_symtable "$my_dlsyms")' 'exit $?' + + # Clean up the generated files. + func_show_eval '$RM "$output_objdir/$my_dlsyms" "$nlist" "${nlist}S" "${nlist}T" "${nlist}I"' + + # Transform the symbol file into the correct name. 
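+      # Descriptive note (added for clarity): the @SYMFILE@ placeholder in
+      # $compile_command and $finalize_command is replaced below with the
+      # compiled symbol-table object (plus the generated .def file on
+      # cygwin/mingw/cegcc hosts).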
+ symfileobj=$output_objdir/${my_outputname}S.$objext + case $host in + *cygwin* | *mingw* | *cegcc* ) + if test -f "$output_objdir/$my_outputname.def"; then + compile_command=`$ECHO "$compile_command" | $SED "s%@SYMFILE@%$output_objdir/$my_outputname.def $symfileobj%"` + finalize_command=`$ECHO "$finalize_command" | $SED "s%@SYMFILE@%$output_objdir/$my_outputname.def $symfileobj%"` + else + compile_command=`$ECHO "$compile_command" | $SED "s%@SYMFILE@%$symfileobj%"` + finalize_command=`$ECHO "$finalize_command" | $SED "s%@SYMFILE@%$symfileobj%"` + fi + ;; + *) + compile_command=`$ECHO "$compile_command" | $SED "s%@SYMFILE@%$symfileobj%"` + finalize_command=`$ECHO "$finalize_command" | $SED "s%@SYMFILE@%$symfileobj%"` + ;; + esac + ;; + *) + func_fatal_error "unknown suffix for '$my_dlsyms'" + ;; + esac + else + # We keep going just in case the user didn't refer to + # lt_preloaded_symbols. The linker will fail if global_symbol_pipe + # really was required. + + # Nullify the symbol file. + compile_command=`$ECHO "$compile_command" | $SED "s% @SYMFILE@%%"` + finalize_command=`$ECHO "$finalize_command" | $SED "s% @SYMFILE@%%"` + fi +} + +# func_cygming_gnu_implib_p ARG +# This predicate returns with zero status (TRUE) if +# ARG is a GNU/binutils-style import library. Returns +# with nonzero status (FALSE) otherwise. +func_cygming_gnu_implib_p () +{ + $debug_cmd + + func_to_tool_file "$1" func_convert_file_msys_to_w32 + func_cygming_gnu_implib_tmp=`$NM "$func_to_tool_file_result" | eval "$global_symbol_pipe" | $EGREP ' (_head_[A-Za-z0-9_]+_[ad]l*|[A-Za-z0-9_]+_[ad]l*_iname)$'` + test -n "$func_cygming_gnu_implib_tmp" +} + +# func_cygming_ms_implib_p ARG +# This predicate returns with zero status (TRUE) if +# ARG is an MS-style import library. Returns +# with nonzero status (FALSE) otherwise. +func_cygming_ms_implib_p () +{ + $debug_cmd + + func_to_tool_file "$1" func_convert_file_msys_to_w32 + func_cygming_ms_implib_tmp=`$NM "$func_to_tool_file_result" | eval "$global_symbol_pipe" | $GREP '_NULL_IMPORT_DESCRIPTOR'` + test -n "$func_cygming_ms_implib_tmp" +} + +# func_win32_libid arg +# return the library type of file 'arg' +# +# Need a lot of goo to handle *both* DLLs and import libs +# Has to be a shell function in order to 'eat' the argument +# that is supplied when $file_magic_command is called. +# Despite the name, also deal with 64 bit binaries. +func_win32_libid () +{ + $debug_cmd + + win32_libid_type=unknown + win32_fileres=`file -L $1 2>/dev/null` + case $win32_fileres in + *ar\ archive\ import\ library*) # definitely import + win32_libid_type="x86 archive import" + ;; + *ar\ archive*) # could be an import, or static + # Keep the egrep pattern in sync with the one in _LT_CHECK_MAGIC_METHOD. + if eval $OBJDUMP -f $1 | $SED -e '10q' 2>/dev/null | + $EGREP 'file format (pei*-i386(.*architecture: i386)?|pe-arm-wince|pe-x86-64)' >/dev/null; then + case $nm_interface in + "MS dumpbin") + if func_cygming_ms_implib_p "$1" || + func_cygming_gnu_implib_p "$1" + then + win32_nmres=import + else + win32_nmres= + fi + ;; + *) + func_to_tool_file "$1" func_convert_file_msys_to_w32 + win32_nmres=`eval $NM -f posix -A \"$func_to_tool_file_result\" | + $SED -n -e ' + 1,100{ + / I /{ + s|.*|import| + p + q + } + }'` + ;; + esac + case $win32_nmres in + import*) win32_libid_type="x86 archive import";; + *) win32_libid_type="x86 archive static";; + esac + fi + ;; + *DLL*) + win32_libid_type="x86 DLL" + ;; + *executable*) # but shell scripts are "executable" too... 
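+    # Descriptive note (added for clarity): only files that 'file' reports as
+    # "MS Windows PE Intel" are classified as DLLs here; other "executable"
+    # matches (e.g. shell scripts) leave win32_libid_type unchanged.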
+ case $win32_fileres in + *MS\ Windows\ PE\ Intel*) + win32_libid_type="x86 DLL" + ;; + esac + ;; + esac + $ECHO "$win32_libid_type" +} + +# func_cygming_dll_for_implib ARG +# +# Platform-specific function to extract the +# name of the DLL associated with the specified +# import library ARG. +# Invoked by eval'ing the libtool variable +# $sharedlib_from_linklib_cmd +# Result is available in the variable +# $sharedlib_from_linklib_result +func_cygming_dll_for_implib () +{ + $debug_cmd + + sharedlib_from_linklib_result=`$DLLTOOL --identify-strict --identify "$1"` +} + +# func_cygming_dll_for_implib_fallback_core SECTION_NAME LIBNAMEs +# +# The is the core of a fallback implementation of a +# platform-specific function to extract the name of the +# DLL associated with the specified import library LIBNAME. +# +# SECTION_NAME is either .idata$6 or .idata$7, depending +# on the platform and compiler that created the implib. +# +# Echos the name of the DLL associated with the +# specified import library. +func_cygming_dll_for_implib_fallback_core () +{ + $debug_cmd + + match_literal=`$ECHO "$1" | $SED "$sed_make_literal_regex"` + $OBJDUMP -s --section "$1" "$2" 2>/dev/null | + $SED '/^Contents of section '"$match_literal"':/{ + # Place marker at beginning of archive member dllname section + s/.*/====MARK====/ + p + d + } + # These lines can sometimes be longer than 43 characters, but + # are always uninteresting + /:[ ]*file format pe[i]\{,1\}-/d + /^In archive [^:]*:/d + # Ensure marker is printed + /^====MARK====/p + # Remove all lines with less than 43 characters + /^.\{43\}/!d + # From remaining lines, remove first 43 characters + s/^.\{43\}//' | + $SED -n ' + # Join marker and all lines until next marker into a single line + /^====MARK====/ b para + H + $ b para + b + :para + x + s/\n//g + # Remove the marker + s/^====MARK====// + # Remove trailing dots and whitespace + s/[\. \t]*$// + # Print + /./p' | + # we now have a list, one entry per line, of the stringified + # contents of the appropriate section of all members of the + # archive that possess that section. Heuristic: eliminate + # all those that have a first or second character that is + # a '.' (that is, objdump's representation of an unprintable + # character.) This should work for all archives with less than + # 0x302f exports -- but will fail for DLLs whose name actually + # begins with a literal '.' or a single character followed by + # a '.'. + # + # Of those that remain, print the first one. + $SED -e '/^\./d;/^.\./d;q' +} + +# func_cygming_dll_for_implib_fallback ARG +# Platform-specific function to extract the +# name of the DLL associated with the specified +# import library ARG. +# +# This fallback implementation is for use when $DLLTOOL +# does not support the --identify-strict option. 
+# Invoked by eval'ing the libtool variable +# $sharedlib_from_linklib_cmd +# Result is available in the variable +# $sharedlib_from_linklib_result +func_cygming_dll_for_implib_fallback () +{ + $debug_cmd + + if func_cygming_gnu_implib_p "$1"; then + # binutils import library + sharedlib_from_linklib_result=`func_cygming_dll_for_implib_fallback_core '.idata$7' "$1"` + elif func_cygming_ms_implib_p "$1"; then + # ms-generated import library + sharedlib_from_linklib_result=`func_cygming_dll_for_implib_fallback_core '.idata$6' "$1"` + else + # unknown + sharedlib_from_linklib_result= + fi +} + + +# func_extract_an_archive dir oldlib +func_extract_an_archive () +{ + $debug_cmd + + f_ex_an_ar_dir=$1; shift + f_ex_an_ar_oldlib=$1 + if test yes = "$lock_old_archive_extraction"; then + lockfile=$f_ex_an_ar_oldlib.lock + until $opt_dry_run || ln "$progpath" "$lockfile" 2>/dev/null; do + func_echo "Waiting for $lockfile to be removed" + sleep 2 + done + fi + func_show_eval "(cd \$f_ex_an_ar_dir && $AR x \"\$f_ex_an_ar_oldlib\")" \ + 'stat=$?; rm -f "$lockfile"; exit $stat' + if test yes = "$lock_old_archive_extraction"; then + $opt_dry_run || rm -f "$lockfile" + fi + if ($AR t "$f_ex_an_ar_oldlib" | sort | sort -uc >/dev/null 2>&1); then + : + else + func_fatal_error "object name conflicts in archive: $f_ex_an_ar_dir/$f_ex_an_ar_oldlib" + fi +} + + +# func_extract_archives gentop oldlib ... +func_extract_archives () +{ + $debug_cmd + + my_gentop=$1; shift + my_oldlibs=${1+"$@"} + my_oldobjs= + my_xlib= + my_xabs= + my_xdir= + + for my_xlib in $my_oldlibs; do + # Extract the objects. + case $my_xlib in + [\\/]* | [A-Za-z]:[\\/]*) my_xabs=$my_xlib ;; + *) my_xabs=`pwd`"/$my_xlib" ;; + esac + func_basename "$my_xlib" + my_xlib=$func_basename_result + my_xlib_u=$my_xlib + while :; do + case " $extracted_archives " in + *" $my_xlib_u "*) + func_arith $extracted_serial + 1 + extracted_serial=$func_arith_result + my_xlib_u=lt$extracted_serial-$my_xlib ;; + *) break ;; + esac + done + extracted_archives="$extracted_archives $my_xlib_u" + my_xdir=$my_gentop/$my_xlib_u + + func_mkdir_p "$my_xdir" + + case $host in + *-darwin*) + func_verbose "Extracting $my_xabs" + # Do not bother doing anything if just a dry run + $opt_dry_run || { + darwin_orig_dir=`pwd` + cd $my_xdir || exit $? 
+ darwin_archive=$my_xabs + darwin_curdir=`pwd` + func_basename "$darwin_archive" + darwin_base_archive=$func_basename_result + darwin_arches=`$LIPO -info "$darwin_archive" 2>/dev/null | $GREP Architectures 2>/dev/null || true` + if test -n "$darwin_arches"; then + darwin_arches=`$ECHO "$darwin_arches" | $SED -e 's/.*are://'` + darwin_arch= + func_verbose "$darwin_base_archive has multiple architectures $darwin_arches" + for darwin_arch in $darwin_arches; do + func_mkdir_p "unfat-$$/$darwin_base_archive-$darwin_arch" + $LIPO -thin $darwin_arch -output "unfat-$$/$darwin_base_archive-$darwin_arch/$darwin_base_archive" "$darwin_archive" + cd "unfat-$$/$darwin_base_archive-$darwin_arch" + func_extract_an_archive "`pwd`" "$darwin_base_archive" + cd "$darwin_curdir" + $RM "unfat-$$/$darwin_base_archive-$darwin_arch/$darwin_base_archive" + done # $darwin_arches + ## Okay now we've a bunch of thin objects, gotta fatten them up :) + darwin_filelist=`find unfat-$$ -type f -name \*.o -print -o -name \*.lo -print | $SED -e "$sed_basename" | sort -u` + darwin_file= + darwin_files= + for darwin_file in $darwin_filelist; do + darwin_files=`find unfat-$$ -name $darwin_file -print | sort | $NL2SP` + $LIPO -create -output "$darwin_file" $darwin_files + done # $darwin_filelist + $RM -rf unfat-$$ + cd "$darwin_orig_dir" + else + cd $darwin_orig_dir + func_extract_an_archive "$my_xdir" "$my_xabs" + fi # $darwin_arches + } # !$opt_dry_run + ;; + *) + func_extract_an_archive "$my_xdir" "$my_xabs" + ;; + esac + my_oldobjs="$my_oldobjs "`find $my_xdir -name \*.$objext -print -o -name \*.lo -print | sort | $NL2SP` + done + + func_extract_archives_result=$my_oldobjs +} + + +# func_emit_wrapper [arg=no] +# +# Emit a libtool wrapper script on stdout. +# Don't directly open a file because we may want to +# incorporate the script contents within a cygwin/mingw +# wrapper executable. Must ONLY be called from within +# func_mode_link because it depends on a number of variables +# set therein. +# +# ARG is the value that the WRAPPER_SCRIPT_BELONGS_IN_OBJDIR +# variable will take. If 'yes', then the emitted script +# will assume that the directory where it is stored is +# the $objdir directory. This is a cygwin/mingw-specific +# behavior. +func_emit_wrapper () +{ + func_emit_wrapper_arg1=${1-no} + + $ECHO "\ +#! $SHELL + +# $output - temporary wrapper script for $objdir/$outputname +# Generated by $PROGRAM (GNU $PACKAGE) $VERSION +# +# The $output program cannot be directly executed until all the libtool +# libraries that it depends on are installed. +# +# This wrapper script should never be moved out of the build directory. +# If it is, it will not operate correctly. + +# Sed substitution that helps us do robust quoting. It backslashifies +# metacharacters that are still active within double-quoted strings. +sed_quote_subst='$sed_quote_subst' + +# Be Bourne compatible +if test -n \"\${ZSH_VERSION+set}\" && (emulate sh) >/dev/null 2>&1; then + emulate sh + NULLCMD=: + # Zsh 3.x and 4.x performs word splitting on \${1+\"\$@\"}, which + # is contrary to our usage. Disable this feature. + alias -g '\${1+\"\$@\"}'='\"\$@\"' + setopt NO_GLOB_SUBST +else + case \`(set -o) 2>/dev/null\` in *posix*) set -o posix;; esac +fi +BIN_SH=xpg4; export BIN_SH # for Tru64 +DUALCASE=1; export DUALCASE # for MKS sh + +# The HP-UX ksh and POSIX shell print the target directory to stdout +# if CDPATH is set. 
+(unset CDPATH) >/dev/null 2>&1 && unset CDPATH + +relink_command=\"$relink_command\" + +# This environment variable determines our operation mode. +if test \"\$libtool_install_magic\" = \"$magic\"; then + # install mode needs the following variables: + generated_by_libtool_version='$macro_version' + notinst_deplibs='$notinst_deplibs' +else + # When we are sourced in execute mode, \$file and \$ECHO are already set. + if test \"\$libtool_execute_magic\" != \"$magic\"; then + file=\"\$0\"" + + qECHO=`$ECHO "$ECHO" | $SED "$sed_quote_subst"` + $ECHO "\ + +# A function that is used when there is no print builtin or printf. +func_fallback_echo () +{ + eval 'cat <<_LTECHO_EOF +\$1 +_LTECHO_EOF' +} + ECHO=\"$qECHO\" + fi + +# Very basic option parsing. These options are (a) specific to +# the libtool wrapper, (b) are identical between the wrapper +# /script/ and the wrapper /executable/ that is used only on +# windows platforms, and (c) all begin with the string "--lt-" +# (application programs are unlikely to have options that match +# this pattern). +# +# There are only two supported options: --lt-debug and +# --lt-dump-script. There is, deliberately, no --lt-help. +# +# The first argument to this parsing function should be the +# script's $0 value, followed by "$@". +lt_option_debug= +func_parse_lt_options () +{ + lt_script_arg0=\$0 + shift + for lt_opt + do + case \"\$lt_opt\" in + --lt-debug) lt_option_debug=1 ;; + --lt-dump-script) + lt_dump_D=\`\$ECHO \"X\$lt_script_arg0\" | $SED -e 's/^X//' -e 's%/[^/]*$%%'\` + test \"X\$lt_dump_D\" = \"X\$lt_script_arg0\" && lt_dump_D=. + lt_dump_F=\`\$ECHO \"X\$lt_script_arg0\" | $SED -e 's/^X//' -e 's%^.*/%%'\` + cat \"\$lt_dump_D/\$lt_dump_F\" + exit 0 + ;; + --lt-*) + \$ECHO \"Unrecognized --lt- option: '\$lt_opt'\" 1>&2 + exit 1 + ;; + esac + done + + # Print the debug banner immediately: + if test -n \"\$lt_option_debug\"; then + echo \"$outputname:$output:\$LINENO: libtool wrapper (GNU $PACKAGE) $VERSION\" 1>&2 + fi +} + +# Used when --lt-debug. Prints its arguments to stdout +# (redirection is the responsibility of the caller) +func_lt_dump_args () +{ + lt_dump_args_N=1; + for lt_arg + do + \$ECHO \"$outputname:$output:\$LINENO: newargv[\$lt_dump_args_N]: \$lt_arg\" + lt_dump_args_N=\`expr \$lt_dump_args_N + 1\` + done +} + +# Core function for launching the target application +func_exec_program_core () +{ +" + case $host in + # Backslashes separate directories on plain windows + *-*-mingw | *-*-os2* | *-cegcc*) + $ECHO "\ + if test -n \"\$lt_option_debug\"; then + \$ECHO \"$outputname:$output:\$LINENO: newargv[0]: \$progdir\\\\\$program\" 1>&2 + func_lt_dump_args \${1+\"\$@\"} 1>&2 + fi + exec \"\$progdir\\\\\$program\" \${1+\"\$@\"} +" + ;; + + *) + $ECHO "\ + if test -n \"\$lt_option_debug\"; then + \$ECHO \"$outputname:$output:\$LINENO: newargv[0]: \$progdir/\$program\" 1>&2 + func_lt_dump_args \${1+\"\$@\"} 1>&2 + fi + exec \"\$progdir/\$program\" \${1+\"\$@\"} +" + ;; + esac + $ECHO "\ + \$ECHO \"\$0: cannot exec \$program \$*\" 1>&2 + exit 1 +} + +# A function to encapsulate launching the target application +# Strips options in the --lt-* namespace from \$@ and +# launches target application with the remaining arguments. 
+func_exec_program () +{ + case \" \$* \" in + *\\ --lt-*) + for lt_wr_arg + do + case \$lt_wr_arg in + --lt-*) ;; + *) set x \"\$@\" \"\$lt_wr_arg\"; shift;; + esac + shift + done ;; + esac + func_exec_program_core \${1+\"\$@\"} +} + + # Parse options + func_parse_lt_options \"\$0\" \${1+\"\$@\"} + + # Find the directory that this script lives in. + thisdir=\`\$ECHO \"\$file\" | $SED 's%/[^/]*$%%'\` + test \"x\$thisdir\" = \"x\$file\" && thisdir=. + + # Follow symbolic links until we get to the real thisdir. + file=\`ls -ld \"\$file\" | $SED -n 's/.*-> //p'\` + while test -n \"\$file\"; do + destdir=\`\$ECHO \"\$file\" | $SED 's%/[^/]*\$%%'\` + + # If there was a directory component, then change thisdir. + if test \"x\$destdir\" != \"x\$file\"; then + case \"\$destdir\" in + [\\\\/]* | [A-Za-z]:[\\\\/]*) thisdir=\"\$destdir\" ;; + *) thisdir=\"\$thisdir/\$destdir\" ;; + esac + fi + + file=\`\$ECHO \"\$file\" | $SED 's%^.*/%%'\` + file=\`ls -ld \"\$thisdir/\$file\" | $SED -n 's/.*-> //p'\` + done + + # Usually 'no', except on cygwin/mingw when embedded into + # the cwrapper. + WRAPPER_SCRIPT_BELONGS_IN_OBJDIR=$func_emit_wrapper_arg1 + if test \"\$WRAPPER_SCRIPT_BELONGS_IN_OBJDIR\" = \"yes\"; then + # special case for '.' + if test \"\$thisdir\" = \".\"; then + thisdir=\`pwd\` + fi + # remove .libs from thisdir + case \"\$thisdir\" in + *[\\\\/]$objdir ) thisdir=\`\$ECHO \"\$thisdir\" | $SED 's%[\\\\/][^\\\\/]*$%%'\` ;; + $objdir ) thisdir=. ;; + esac + fi + + # Try to get the absolute directory name. + absdir=\`cd \"\$thisdir\" && pwd\` + test -n \"\$absdir\" && thisdir=\"\$absdir\" +" + + if test yes = "$fast_install"; then + $ECHO "\ + program=lt-'$outputname'$exeext + progdir=\"\$thisdir/$objdir\" + + if test ! -f \"\$progdir/\$program\" || + { file=\`ls -1dt \"\$progdir/\$program\" \"\$progdir/../\$program\" 2>/dev/null | $SED 1q\`; \\ + test \"X\$file\" != \"X\$progdir/\$program\"; }; then + + file=\"\$\$-\$program\" + + if test ! -d \"\$progdir\"; then + $MKDIR \"\$progdir\" + else + $RM \"\$progdir/\$file\" + fi" + + $ECHO "\ + + # relink executable if necessary + if test -n \"\$relink_command\"; then + if relink_command_output=\`eval \$relink_command 2>&1\`; then : + else + \$ECHO \"\$relink_command_output\" >&2 + $RM \"\$progdir/\$file\" + exit 1 + fi + fi + + $MV \"\$progdir/\$file\" \"\$progdir/\$program\" 2>/dev/null || + { $RM \"\$progdir/\$program\"; + $MV \"\$progdir/\$file\" \"\$progdir/\$program\"; } + $RM \"\$progdir/\$file\" + fi" + else + $ECHO "\ + program='$outputname' + progdir=\"\$thisdir/$objdir\" +" + fi + + $ECHO "\ + + if test -f \"\$progdir/\$program\"; then" + + # fixup the dll searchpath if we need to. + # + # Fix the DLL searchpath if we need to. Do this before prepending + # to shlibpath, because on Windows, both are PATH and uninstalled + # libraries must come first. + if test -n "$dllsearchpath"; then + $ECHO "\ + # Add the dll search path components to the executable PATH + PATH=$dllsearchpath:\$PATH +" + fi + + # Export our shlibpath_var if we have one. 
+ if test yes = "$shlibpath_overrides_runpath" && test -n "$shlibpath_var" && test -n "$temp_rpath"; then + $ECHO "\ + # Add our own library path to $shlibpath_var + $shlibpath_var=\"$temp_rpath\$$shlibpath_var\" + + # Some systems cannot cope with colon-terminated $shlibpath_var + # The second colon is a workaround for a bug in BeOS R4 sed + $shlibpath_var=\`\$ECHO \"\$$shlibpath_var\" | $SED 's/::*\$//'\` + + export $shlibpath_var +" + fi + + $ECHO "\ + if test \"\$libtool_execute_magic\" != \"$magic\"; then + # Run the actual program with our arguments. + func_exec_program \${1+\"\$@\"} + fi + else + # The program doesn't exist. + \$ECHO \"\$0: error: '\$progdir/\$program' does not exist\" 1>&2 + \$ECHO \"This script is just a wrapper for \$program.\" 1>&2 + \$ECHO \"See the $PACKAGE documentation for more information.\" 1>&2 + exit 1 + fi +fi\ +" +} + + +# func_emit_cwrapperexe_src +# emit the source code for a wrapper executable on stdout +# Must ONLY be called from within func_mode_link because +# it depends on a number of variable set therein. +func_emit_cwrapperexe_src () +{ + cat < +#include +#ifdef _MSC_VER +# include +# include +# include +#else +# include +# include +# ifdef __CYGWIN__ +# include +# endif +#endif +#include +#include +#include +#include +#include +#include +#include +#include + +#define STREQ(s1, s2) (strcmp ((s1), (s2)) == 0) + +/* declarations of non-ANSI functions */ +#if defined __MINGW32__ +# ifdef __STRICT_ANSI__ +int _putenv (const char *); +# endif +#elif defined __CYGWIN__ +# ifdef __STRICT_ANSI__ +char *realpath (const char *, char *); +int putenv (char *); +int setenv (const char *, const char *, int); +# endif +/* #elif defined other_platform || defined ... */ +#endif + +/* portability defines, excluding path handling macros */ +#if defined _MSC_VER +# define setmode _setmode +# define stat _stat +# define chmod _chmod +# define getcwd _getcwd +# define putenv _putenv +# define S_IXUSR _S_IEXEC +#elif defined __MINGW32__ +# define setmode _setmode +# define stat _stat +# define chmod _chmod +# define getcwd _getcwd +# define putenv _putenv +#elif defined __CYGWIN__ +# define HAVE_SETENV +# define FOPEN_WB "wb" +/* #elif defined other platforms ... 
*/ +#endif + +#if defined PATH_MAX +# define LT_PATHMAX PATH_MAX +#elif defined MAXPATHLEN +# define LT_PATHMAX MAXPATHLEN +#else +# define LT_PATHMAX 1024 +#endif + +#ifndef S_IXOTH +# define S_IXOTH 0 +#endif +#ifndef S_IXGRP +# define S_IXGRP 0 +#endif + +/* path handling portability macros */ +#ifndef DIR_SEPARATOR +# define DIR_SEPARATOR '/' +# define PATH_SEPARATOR ':' +#endif + +#if defined _WIN32 || defined __MSDOS__ || defined __DJGPP__ || \ + defined __OS2__ +# define HAVE_DOS_BASED_FILE_SYSTEM +# define FOPEN_WB "wb" +# ifndef DIR_SEPARATOR_2 +# define DIR_SEPARATOR_2 '\\' +# endif +# ifndef PATH_SEPARATOR_2 +# define PATH_SEPARATOR_2 ';' +# endif +#endif + +#ifndef DIR_SEPARATOR_2 +# define IS_DIR_SEPARATOR(ch) ((ch) == DIR_SEPARATOR) +#else /* DIR_SEPARATOR_2 */ +# define IS_DIR_SEPARATOR(ch) \ + (((ch) == DIR_SEPARATOR) || ((ch) == DIR_SEPARATOR_2)) +#endif /* DIR_SEPARATOR_2 */ + +#ifndef PATH_SEPARATOR_2 +# define IS_PATH_SEPARATOR(ch) ((ch) == PATH_SEPARATOR) +#else /* PATH_SEPARATOR_2 */ +# define IS_PATH_SEPARATOR(ch) ((ch) == PATH_SEPARATOR_2) +#endif /* PATH_SEPARATOR_2 */ + +#ifndef FOPEN_WB +# define FOPEN_WB "w" +#endif +#ifndef _O_BINARY +# define _O_BINARY 0 +#endif + +#define XMALLOC(type, num) ((type *) xmalloc ((num) * sizeof(type))) +#define XFREE(stale) do { \ + if (stale) { free (stale); stale = 0; } \ +} while (0) + +#if defined LT_DEBUGWRAPPER +static int lt_debug = 1; +#else +static int lt_debug = 0; +#endif + +const char *program_name = "libtool-wrapper"; /* in case xstrdup fails */ + +void *xmalloc (size_t num); +char *xstrdup (const char *string); +const char *base_name (const char *name); +char *find_executable (const char *wrapper); +char *chase_symlinks (const char *pathspec); +int make_executable (const char *path); +int check_executable (const char *path); +char *strendzap (char *str, const char *pat); +void lt_debugprintf (const char *file, int line, const char *fmt, ...); +void lt_fatal (const char *file, int line, const char *message, ...); +static const char *nonnull (const char *s); +static const char *nonempty (const char *s); +void lt_setenv (const char *name, const char *value); +char *lt_extend_str (const char *orig_value, const char *add, int to_end); +void lt_update_exe_path (const char *name, const char *value); +void lt_update_lib_path (const char *name, const char *value); +char **prepare_spawn (char **argv); +void lt_dump_script (FILE *f); +EOF + + cat <= 0) + && (st.st_mode & (S_IXUSR | S_IXGRP | S_IXOTH))) + return 1; + else + return 0; +} + +int +make_executable (const char *path) +{ + int rval = 0; + struct stat st; + + lt_debugprintf (__FILE__, __LINE__, "(make_executable): %s\n", + nonempty (path)); + if ((!path) || (!*path)) + return 0; + + if (stat (path, &st) >= 0) + { + rval = chmod (path, st.st_mode | S_IXOTH | S_IXGRP | S_IXUSR); + } + return rval; +} + +/* Searches for the full path of the wrapper. Returns + newly allocated full path name if found, NULL otherwise + Does not chase symlinks, even on platforms that support them. +*/ +char * +find_executable (const char *wrapper) +{ + int has_slash = 0; + const char *p; + const char *p_next; + /* static buffer for getcwd */ + char tmp[LT_PATHMAX + 1]; + size_t tmp_len; + char *concat_name; + + lt_debugprintf (__FILE__, __LINE__, "(find_executable): %s\n", + nonempty (wrapper)); + + if ((wrapper == NULL) || (*wrapper == '\0')) + return NULL; + + /* Absolute path? 
*/ +#if defined HAVE_DOS_BASED_FILE_SYSTEM + if (isalpha ((unsigned char) wrapper[0]) && wrapper[1] == ':') + { + concat_name = xstrdup (wrapper); + if (check_executable (concat_name)) + return concat_name; + XFREE (concat_name); + } + else + { +#endif + if (IS_DIR_SEPARATOR (wrapper[0])) + { + concat_name = xstrdup (wrapper); + if (check_executable (concat_name)) + return concat_name; + XFREE (concat_name); + } +#if defined HAVE_DOS_BASED_FILE_SYSTEM + } +#endif + + for (p = wrapper; *p; p++) + if (*p == '/') + { + has_slash = 1; + break; + } + if (!has_slash) + { + /* no slashes; search PATH */ + const char *path = getenv ("PATH"); + if (path != NULL) + { + for (p = path; *p; p = p_next) + { + const char *q; + size_t p_len; + for (q = p; *q; q++) + if (IS_PATH_SEPARATOR (*q)) + break; + p_len = (size_t) (q - p); + p_next = (*q == '\0' ? q : q + 1); + if (p_len == 0) + { + /* empty path: current directory */ + if (getcwd (tmp, LT_PATHMAX) == NULL) + lt_fatal (__FILE__, __LINE__, "getcwd failed: %s", + nonnull (strerror (errno))); + tmp_len = strlen (tmp); + concat_name = + XMALLOC (char, tmp_len + 1 + strlen (wrapper) + 1); + memcpy (concat_name, tmp, tmp_len); + concat_name[tmp_len] = '/'; + strcpy (concat_name + tmp_len + 1, wrapper); + } + else + { + concat_name = + XMALLOC (char, p_len + 1 + strlen (wrapper) + 1); + memcpy (concat_name, p, p_len); + concat_name[p_len] = '/'; + strcpy (concat_name + p_len + 1, wrapper); + } + if (check_executable (concat_name)) + return concat_name; + XFREE (concat_name); + } + } + /* not found in PATH; assume curdir */ + } + /* Relative path | not found in path: prepend cwd */ + if (getcwd (tmp, LT_PATHMAX) == NULL) + lt_fatal (__FILE__, __LINE__, "getcwd failed: %s", + nonnull (strerror (errno))); + tmp_len = strlen (tmp); + concat_name = XMALLOC (char, tmp_len + 1 + strlen (wrapper) + 1); + memcpy (concat_name, tmp, tmp_len); + concat_name[tmp_len] = '/'; + strcpy (concat_name + tmp_len + 1, wrapper); + + if (check_executable (concat_name)) + return concat_name; + XFREE (concat_name); + return NULL; +} + +char * +chase_symlinks (const char *pathspec) +{ +#ifndef S_ISLNK + return xstrdup (pathspec); +#else + char buf[LT_PATHMAX]; + struct stat s; + char *tmp_pathspec = xstrdup (pathspec); + char *p; + int has_symlinks = 0; + while (strlen (tmp_pathspec) && !has_symlinks) + { + lt_debugprintf (__FILE__, __LINE__, + "checking path component for symlinks: %s\n", + tmp_pathspec); + if (lstat (tmp_pathspec, &s) == 0) + { + if (S_ISLNK (s.st_mode) != 0) + { + has_symlinks = 1; + break; + } + + /* search backwards for last DIR_SEPARATOR */ + p = tmp_pathspec + strlen (tmp_pathspec) - 1; + while ((p > tmp_pathspec) && (!IS_DIR_SEPARATOR (*p))) + p--; + if ((p == tmp_pathspec) && (!IS_DIR_SEPARATOR (*p))) + { + /* no more DIR_SEPARATORS left */ + break; + } + *p = '\0'; + } + else + { + lt_fatal (__FILE__, __LINE__, + "error accessing file \"%s\": %s", + tmp_pathspec, nonnull (strerror (errno))); + } + } + XFREE (tmp_pathspec); + + if (!has_symlinks) + { + return xstrdup (pathspec); + } + + tmp_pathspec = realpath (pathspec, buf); + if (tmp_pathspec == 0) + { + lt_fatal (__FILE__, __LINE__, + "could not follow symlinks for %s", pathspec); + } + return xstrdup (tmp_pathspec); +#endif +} + +char * +strendzap (char *str, const char *pat) +{ + size_t len, patlen; + + assert (str != NULL); + assert (pat != NULL); + + len = strlen (str); + patlen = strlen (pat); + + if (patlen <= len) + { + str += len - patlen; + if (STREQ (str, pat)) + *str = '\0'; + } + return 
str; +} + +void +lt_debugprintf (const char *file, int line, const char *fmt, ...) +{ + va_list args; + if (lt_debug) + { + (void) fprintf (stderr, "%s:%s:%d: ", program_name, file, line); + va_start (args, fmt); + (void) vfprintf (stderr, fmt, args); + va_end (args); + } +} + +static void +lt_error_core (int exit_status, const char *file, + int line, const char *mode, + const char *message, va_list ap) +{ + fprintf (stderr, "%s:%s:%d: %s: ", program_name, file, line, mode); + vfprintf (stderr, message, ap); + fprintf (stderr, ".\n"); + + if (exit_status >= 0) + exit (exit_status); +} + +void +lt_fatal (const char *file, int line, const char *message, ...) +{ + va_list ap; + va_start (ap, message); + lt_error_core (EXIT_FAILURE, file, line, "FATAL", message, ap); + va_end (ap); +} + +static const char * +nonnull (const char *s) +{ + return s ? s : "(null)"; +} + +static const char * +nonempty (const char *s) +{ + return (s && !*s) ? "(empty)" : nonnull (s); +} + +void +lt_setenv (const char *name, const char *value) +{ + lt_debugprintf (__FILE__, __LINE__, + "(lt_setenv) setting '%s' to '%s'\n", + nonnull (name), nonnull (value)); + { +#ifdef HAVE_SETENV + /* always make a copy, for consistency with !HAVE_SETENV */ + char *str = xstrdup (value); + setenv (name, str, 1); +#else + size_t len = strlen (name) + 1 + strlen (value) + 1; + char *str = XMALLOC (char, len); + sprintf (str, "%s=%s", name, value); + if (putenv (str) != EXIT_SUCCESS) + { + XFREE (str); + } +#endif + } +} + +char * +lt_extend_str (const char *orig_value, const char *add, int to_end) +{ + char *new_value; + if (orig_value && *orig_value) + { + size_t orig_value_len = strlen (orig_value); + size_t add_len = strlen (add); + new_value = XMALLOC (char, add_len + orig_value_len + 1); + if (to_end) + { + strcpy (new_value, orig_value); + strcpy (new_value + orig_value_len, add); + } + else + { + strcpy (new_value, add); + strcpy (new_value + add_len, orig_value); + } + } + else + { + new_value = xstrdup (add); + } + return new_value; +} + +void +lt_update_exe_path (const char *name, const char *value) +{ + lt_debugprintf (__FILE__, __LINE__, + "(lt_update_exe_path) modifying '%s' by prepending '%s'\n", + nonnull (name), nonnull (value)); + + if (name && *name && value && *value) + { + char *new_value = lt_extend_str (getenv (name), value, 0); + /* some systems can't cope with a ':'-terminated path #' */ + size_t len = strlen (new_value); + while ((len > 0) && IS_PATH_SEPARATOR (new_value[len-1])) + { + new_value[--len] = '\0'; + } + lt_setenv (name, new_value); + XFREE (new_value); + } +} + +void +lt_update_lib_path (const char *name, const char *value) +{ + lt_debugprintf (__FILE__, __LINE__, + "(lt_update_lib_path) modifying '%s' by prepending '%s'\n", + nonnull (name), nonnull (value)); + + if (name && *name && value && *value) + { + char *new_value = lt_extend_str (getenv (name), value, 0); + lt_setenv (name, new_value); + XFREE (new_value); + } +} + +EOF + case $host_os in + mingw*) + cat <<"EOF" + +/* Prepares an argument vector before calling spawn(). + Note that spawn() does not by itself call the command interpreter + (getenv ("COMSPEC") != NULL ? getenv ("COMSPEC") : + ({ OSVERSIONINFO v; v.dwOSVersionInfoSize = sizeof(OSVERSIONINFO); + GetVersionEx(&v); + v.dwPlatformId == VER_PLATFORM_WIN32_NT; + }) ? "cmd.exe" : "command.com"). + Instead it simply concatenates the arguments, separated by ' ', and calls + CreateProcess(). 
We must quote the arguments since Win32 CreateProcess() + interprets characters like ' ', '\t', '\\', '"' (but not '<' and '>') in a + special way: + - Space and tab are interpreted as delimiters. They are not treated as + delimiters if they are surrounded by double quotes: "...". + - Unescaped double quotes are removed from the input. Their only effect is + that within double quotes, space and tab are treated like normal + characters. + - Backslashes not followed by double quotes are not special. + - But 2*n+1 backslashes followed by a double quote become + n backslashes followed by a double quote (n >= 0): + \" -> " + \\\" -> \" + \\\\\" -> \\" + */ +#define SHELL_SPECIAL_CHARS "\"\\ \001\002\003\004\005\006\007\010\011\012\013\014\015\016\017\020\021\022\023\024\025\026\027\030\031\032\033\034\035\036\037" +#define SHELL_SPACE_CHARS " \001\002\003\004\005\006\007\010\011\012\013\014\015\016\017\020\021\022\023\024\025\026\027\030\031\032\033\034\035\036\037" +char ** +prepare_spawn (char **argv) +{ + size_t argc; + char **new_argv; + size_t i; + + /* Count number of arguments. */ + for (argc = 0; argv[argc] != NULL; argc++) + ; + + /* Allocate new argument vector. */ + new_argv = XMALLOC (char *, argc + 1); + + /* Put quoted arguments into the new argument vector. */ + for (i = 0; i < argc; i++) + { + const char *string = argv[i]; + + if (string[0] == '\0') + new_argv[i] = xstrdup ("\"\""); + else if (strpbrk (string, SHELL_SPECIAL_CHARS) != NULL) + { + int quote_around = (strpbrk (string, SHELL_SPACE_CHARS) != NULL); + size_t length; + unsigned int backslashes; + const char *s; + char *quoted_string; + char *p; + + length = 0; + backslashes = 0; + if (quote_around) + length++; + for (s = string; *s != '\0'; s++) + { + char c = *s; + if (c == '"') + length += backslashes + 1; + length++; + if (c == '\\') + backslashes++; + else + backslashes = 0; + } + if (quote_around) + length += backslashes + 1; + + quoted_string = XMALLOC (char, length + 1); + + p = quoted_string; + backslashes = 0; + if (quote_around) + *p++ = '"'; + for (s = string; *s != '\0'; s++) + { + char c = *s; + if (c == '"') + { + unsigned int j; + for (j = backslashes + 1; j > 0; j--) + *p++ = '\\'; + } + *p++ = c; + if (c == '\\') + backslashes++; + else + backslashes = 0; + } + if (quote_around) + { + unsigned int j; + for (j = backslashes; j > 0; j--) + *p++ = '\\'; + *p++ = '"'; + } + *p = '\0'; + + new_argv[i] = quoted_string; + } + else + new_argv[i] = (char *) string; + } + new_argv[argc] = NULL; + + return new_argv; +} +EOF + ;; + esac + + cat <<"EOF" +void lt_dump_script (FILE* f) +{ +EOF + func_emit_wrapper yes | + $SED -n -e ' +s/^\(.\{79\}\)\(..*\)/\1\ +\2/ +h +s/\([\\"]\)/\\\1/g +s/$/\\n/ +s/\([^\n]*\).*/ fputs ("\1", f);/p +g +D' + cat <<"EOF" +} +EOF +} +# end: func_emit_cwrapperexe_src + +# func_win32_import_lib_p ARG +# True if ARG is an import lib, as indicated by $file_magic_cmd +func_win32_import_lib_p () +{ + $debug_cmd + + case `eval $file_magic_cmd \"\$1\" 2>/dev/null | $SED -e 10q` in + *import*) : ;; + *) false ;; + esac +} + +# func_suncc_cstd_abi +# !!ONLY CALL THIS FOR SUN CC AFTER $compile_command IS FULLY EXPANDED!! +# Several compiler flags select an ABI that is incompatible with the +# Cstd library. Avoid specifying it if any are in CXXFLAGS. 
+func_suncc_cstd_abi () +{ + $debug_cmd + + case " $compile_command " in + *" -compat=g "*|*\ -std=c++[0-9][0-9]\ *|*" -library=stdcxx4 "*|*" -library=stlport4 "*) + suncc_use_cstd_abi=no + ;; + *) + suncc_use_cstd_abi=yes + ;; + esac +} + +# func_mode_link arg... +func_mode_link () +{ + $debug_cmd + + case $host in + *-*-cygwin* | *-*-mingw* | *-*-pw32* | *-*-os2* | *-cegcc*) + # It is impossible to link a dll without this setting, and + # we shouldn't force the makefile maintainer to figure out + # what system we are compiling for in order to pass an extra + # flag for every libtool invocation. + # allow_undefined=no + + # FIXME: Unfortunately, there are problems with the above when trying + # to make a dll that has undefined symbols, in which case not + # even a static library is built. For now, we need to specify + # -no-undefined on the libtool link line when we can be certain + # that all symbols are satisfied, otherwise we get a static library. + allow_undefined=yes + ;; + *) + allow_undefined=yes + ;; + esac + libtool_args=$nonopt + base_compile="$nonopt $@" + compile_command=$nonopt + finalize_command=$nonopt + + compile_rpath= + finalize_rpath= + compile_shlibpath= + finalize_shlibpath= + convenience= + old_convenience= + deplibs= + old_deplibs= + compiler_flags= + linker_flags= + dllsearchpath= + lib_search_path=`pwd` + inst_prefix_dir= + new_inherited_linker_flags= + + avoid_version=no + bindir= + dlfiles= + dlprefiles= + dlself=no + export_dynamic=no + export_symbols= + export_symbols_regex= + generated= + libobjs= + ltlibs= + module=no + no_install=no + objs= + os2dllname= + non_pic_objects= + precious_files_regex= + prefer_static_libs=no + preload=false + prev= + prevarg= + release= + rpath= + xrpath= + perm_rpath= + temp_rpath= + thread_safe=no + vinfo= + vinfo_number=no + weak_libs= + single_module=$wl-single_module + func_infer_tag $base_compile + + # We need to know -static, to get the right output filenames. + for arg + do + case $arg in + -shared) + test yes != "$build_libtool_libs" \ + && func_fatal_configuration "cannot build a shared library" + build_old_libs=no + break + ;; + -all-static | -static | -static-libtool-libs) + case $arg in + -all-static) + if test yes = "$build_libtool_libs" && test -z "$link_static_flag"; then + func_warning "complete static linking is impossible in this configuration" + fi + if test -n "$link_static_flag"; then + dlopen_self=$dlopen_self_static + fi + prefer_static_libs=yes + ;; + -static) + if test -z "$pic_flag" && test -n "$link_static_flag"; then + dlopen_self=$dlopen_self_static + fi + prefer_static_libs=built + ;; + -static-libtool-libs) + if test -z "$pic_flag" && test -n "$link_static_flag"; then + dlopen_self=$dlopen_self_static + fi + prefer_static_libs=yes + ;; + esac + build_libtool_libs=no + build_old_libs=yes + break + ;; + esac + done + + # See if our shared archives depend on static archives. + test -n "$old_archive_from_new_cmds" && build_old_libs=yes + + # Go through the arguments, transforming them on the way. + while test "$#" -gt 0; do + arg=$1 + shift + func_quote_for_eval "$arg" + qarg=$func_quote_for_eval_unquoted_result + func_append libtool_args " $func_quote_for_eval_result" + + # If the previous option needs an argument, assign it. 
+ if test -n "$prev"; then + case $prev in + output) + func_append compile_command " @OUTPUT@" + func_append finalize_command " @OUTPUT@" + ;; + esac + + case $prev in + bindir) + bindir=$arg + prev= + continue + ;; + dlfiles|dlprefiles) + $preload || { + # Add the symbol object into the linking commands. + func_append compile_command " @SYMFILE@" + func_append finalize_command " @SYMFILE@" + preload=: + } + case $arg in + *.la | *.lo) ;; # We handle these cases below. + force) + if test no = "$dlself"; then + dlself=needless + export_dynamic=yes + fi + prev= + continue + ;; + self) + if test dlprefiles = "$prev"; then + dlself=yes + elif test dlfiles = "$prev" && test yes != "$dlopen_self"; then + dlself=yes + else + dlself=needless + export_dynamic=yes + fi + prev= + continue + ;; + *) + if test dlfiles = "$prev"; then + func_append dlfiles " $arg" + else + func_append dlprefiles " $arg" + fi + prev= + continue + ;; + esac + ;; + expsyms) + export_symbols=$arg + test -f "$arg" \ + || func_fatal_error "symbol file '$arg' does not exist" + prev= + continue + ;; + expsyms_regex) + export_symbols_regex=$arg + prev= + continue + ;; + framework) + case $host in + *-*-darwin*) + case "$deplibs " in + *" $qarg.ltframework "*) ;; + *) func_append deplibs " $qarg.ltframework" # this is fixed later + ;; + esac + ;; + esac + prev= + continue + ;; + inst_prefix) + inst_prefix_dir=$arg + prev= + continue + ;; + mllvm) + # Clang does not use LLVM to link, so we can simply discard any + # '-mllvm $arg' options when doing the link step. + prev= + continue + ;; + objectlist) + if test -f "$arg"; then + save_arg=$arg + moreargs= + for fil in `cat "$save_arg"` + do +# func_append moreargs " $fil" + arg=$fil + # A libtool-controlled object. + + # Check to see that this really is a libtool object. + if func_lalib_unsafe_p "$arg"; then + pic_object= + non_pic_object= + + # Read the .lo file + func_source "$arg" + + if test -z "$pic_object" || + test -z "$non_pic_object" || + test none = "$pic_object" && + test none = "$non_pic_object"; then + func_fatal_error "cannot find name of object for '$arg'" + fi + + # Extract subdirectory from the argument. + func_dirname "$arg" "/" "" + xdir=$func_dirname_result + + if test none != "$pic_object"; then + # Prepend the subdirectory the object is found in. + pic_object=$xdir$pic_object + + if test dlfiles = "$prev"; then + if test yes = "$build_libtool_libs" && test yes = "$dlopen_support"; then + func_append dlfiles " $pic_object" + prev= + continue + else + # If libtool objects are unsupported, then we need to preload. + prev=dlprefiles + fi + fi + + # CHECK ME: I think I busted this. -Ossama + if test dlprefiles = "$prev"; then + # Preload the old-style object. + func_append dlprefiles " $pic_object" + prev= + fi + + # A PIC object. + func_append libobjs " $pic_object" + arg=$pic_object + fi + + # Non-PIC object. + if test none != "$non_pic_object"; then + # Prepend the subdirectory the object is found in. + non_pic_object=$xdir$non_pic_object + + # A standard non-PIC object + func_append non_pic_objects " $non_pic_object" + if test -z "$pic_object" || test none = "$pic_object"; then + arg=$non_pic_object + fi + else + # If the PIC object exists, use it instead. + # $xdir was prepended to $pic_object above. + non_pic_object=$pic_object + func_append non_pic_objects " $non_pic_object" + fi + else + # Only an error if not doing a dry-run. + if $opt_dry_run; then + # Extract subdirectory from the argument. 
+ func_dirname "$arg" "/" "" + xdir=$func_dirname_result + + func_lo2o "$arg" + pic_object=$xdir$objdir/$func_lo2o_result + non_pic_object=$xdir$func_lo2o_result + func_append libobjs " $pic_object" + func_append non_pic_objects " $non_pic_object" + else + func_fatal_error "'$arg' is not a valid libtool object" + fi + fi + done + else + func_fatal_error "link input file '$arg' does not exist" + fi + arg=$save_arg + prev= + continue + ;; + os2dllname) + os2dllname=$arg + prev= + continue + ;; + precious_regex) + precious_files_regex=$arg + prev= + continue + ;; + release) + release=-$arg + prev= + continue + ;; + rpath | xrpath) + # We need an absolute path. + case $arg in + [\\/]* | [A-Za-z]:[\\/]*) ;; + *) + func_fatal_error "only absolute run-paths are allowed" + ;; + esac + if test rpath = "$prev"; then + case "$rpath " in + *" $arg "*) ;; + *) func_append rpath " $arg" ;; + esac + else + case "$xrpath " in + *" $arg "*) ;; + *) func_append xrpath " $arg" ;; + esac + fi + prev= + continue + ;; + shrext) + shrext_cmds=$arg + prev= + continue + ;; + weak) + func_append weak_libs " $arg" + prev= + continue + ;; + xcclinker) + func_append linker_flags " $qarg" + func_append compiler_flags " $qarg" + prev= + func_append compile_command " $qarg" + func_append finalize_command " $qarg" + continue + ;; + xcompiler) + func_append compiler_flags " $qarg" + prev= + func_append compile_command " $qarg" + func_append finalize_command " $qarg" + continue + ;; + xlinker) + func_append linker_flags " $qarg" + func_append compiler_flags " $wl$qarg" + prev= + func_append compile_command " $wl$qarg" + func_append finalize_command " $wl$qarg" + continue + ;; + *) + eval "$prev=\"\$arg\"" + prev= + continue + ;; + esac + fi # test -n "$prev" + + prevarg=$arg + + case $arg in + -all-static) + if test -n "$link_static_flag"; then + # See comment for -static flag below, for more details. + func_append compile_command " $link_static_flag" + func_append finalize_command " $link_static_flag" + fi + continue + ;; + + -allow-undefined) + # FIXME: remove this flag sometime in the future. + func_fatal_error "'-allow-undefined' must not be used because it is the default" + ;; + + -avoid-version) + avoid_version=yes + continue + ;; + + -bindir) + prev=bindir + continue + ;; + + -dlopen) + prev=dlfiles + continue + ;; + + -dlpreopen) + prev=dlprefiles + continue + ;; + + -export-dynamic) + export_dynamic=yes + continue + ;; + + -export-symbols | -export-symbols-regex) + if test -n "$export_symbols" || test -n "$export_symbols_regex"; then + func_fatal_error "more than one -exported-symbols argument is not allowed" + fi + if test X-export-symbols = "X$arg"; then + prev=expsyms + else + prev=expsyms_regex + fi + continue + ;; + + -framework) + prev=framework + continue + ;; + + -inst-prefix-dir) + prev=inst_prefix + continue + ;; + + # The native IRIX linker understands -LANG:*, -LIST:* and -LNO:* + # so, if we see these flags be careful not to treat them like -L + -L[A-Z][A-Z]*:*) + case $with_gcc/$host in + no/*-*-irix* | /*-*-irix*) + func_append compile_command " $arg" + func_append finalize_command " $arg" + ;; + esac + continue + ;; + + -L*) + func_stripname "-L" '' "$arg" + if test -z "$func_stripname_result"; then + if test "$#" -gt 0; then + func_fatal_error "require no space between '-L' and '$1'" + else + func_fatal_error "need path for '-L' option" + fi + fi + func_resolve_sysroot "$func_stripname_result" + dir=$func_resolve_sysroot_result + # We need an absolute path. 
+ case $dir in + [\\/]* | [A-Za-z]:[\\/]*) ;; + *) + absdir=`cd "$dir" && pwd` + test -z "$absdir" && \ + func_fatal_error "cannot determine absolute directory name of '$dir'" + dir=$absdir + ;; + esac + case "$deplibs " in + *" -L$dir "* | *" $arg "*) + # Will only happen for absolute or sysroot arguments + ;; + *) + # Preserve sysroot, but never include relative directories + case $dir in + [\\/]* | [A-Za-z]:[\\/]* | =*) func_append deplibs " $arg" ;; + *) func_append deplibs " -L$dir" ;; + esac + func_append lib_search_path " $dir" + ;; + esac + case $host in + *-*-cygwin* | *-*-mingw* | *-*-pw32* | *-*-os2* | *-cegcc*) + testbindir=`$ECHO "$dir" | $SED 's*/lib$*/bin*'` + case :$dllsearchpath: in + *":$dir:"*) ;; + ::) dllsearchpath=$dir;; + *) func_append dllsearchpath ":$dir";; + esac + case :$dllsearchpath: in + *":$testbindir:"*) ;; + ::) dllsearchpath=$testbindir;; + *) func_append dllsearchpath ":$testbindir";; + esac + ;; + esac + continue + ;; + + -l*) + if test X-lc = "X$arg" || test X-lm = "X$arg"; then + case $host in + *-*-cygwin* | *-*-mingw* | *-*-pw32* | *-*-beos* | *-cegcc* | *-*-haiku*) + # These systems don't actually have a C or math library (as such) + continue + ;; + *-*-os2*) + # These systems don't actually have a C library (as such) + test X-lc = "X$arg" && continue + ;; + *-*-openbsd* | *-*-freebsd* | *-*-dragonfly* | *-*-bitrig*) + # Do not include libc due to us having libc/libc_r. + test X-lc = "X$arg" && continue + ;; + *-*-rhapsody* | *-*-darwin1.[012]) + # Rhapsody C and math libraries are in the System framework + func_append deplibs " System.ltframework" + continue + ;; + *-*-sco3.2v5* | *-*-sco5v6*) + # Causes problems with __ctype + test X-lc = "X$arg" && continue + ;; + *-*-sysv4.2uw2* | *-*-sysv5* | *-*-unixware* | *-*-OpenUNIX*) + # Compiler inserts libc in the correct place for threads to work + test X-lc = "X$arg" && continue + ;; + esac + elif test X-lc_r = "X$arg"; then + case $host in + *-*-openbsd* | *-*-freebsd* | *-*-dragonfly* | *-*-bitrig*) + # Do not include libc_r directly, use -pthread flag. + continue + ;; + esac + fi + func_append deplibs " $arg" + continue + ;; + + -mllvm) + prev=mllvm + continue + ;; + + -module) + module=yes + continue + ;; + + # Tru64 UNIX uses -model [arg] to determine the layout of C++ + # classes, name mangling, and exception handling. + # Darwin uses the -arch flag to determine output architecture. + -model|-arch|-isysroot|--sysroot) + func_append compiler_flags " $arg" + func_append compile_command " $arg" + func_append finalize_command " $arg" + prev=xcompiler + continue + ;; + + -mt|-mthreads|-kthread|-Kthread|-pthread|-pthreads|--thread-safe \ + |-threads|-fopenmp|-openmp|-mp|-xopenmp|-omp|-qsmp=*) + func_append compiler_flags " $arg" + func_append compile_command " $arg" + func_append finalize_command " $arg" + case "$new_inherited_linker_flags " in + *" $arg "*) ;; + * ) func_append new_inherited_linker_flags " $arg" ;; + esac + continue + ;; + + -multi_module) + single_module=$wl-multi_module + continue + ;; + + -no-fast-install) + fast_install=no + continue + ;; + + -no-install) + case $host in + *-*-cygwin* | *-*-mingw* | *-*-pw32* | *-*-os2* | *-*-darwin* | *-cegcc*) + # The PATH hackery in wrapper scripts is required on Windows + # and Darwin in order for the loader to find any dlls it needs. 
+ func_warning "'-no-install' is ignored for $host" + func_warning "assuming '-no-fast-install' instead" + fast_install=no + ;; + *) no_install=yes ;; + esac + continue + ;; + + -no-undefined) + allow_undefined=no + continue + ;; + + -objectlist) + prev=objectlist + continue + ;; + + -os2dllname) + prev=os2dllname + continue + ;; + + -o) prev=output ;; + + -precious-files-regex) + prev=precious_regex + continue + ;; + + -release) + prev=release + continue + ;; + + -rpath) + prev=rpath + continue + ;; + + -R) + prev=xrpath + continue + ;; + + -R*) + func_stripname '-R' '' "$arg" + dir=$func_stripname_result + # We need an absolute path. + case $dir in + [\\/]* | [A-Za-z]:[\\/]*) ;; + =*) + func_stripname '=' '' "$dir" + dir=$lt_sysroot$func_stripname_result + ;; + *) + func_fatal_error "only absolute run-paths are allowed" + ;; + esac + case "$xrpath " in + *" $dir "*) ;; + *) func_append xrpath " $dir" ;; + esac + continue + ;; + + -shared) + # The effects of -shared are defined in a previous loop. + continue + ;; + + -shrext) + prev=shrext + continue + ;; + + -static | -static-libtool-libs) + # The effects of -static are defined in a previous loop. + # We used to do the same as -all-static on platforms that + # didn't have a PIC flag, but the assumption that the effects + # would be equivalent was wrong. It would break on at least + # Digital Unix and AIX. + continue + ;; + + -thread-safe) + thread_safe=yes + continue + ;; + + -version-info) + prev=vinfo + continue + ;; + + -version-number) + prev=vinfo + vinfo_number=yes + continue + ;; + + -weak) + prev=weak + continue + ;; + + -Wc,*) + func_stripname '-Wc,' '' "$arg" + args=$func_stripname_result + arg= + save_ifs=$IFS; IFS=, + for flag in $args; do + IFS=$save_ifs + func_quote_for_eval "$flag" + func_append arg " $func_quote_for_eval_result" + func_append compiler_flags " $func_quote_for_eval_result" + done + IFS=$save_ifs + func_stripname ' ' '' "$arg" + arg=$func_stripname_result + ;; + + -Wl,*) + func_stripname '-Wl,' '' "$arg" + args=$func_stripname_result + arg= + save_ifs=$IFS; IFS=, + for flag in $args; do + IFS=$save_ifs + func_quote_for_eval "$flag" + func_append arg " $wl$func_quote_for_eval_result" + func_append compiler_flags " $wl$func_quote_for_eval_result" + func_append linker_flags " $func_quote_for_eval_result" + done + IFS=$save_ifs + func_stripname ' ' '' "$arg" + arg=$func_stripname_result + ;; + + -Xcompiler) + prev=xcompiler + continue + ;; + + -Xlinker) + prev=xlinker + continue + ;; + + -XCClinker) + prev=xcclinker + continue + ;; + + # -msg_* for osf cc + -msg_*) + func_quote_for_eval "$arg" + arg=$func_quote_for_eval_result + ;; + + # Flags to be passed through unchanged, with rationale: + # -64, -mips[0-9] enable 64-bit mode for the SGI compiler + # -r[0-9][0-9]* specify processor for the SGI compiler + # -xarch=*, -xtarget=* enable 64-bit mode for the Sun compiler + # +DA*, +DD* enable 64-bit mode for the HP compiler + # -q* compiler args for the IBM compiler + # -m*, -t[45]*, -txscale* architecture-specific flags for GCC + # -F/path path to uninstalled frameworks, gcc on darwin + # -p, -pg, --coverage, -fprofile-* profiling flags for GCC + # -fstack-protector* stack protector flags for GCC + # @file GCC response files + # -tp=* Portland pgcc target processor selection + # --sysroot=* for sysroot support + # -O*, -g*, -flto*, -fwhopr*, -fuse-linker-plugin GCC link-time optimization + # -specs=* GCC specs files + # -stdlib=* select c++ std lib with clang + # -fsanitize=* Clang/GCC memory and address 
sanitizer + -64|-mips[0-9]|-r[0-9][0-9]*|-xarch=*|-xtarget=*|+DA*|+DD*|-q*|-m*| \ + -t[45]*|-txscale*|-p|-pg|--coverage|-fprofile-*|-F*|@*|-tp=*|--sysroot=*| \ + -O*|-g*|-flto*|-fwhopr*|-fuse-linker-plugin|-fstack-protector*|-stdlib=*| \ + -specs=*|-fsanitize=*) + func_quote_for_eval "$arg" + arg=$func_quote_for_eval_result + func_append compile_command " $arg" + func_append finalize_command " $arg" + func_append compiler_flags " $arg" + continue + ;; + + -Z*) + if test os2 = "`expr $host : '.*\(os2\)'`"; then + # OS/2 uses -Zxxx to specify OS/2-specific options + compiler_flags="$compiler_flags $arg" + func_append compile_command " $arg" + func_append finalize_command " $arg" + case $arg in + -Zlinker | -Zstack) + prev=xcompiler + ;; + esac + continue + else + # Otherwise treat like 'Some other compiler flag' below + func_quote_for_eval "$arg" + arg=$func_quote_for_eval_result + fi + ;; + + # Some other compiler flag. + -* | +*) + func_quote_for_eval "$arg" + arg=$func_quote_for_eval_result + ;; + + *.$objext) + # A standard object. + func_append objs " $arg" + ;; + + *.lo) + # A libtool-controlled object. + + # Check to see that this really is a libtool object. + if func_lalib_unsafe_p "$arg"; then + pic_object= + non_pic_object= + + # Read the .lo file + func_source "$arg" + + if test -z "$pic_object" || + test -z "$non_pic_object" || + test none = "$pic_object" && + test none = "$non_pic_object"; then + func_fatal_error "cannot find name of object for '$arg'" + fi + + # Extract subdirectory from the argument. + func_dirname "$arg" "/" "" + xdir=$func_dirname_result + + test none = "$pic_object" || { + # Prepend the subdirectory the object is found in. + pic_object=$xdir$pic_object + + if test dlfiles = "$prev"; then + if test yes = "$build_libtool_libs" && test yes = "$dlopen_support"; then + func_append dlfiles " $pic_object" + prev= + continue + else + # If libtool objects are unsupported, then we need to preload. + prev=dlprefiles + fi + fi + + # CHECK ME: I think I busted this. -Ossama + if test dlprefiles = "$prev"; then + # Preload the old-style object. + func_append dlprefiles " $pic_object" + prev= + fi + + # A PIC object. + func_append libobjs " $pic_object" + arg=$pic_object + } + + # Non-PIC object. + if test none != "$non_pic_object"; then + # Prepend the subdirectory the object is found in. + non_pic_object=$xdir$non_pic_object + + # A standard non-PIC object + func_append non_pic_objects " $non_pic_object" + if test -z "$pic_object" || test none = "$pic_object"; then + arg=$non_pic_object + fi + else + # If the PIC object exists, use it instead. + # $xdir was prepended to $pic_object above. + non_pic_object=$pic_object + func_append non_pic_objects " $non_pic_object" + fi + else + # Only an error if not doing a dry-run. + if $opt_dry_run; then + # Extract subdirectory from the argument. + func_dirname "$arg" "/" "" + xdir=$func_dirname_result + + func_lo2o "$arg" + pic_object=$xdir$objdir/$func_lo2o_result + non_pic_object=$xdir$func_lo2o_result + func_append libobjs " $pic_object" + func_append non_pic_objects " $non_pic_object" + else + func_fatal_error "'$arg' is not a valid libtool object" + fi + fi + ;; + + *.$libext) + # An archive. + func_append deplibs " $arg" + func_append old_deplibs " $arg" + continue + ;; + + *.la) + # A libtool-controlled library. + + func_resolve_sysroot "$arg" + if test dlfiles = "$prev"; then + # This library was specified with -dlopen. 
+ func_append dlfiles " $func_resolve_sysroot_result" + prev= + elif test dlprefiles = "$prev"; then + # The library was specified with -dlpreopen. + func_append dlprefiles " $func_resolve_sysroot_result" + prev= + else + func_append deplibs " $func_resolve_sysroot_result" + fi + continue + ;; + + # Some other compiler argument. + *) + # Unknown arguments in both finalize_command and compile_command need + # to be aesthetically quoted because they are evaled later. + func_quote_for_eval "$arg" + arg=$func_quote_for_eval_result + ;; + esac # arg + + # Now actually substitute the argument into the commands. + if test -n "$arg"; then + func_append compile_command " $arg" + func_append finalize_command " $arg" + fi + done # argument parsing loop + + test -n "$prev" && \ + func_fatal_help "the '$prevarg' option requires an argument" + + if test yes = "$export_dynamic" && test -n "$export_dynamic_flag_spec"; then + eval arg=\"$export_dynamic_flag_spec\" + func_append compile_command " $arg" + func_append finalize_command " $arg" + fi + + oldlibs= + # calculate the name of the file, without its directory + func_basename "$output" + outputname=$func_basename_result + libobjs_save=$libobjs + + if test -n "$shlibpath_var"; then + # get the directories listed in $shlibpath_var + eval shlib_search_path=\`\$ECHO \"\$$shlibpath_var\" \| \$SED \'s/:/ /g\'\` + else + shlib_search_path= + fi + eval sys_lib_search_path=\"$sys_lib_search_path_spec\" + eval sys_lib_dlsearch_path=\"$sys_lib_dlsearch_path_spec\" + + # Definition is injected by LT_CONFIG during libtool generation. + func_munge_path_list sys_lib_dlsearch_path "$LT_SYS_LIBRARY_PATH" + + func_dirname "$output" "/" "" + output_objdir=$func_dirname_result$objdir + func_to_tool_file "$output_objdir/" + tool_output_objdir=$func_to_tool_file_result + # Create the object directory. + func_mkdir_p "$output_objdir" + + # Determine the type of output + case $output in + "") + func_fatal_help "you must specify an output file" + ;; + *.$libext) linkmode=oldlib ;; + *.lo | *.$objext) linkmode=obj ;; + *.la) linkmode=lib ;; + *) linkmode=prog ;; # Anything else should be a program. + esac + + specialdeplibs= + + libs= + # Find all interdependent deplibs by searching for libraries + # that are linked more than once (e.g. -la -lb -la) + for deplib in $deplibs; do + if $opt_preserve_dup_deps; then + case "$libs " in + *" $deplib "*) func_append specialdeplibs " $deplib" ;; + esac + fi + func_append libs " $deplib" + done + + if test lib = "$linkmode"; then + libs="$predeps $libs $compiler_lib_search_path $postdeps" + + # Compute libraries that are listed more than once in $predeps + # $postdeps and mark them as special (i.e., whose duplicates are + # not to be eliminated). 
+ pre_post_deps= + if $opt_duplicate_compiler_generated_deps; then + for pre_post_dep in $predeps $postdeps; do + case "$pre_post_deps " in + *" $pre_post_dep "*) func_append specialdeplibs " $pre_post_deps" ;; + esac + func_append pre_post_deps " $pre_post_dep" + done + fi + pre_post_deps= + fi + + deplibs= + newdependency_libs= + newlib_search_path= + need_relink=no # whether we're linking any uninstalled libtool libraries + notinst_deplibs= # not-installed libtool libraries + notinst_path= # paths that contain not-installed libtool libraries + + case $linkmode in + lib) + passes="conv dlpreopen link" + for file in $dlfiles $dlprefiles; do + case $file in + *.la) ;; + *) + func_fatal_help "libraries can '-dlopen' only libtool libraries: $file" + ;; + esac + done + ;; + prog) + compile_deplibs= + finalize_deplibs= + alldeplibs=false + newdlfiles= + newdlprefiles= + passes="conv scan dlopen dlpreopen link" + ;; + *) passes="conv" + ;; + esac + + for pass in $passes; do + # The preopen pass in lib mode reverses $deplibs; put it back here + # so that -L comes before libs that need it for instance... + if test lib,link = "$linkmode,$pass"; then + ## FIXME: Find the place where the list is rebuilt in the wrong + ## order, and fix it there properly + tmp_deplibs= + for deplib in $deplibs; do + tmp_deplibs="$deplib $tmp_deplibs" + done + deplibs=$tmp_deplibs + fi + + if test lib,link = "$linkmode,$pass" || + test prog,scan = "$linkmode,$pass"; then + libs=$deplibs + deplibs= + fi + if test prog = "$linkmode"; then + case $pass in + dlopen) libs=$dlfiles ;; + dlpreopen) libs=$dlprefiles ;; + link) + libs="$deplibs %DEPLIBS%" + test "X$link_all_deplibs" != Xno && libs="$libs $dependency_libs" + ;; + esac + fi + if test lib,dlpreopen = "$linkmode,$pass"; then + # Collect and forward deplibs of preopened libtool libs + for lib in $dlprefiles; do + # Ignore non-libtool-libs + dependency_libs= + func_resolve_sysroot "$lib" + case $lib in + *.la) func_source "$func_resolve_sysroot_result" ;; + esac + + # Collect preopened libtool deplibs, except any this library + # has declared as weak libs + for deplib in $dependency_libs; do + func_basename "$deplib" + deplib_base=$func_basename_result + case " $weak_libs " in + *" $deplib_base "*) ;; + *) func_append deplibs " $deplib" ;; + esac + done + done + libs=$dlprefiles + fi + if test dlopen = "$pass"; then + # Collect dlpreopened libraries + save_deplibs=$deplibs + deplibs= + fi + + for deplib in $libs; do + lib= + found=false + case $deplib in + -mt|-mthreads|-kthread|-Kthread|-pthread|-pthreads|--thread-safe \ + |-threads|-fopenmp|-openmp|-mp|-xopenmp|-omp|-qsmp=*) + if test prog,link = "$linkmode,$pass"; then + compile_deplibs="$deplib $compile_deplibs" + finalize_deplibs="$deplib $finalize_deplibs" + else + func_append compiler_flags " $deplib" + if test lib = "$linkmode"; then + case "$new_inherited_linker_flags " in + *" $deplib "*) ;; + * ) func_append new_inherited_linker_flags " $deplib" ;; + esac + fi + fi + continue + ;; + -l*) + if test lib != "$linkmode" && test prog != "$linkmode"; then + func_warning "'-l' is ignored for archives/objects" + continue + fi + func_stripname '-l' '' "$deplib" + name=$func_stripname_result + if test lib = "$linkmode"; then + searchdirs="$newlib_search_path $lib_search_path $compiler_lib_search_dirs $sys_lib_search_path $shlib_search_path" + else + searchdirs="$newlib_search_path $lib_search_path $sys_lib_search_path $shlib_search_path" + fi + for searchdir in $searchdirs; do + for search_ext in .la $std_shrext 
.so .a; do + # Search the libtool library + lib=$searchdir/lib$name$search_ext + if test -f "$lib"; then + if test .la = "$search_ext"; then + found=: + else + found=false + fi + break 2 + fi + done + done + if $found; then + # deplib is a libtool library + # If $allow_libtool_libs_with_static_runtimes && $deplib is a stdlib, + # We need to do some special things here, and not later. + if test yes = "$allow_libtool_libs_with_static_runtimes"; then + case " $predeps $postdeps " in + *" $deplib "*) + if func_lalib_p "$lib"; then + library_names= + old_library= + func_source "$lib" + for l in $old_library $library_names; do + ll=$l + done + if test "X$ll" = "X$old_library"; then # only static version available + found=false + func_dirname "$lib" "" "." + ladir=$func_dirname_result + lib=$ladir/$old_library + if test prog,link = "$linkmode,$pass"; then + compile_deplibs="$deplib $compile_deplibs" + finalize_deplibs="$deplib $finalize_deplibs" + else + deplibs="$deplib $deplibs" + test lib = "$linkmode" && newdependency_libs="$deplib $newdependency_libs" + fi + continue + fi + fi + ;; + *) ;; + esac + fi + else + # deplib doesn't seem to be a libtool library + if test prog,link = "$linkmode,$pass"; then + compile_deplibs="$deplib $compile_deplibs" + finalize_deplibs="$deplib $finalize_deplibs" + else + deplibs="$deplib $deplibs" + test lib = "$linkmode" && newdependency_libs="$deplib $newdependency_libs" + fi + continue + fi + ;; # -l + *.ltframework) + if test prog,link = "$linkmode,$pass"; then + compile_deplibs="$deplib $compile_deplibs" + finalize_deplibs="$deplib $finalize_deplibs" + else + deplibs="$deplib $deplibs" + if test lib = "$linkmode"; then + case "$new_inherited_linker_flags " in + *" $deplib "*) ;; + * ) func_append new_inherited_linker_flags " $deplib" ;; + esac + fi + fi + continue + ;; + -L*) + case $linkmode in + lib) + deplibs="$deplib $deplibs" + test conv = "$pass" && continue + newdependency_libs="$deplib $newdependency_libs" + func_stripname '-L' '' "$deplib" + func_resolve_sysroot "$func_stripname_result" + func_append newlib_search_path " $func_resolve_sysroot_result" + ;; + prog) + if test conv = "$pass"; then + deplibs="$deplib $deplibs" + continue + fi + if test scan = "$pass"; then + deplibs="$deplib $deplibs" + else + compile_deplibs="$deplib $compile_deplibs" + finalize_deplibs="$deplib $finalize_deplibs" + fi + func_stripname '-L' '' "$deplib" + func_resolve_sysroot "$func_stripname_result" + func_append newlib_search_path " $func_resolve_sysroot_result" + ;; + *) + func_warning "'-L' is ignored for archives/objects" + ;; + esac # linkmode + continue + ;; # -L + -R*) + if test link = "$pass"; then + func_stripname '-R' '' "$deplib" + func_resolve_sysroot "$func_stripname_result" + dir=$func_resolve_sysroot_result + # Make sure the xrpath contains only unique directories. + case "$xrpath " in + *" $dir "*) ;; + *) func_append xrpath " $dir" ;; + esac + fi + deplibs="$deplib $deplibs" + continue + ;; + *.la) + func_resolve_sysroot "$deplib" + lib=$func_resolve_sysroot_result + ;; + *.$libext) + if test conv = "$pass"; then + deplibs="$deplib $deplibs" + continue + fi + case $linkmode in + lib) + # Linking convenience modules into shared libraries is allowed, + # but linking other static libraries is non-portable. 
+ case " $dlpreconveniencelibs " in + *" $deplib "*) ;; + *) + valid_a_lib=false + case $deplibs_check_method in + match_pattern*) + set dummy $deplibs_check_method; shift + match_pattern_regex=`expr "$deplibs_check_method" : "$1 \(.*\)"` + if eval "\$ECHO \"$deplib\"" 2>/dev/null | $SED 10q \ + | $EGREP "$match_pattern_regex" > /dev/null; then + valid_a_lib=: + fi + ;; + pass_all) + valid_a_lib=: + ;; + esac + if $valid_a_lib; then + echo + $ECHO "*** Warning: Linking the shared library $output against the" + $ECHO "*** static library $deplib is not portable!" + deplibs="$deplib $deplibs" + else + echo + $ECHO "*** Warning: Trying to link with static lib archive $deplib." + echo "*** I have the capability to make that library automatically link in when" + echo "*** you link to this library. But I can only do this if you have a" + echo "*** shared version of the library, which you do not appear to have" + echo "*** because the file extensions .$libext of this argument makes me believe" + echo "*** that it is just a static archive that I should not use here." + fi + ;; + esac + continue + ;; + prog) + if test link != "$pass"; then + deplibs="$deplib $deplibs" + else + compile_deplibs="$deplib $compile_deplibs" + finalize_deplibs="$deplib $finalize_deplibs" + fi + continue + ;; + esac # linkmode + ;; # *.$libext + *.lo | *.$objext) + if test conv = "$pass"; then + deplibs="$deplib $deplibs" + elif test prog = "$linkmode"; then + if test dlpreopen = "$pass" || test yes != "$dlopen_support" || test no = "$build_libtool_libs"; then + # If there is no dlopen support or we're linking statically, + # we need to preload. + func_append newdlprefiles " $deplib" + compile_deplibs="$deplib $compile_deplibs" + finalize_deplibs="$deplib $finalize_deplibs" + else + func_append newdlfiles " $deplib" + fi + fi + continue + ;; + %DEPLIBS%) + alldeplibs=: + continue + ;; + esac # case $deplib + + $found || test -f "$lib" \ + || func_fatal_error "cannot find the library '$lib' or unhandled argument '$deplib'" + + # Check to see that this really is a libtool archive. + func_lalib_unsafe_p "$lib" \ + || func_fatal_error "'$lib' is not a valid libtool archive" + + func_dirname "$lib" "" "." 
+ ladir=$func_dirname_result + + dlname= + dlopen= + dlpreopen= + libdir= + library_names= + old_library= + inherited_linker_flags= + # If the library was installed with an old release of libtool, + # it will not redefine variables installed, or shouldnotlink + installed=yes + shouldnotlink=no + avoidtemprpath= + + + # Read the .la file + func_source "$lib" + + # Convert "-framework foo" to "foo.ltframework" + if test -n "$inherited_linker_flags"; then + tmp_inherited_linker_flags=`$ECHO "$inherited_linker_flags" | $SED 's/-framework \([^ $]*\)/\1.ltframework/g'` + for tmp_inherited_linker_flag in $tmp_inherited_linker_flags; do + case " $new_inherited_linker_flags " in + *" $tmp_inherited_linker_flag "*) ;; + *) func_append new_inherited_linker_flags " $tmp_inherited_linker_flag";; + esac + done + fi + dependency_libs=`$ECHO " $dependency_libs" | $SED 's% \([^ $]*\).ltframework% -framework \1%g'` + if test lib,link = "$linkmode,$pass" || + test prog,scan = "$linkmode,$pass" || + { test prog != "$linkmode" && test lib != "$linkmode"; }; then + test -n "$dlopen" && func_append dlfiles " $dlopen" + test -n "$dlpreopen" && func_append dlprefiles " $dlpreopen" + fi + + if test conv = "$pass"; then + # Only check for convenience libraries + deplibs="$lib $deplibs" + if test -z "$libdir"; then + if test -z "$old_library"; then + func_fatal_error "cannot find name of link library for '$lib'" + fi + # It is a libtool convenience library, so add in its objects. + func_append convenience " $ladir/$objdir/$old_library" + func_append old_convenience " $ladir/$objdir/$old_library" + tmp_libs= + for deplib in $dependency_libs; do + deplibs="$deplib $deplibs" + if $opt_preserve_dup_deps; then + case "$tmp_libs " in + *" $deplib "*) func_append specialdeplibs " $deplib" ;; + esac + fi + func_append tmp_libs " $deplib" + done + elif test prog != "$linkmode" && test lib != "$linkmode"; then + func_fatal_error "'$lib' is not a convenience library" + fi + continue + fi # $pass = conv + + + # Get the name of the library we link against. + linklib= + if test -n "$old_library" && + { test yes = "$prefer_static_libs" || + test built,no = "$prefer_static_libs,$installed"; }; then + linklib=$old_library + else + for l in $old_library $library_names; do + linklib=$l + done + fi + if test -z "$linklib"; then + func_fatal_error "cannot find name of link library for '$lib'" + fi + + # This library was specified with -dlopen. + if test dlopen = "$pass"; then + test -z "$libdir" \ + && func_fatal_error "cannot -dlopen a convenience library: '$lib'" + if test -z "$dlname" || + test yes != "$dlopen_support" || + test no = "$build_libtool_libs" + then + # If there is no dlname, no dlopen support or we're linking + # statically, we need to preload. We also need to preload any + # dependent libraries so libltdl's deplib preloader doesn't + # bomb out in the load deplibs phase. + func_append dlprefiles " $lib $dependency_libs" + else + func_append newdlfiles " $lib" + fi + continue + fi # $pass = dlopen + + # We need an absolute path. + case $ladir in + [\\/]* | [A-Za-z]:[\\/]*) abs_ladir=$ladir ;; + *) + abs_ladir=`cd "$ladir" && pwd` + if test -z "$abs_ladir"; then + func_warning "cannot determine absolute directory name of '$ladir'" + func_warning "passing it literally to the linker, although it might fail" + abs_ladir=$ladir + fi + ;; + esac + func_basename "$lib" + laname=$func_basename_result + + # Find the relevant object directory and library name. + if test yes = "$installed"; then + if test ! 
-f "$lt_sysroot$libdir/$linklib" && test -f "$abs_ladir/$linklib"; then + func_warning "library '$lib' was moved." + dir=$ladir + absdir=$abs_ladir + libdir=$abs_ladir + else + dir=$lt_sysroot$libdir + absdir=$lt_sysroot$libdir + fi + test yes = "$hardcode_automatic" && avoidtemprpath=yes + else + if test ! -f "$ladir/$objdir/$linklib" && test -f "$abs_ladir/$linklib"; then + dir=$ladir + absdir=$abs_ladir + # Remove this search path later + func_append notinst_path " $abs_ladir" + else + dir=$ladir/$objdir + absdir=$abs_ladir/$objdir + # Remove this search path later + func_append notinst_path " $abs_ladir" + fi + fi # $installed = yes + func_stripname 'lib' '.la' "$laname" + name=$func_stripname_result + + # This library was specified with -dlpreopen. + if test dlpreopen = "$pass"; then + if test -z "$libdir" && test prog = "$linkmode"; then + func_fatal_error "only libraries may -dlpreopen a convenience library: '$lib'" + fi + case $host in + # special handling for platforms with PE-DLLs. + *cygwin* | *mingw* | *cegcc* ) + # Linker will automatically link against shared library if both + # static and shared are present. Therefore, ensure we extract + # symbols from the import library if a shared library is present + # (otherwise, the dlopen module name will be incorrect). We do + # this by putting the import library name into $newdlprefiles. + # We recover the dlopen module name by 'saving' the la file + # name in a special purpose variable, and (later) extracting the + # dlname from the la file. + if test -n "$dlname"; then + func_tr_sh "$dir/$linklib" + eval "libfile_$func_tr_sh_result=\$abs_ladir/\$laname" + func_append newdlprefiles " $dir/$linklib" + else + func_append newdlprefiles " $dir/$old_library" + # Keep a list of preopened convenience libraries to check + # that they are being used correctly in the link pass. + test -z "$libdir" && \ + func_append dlpreconveniencelibs " $dir/$old_library" + fi + ;; + * ) + # Prefer using a static library (so that no silly _DYNAMIC symbols + # are required to link). + if test -n "$old_library"; then + func_append newdlprefiles " $dir/$old_library" + # Keep a list of preopened convenience libraries to check + # that they are being used correctly in the link pass. + test -z "$libdir" && \ + func_append dlpreconveniencelibs " $dir/$old_library" + # Otherwise, use the dlname, so that lt_dlopen finds it. + elif test -n "$dlname"; then + func_append newdlprefiles " $dir/$dlname" + else + func_append newdlprefiles " $dir/$linklib" + fi + ;; + esac + fi # $pass = dlpreopen + + if test -z "$libdir"; then + # Link the convenience library + if test lib = "$linkmode"; then + deplibs="$dir/$old_library $deplibs" + elif test prog,link = "$linkmode,$pass"; then + compile_deplibs="$dir/$old_library $compile_deplibs" + finalize_deplibs="$dir/$old_library $finalize_deplibs" + else + deplibs="$lib $deplibs" # used for prog,scan pass + fi + continue + fi + + + if test prog = "$linkmode" && test link != "$pass"; then + func_append newlib_search_path " $ladir" + deplibs="$lib $deplibs" + + linkalldeplibs=false + if test no != "$link_all_deplibs" || test -z "$library_names" || + test no = "$build_libtool_libs"; then + linkalldeplibs=: + fi + + tmp_libs= + for deplib in $dependency_libs; do + case $deplib in + -L*) func_stripname '-L' '' "$deplib" + func_resolve_sysroot "$func_stripname_result" + func_append newlib_search_path " $func_resolve_sysroot_result" + ;; + esac + # Need to link against all dependency_libs? 
+ if $linkalldeplibs; then + deplibs="$deplib $deplibs" + else + # Need to hardcode shared library paths + # or/and link against static libraries + newdependency_libs="$deplib $newdependency_libs" + fi + if $opt_preserve_dup_deps; then + case "$tmp_libs " in + *" $deplib "*) func_append specialdeplibs " $deplib" ;; + esac + fi + func_append tmp_libs " $deplib" + done # for deplib + continue + fi # $linkmode = prog... + + if test prog,link = "$linkmode,$pass"; then + if test -n "$library_names" && + { { test no = "$prefer_static_libs" || + test built,yes = "$prefer_static_libs,$installed"; } || + test -z "$old_library"; }; then + # We need to hardcode the library path + if test -n "$shlibpath_var" && test -z "$avoidtemprpath"; then + # Make sure the rpath contains only unique directories. + case $temp_rpath: in + *"$absdir:"*) ;; + *) func_append temp_rpath "$absdir:" ;; + esac + fi + + # Hardcode the library path. + # Skip directories that are in the system default run-time + # search path. + case " $sys_lib_dlsearch_path " in + *" $absdir "*) ;; + *) + case "$compile_rpath " in + *" $absdir "*) ;; + *) func_append compile_rpath " $absdir" ;; + esac + ;; + esac + case " $sys_lib_dlsearch_path " in + *" $libdir "*) ;; + *) + case "$finalize_rpath " in + *" $libdir "*) ;; + *) func_append finalize_rpath " $libdir" ;; + esac + ;; + esac + fi # $linkmode,$pass = prog,link... + + if $alldeplibs && + { test pass_all = "$deplibs_check_method" || + { test yes = "$build_libtool_libs" && + test -n "$library_names"; }; }; then + # We only need to search for static libraries + continue + fi + fi + + link_static=no # Whether the deplib will be linked statically + use_static_libs=$prefer_static_libs + if test built = "$use_static_libs" && test yes = "$installed"; then + use_static_libs=no + fi + if test -n "$library_names" && + { test no = "$use_static_libs" || test -z "$old_library"; }; then + case $host in + *cygwin* | *mingw* | *cegcc* | *os2*) + # No point in relinking DLLs because paths are not encoded + func_append notinst_deplibs " $lib" + need_relink=no + ;; + *) + if test no = "$installed"; then + func_append notinst_deplibs " $lib" + need_relink=yes + fi + ;; + esac + # This is a shared library + + # Warn about portability, can't link against -module's on some + # systems (darwin). Don't bleat about dlopened modules though! + dlopenmodule= + for dlpremoduletest in $dlprefiles; do + if test "X$dlpremoduletest" = "X$lib"; then + dlopenmodule=$dlpremoduletest + break + fi + done + if test -z "$dlopenmodule" && test yes = "$shouldnotlink" && test link = "$pass"; then + echo + if test prog = "$linkmode"; then + $ECHO "*** Warning: Linking the executable $output against the loadable module" + else + $ECHO "*** Warning: Linking the shared library $output against the loadable module" + fi + $ECHO "*** $linklib is not portable!" + fi + if test lib = "$linkmode" && + test yes = "$hardcode_into_libs"; then + # Hardcode the library path. + # Skip directories that are in the system default run-time + # search path. 
+ case " $sys_lib_dlsearch_path " in + *" $absdir "*) ;; + *) + case "$compile_rpath " in + *" $absdir "*) ;; + *) func_append compile_rpath " $absdir" ;; + esac + ;; + esac + case " $sys_lib_dlsearch_path " in + *" $libdir "*) ;; + *) + case "$finalize_rpath " in + *" $libdir "*) ;; + *) func_append finalize_rpath " $libdir" ;; + esac + ;; + esac + fi + + if test -n "$old_archive_from_expsyms_cmds"; then + # figure out the soname + set dummy $library_names + shift + realname=$1 + shift + libname=`eval "\\$ECHO \"$libname_spec\""` + # use dlname if we got it. it's perfectly good, no? + if test -n "$dlname"; then + soname=$dlname + elif test -n "$soname_spec"; then + # bleh windows + case $host in + *cygwin* | mingw* | *cegcc* | *os2*) + func_arith $current - $age + major=$func_arith_result + versuffix=-$major + ;; + esac + eval soname=\"$soname_spec\" + else + soname=$realname + fi + + # Make a new name for the extract_expsyms_cmds to use + soroot=$soname + func_basename "$soroot" + soname=$func_basename_result + func_stripname 'lib' '.dll' "$soname" + newlib=libimp-$func_stripname_result.a + + # If the library has no export list, then create one now + if test -f "$output_objdir/$soname-def"; then : + else + func_verbose "extracting exported symbol list from '$soname'" + func_execute_cmds "$extract_expsyms_cmds" 'exit $?' + fi + + # Create $newlib + if test -f "$output_objdir/$newlib"; then :; else + func_verbose "generating import library for '$soname'" + func_execute_cmds "$old_archive_from_expsyms_cmds" 'exit $?' + fi + # make sure the library variables are pointing to the new library + dir=$output_objdir + linklib=$newlib + fi # test -n "$old_archive_from_expsyms_cmds" + + if test prog = "$linkmode" || test relink != "$opt_mode"; then + add_shlibpath= + add_dir= + add= + lib_linked=yes + case $hardcode_action in + immediate | unsupported) + if test no = "$hardcode_direct"; then + add=$dir/$linklib + case $host in + *-*-sco3.2v5.0.[024]*) add_dir=-L$dir ;; + *-*-sysv4*uw2*) add_dir=-L$dir ;; + *-*-sysv5OpenUNIX* | *-*-sysv5UnixWare7.[01].[10]* | \ + *-*-unixware7*) add_dir=-L$dir ;; + *-*-darwin* ) + # if the lib is a (non-dlopened) module then we cannot + # link against it, someone is ignoring the earlier warnings + if /usr/bin/file -L $add 2> /dev/null | + $GREP ": [^:]* bundle" >/dev/null; then + if test "X$dlopenmodule" != "X$lib"; then + $ECHO "*** Warning: lib $linklib is a module, not a shared library" + if test -z "$old_library"; then + echo + echo "*** And there doesn't seem to be a static archive available" + echo "*** The link will probably fail, sorry" + else + add=$dir/$old_library + fi + elif test -n "$old_library"; then + add=$dir/$old_library + fi + fi + esac + elif test no = "$hardcode_minus_L"; then + case $host in + *-*-sunos*) add_shlibpath=$dir ;; + esac + add_dir=-L$dir + add=-l$name + elif test no = "$hardcode_shlibpath_var"; then + add_shlibpath=$dir + add=-l$name + else + lib_linked=no + fi + ;; + relink) + if test yes = "$hardcode_direct" && + test no = "$hardcode_direct_absolute"; then + add=$dir/$linklib + elif test yes = "$hardcode_minus_L"; then + add_dir=-L$absdir + # Try looking first in the location we're being installed to. 
+ if test -n "$inst_prefix_dir"; then + case $libdir in + [\\/]*) + func_append add_dir " -L$inst_prefix_dir$libdir" + ;; + esac + fi + add=-l$name + elif test yes = "$hardcode_shlibpath_var"; then + add_shlibpath=$dir + add=-l$name + else + lib_linked=no + fi + ;; + *) lib_linked=no ;; + esac + + if test yes != "$lib_linked"; then + func_fatal_configuration "unsupported hardcode properties" + fi + + if test -n "$add_shlibpath"; then + case :$compile_shlibpath: in + *":$add_shlibpath:"*) ;; + *) func_append compile_shlibpath "$add_shlibpath:" ;; + esac + fi + if test prog = "$linkmode"; then + test -n "$add_dir" && compile_deplibs="$add_dir $compile_deplibs" + test -n "$add" && compile_deplibs="$add $compile_deplibs" + else + test -n "$add_dir" && deplibs="$add_dir $deplibs" + test -n "$add" && deplibs="$add $deplibs" + if test yes != "$hardcode_direct" && + test yes != "$hardcode_minus_L" && + test yes = "$hardcode_shlibpath_var"; then + case :$finalize_shlibpath: in + *":$libdir:"*) ;; + *) func_append finalize_shlibpath "$libdir:" ;; + esac + fi + fi + fi + + if test prog = "$linkmode" || test relink = "$opt_mode"; then + add_shlibpath= + add_dir= + add= + # Finalize command for both is simple: just hardcode it. + if test yes = "$hardcode_direct" && + test no = "$hardcode_direct_absolute"; then + add=$libdir/$linklib + elif test yes = "$hardcode_minus_L"; then + add_dir=-L$libdir + add=-l$name + elif test yes = "$hardcode_shlibpath_var"; then + case :$finalize_shlibpath: in + *":$libdir:"*) ;; + *) func_append finalize_shlibpath "$libdir:" ;; + esac + add=-l$name + elif test yes = "$hardcode_automatic"; then + if test -n "$inst_prefix_dir" && + test -f "$inst_prefix_dir$libdir/$linklib"; then + add=$inst_prefix_dir$libdir/$linklib + else + add=$libdir/$linklib + fi + else + # We cannot seem to hardcode it, guess we'll fake it. + add_dir=-L$libdir + # Try looking first in the location we're being installed to. + if test -n "$inst_prefix_dir"; then + case $libdir in + [\\/]*) + func_append add_dir " -L$inst_prefix_dir$libdir" + ;; + esac + fi + add=-l$name + fi + + if test prog = "$linkmode"; then + test -n "$add_dir" && finalize_deplibs="$add_dir $finalize_deplibs" + test -n "$add" && finalize_deplibs="$add $finalize_deplibs" + else + test -n "$add_dir" && deplibs="$add_dir $deplibs" + test -n "$add" && deplibs="$add $deplibs" + fi + fi + elif test prog = "$linkmode"; then + # Here we assume that one of hardcode_direct or hardcode_minus_L + # is not unsupported. This is valid on all known static and + # shared platforms. + if test unsupported != "$hardcode_direct"; then + test -n "$old_library" && linklib=$old_library + compile_deplibs="$dir/$linklib $compile_deplibs" + finalize_deplibs="$dir/$linklib $finalize_deplibs" + else + compile_deplibs="-l$name -L$dir $compile_deplibs" + finalize_deplibs="-l$name -L$dir $finalize_deplibs" + fi + elif test yes = "$build_libtool_libs"; then + # Not a shared library + if test pass_all != "$deplibs_check_method"; then + # We're trying link a shared library against a static one + # but the system doesn't support it. + + # Just print a warning and add the library to dependency_libs so + # that the program can be linked against the static library. + echo + $ECHO "*** Warning: This system cannot link to static lib archive $lib." + echo "*** I have the capability to make that library automatically link in when" + echo "*** you link to this library. 
But I can only do this if you have a" + echo "*** shared version of the library, which you do not appear to have." + if test yes = "$module"; then + echo "*** But as you try to build a module library, libtool will still create " + echo "*** a static module, that should work as long as the dlopening application" + echo "*** is linked with the -dlopen flag to resolve symbols at runtime." + if test -z "$global_symbol_pipe"; then + echo + echo "*** However, this would only work if libtool was able to extract symbol" + echo "*** lists from a program, using 'nm' or equivalent, but libtool could" + echo "*** not find such a program. So, this module is probably useless." + echo "*** 'nm' from GNU binutils and a full rebuild may help." + fi + if test no = "$build_old_libs"; then + build_libtool_libs=module + build_old_libs=yes + else + build_libtool_libs=no + fi + fi + else + deplibs="$dir/$old_library $deplibs" + link_static=yes + fi + fi # link shared/static library? + + if test lib = "$linkmode"; then + if test -n "$dependency_libs" && + { test yes != "$hardcode_into_libs" || + test yes = "$build_old_libs" || + test yes = "$link_static"; }; then + # Extract -R from dependency_libs + temp_deplibs= + for libdir in $dependency_libs; do + case $libdir in + -R*) func_stripname '-R' '' "$libdir" + temp_xrpath=$func_stripname_result + case " $xrpath " in + *" $temp_xrpath "*) ;; + *) func_append xrpath " $temp_xrpath";; + esac;; + *) func_append temp_deplibs " $libdir";; + esac + done + dependency_libs=$temp_deplibs + fi + + func_append newlib_search_path " $absdir" + # Link against this library + test no = "$link_static" && newdependency_libs="$abs_ladir/$laname $newdependency_libs" + # ... and its dependency_libs + tmp_libs= + for deplib in $dependency_libs; do + newdependency_libs="$deplib $newdependency_libs" + case $deplib in + -L*) func_stripname '-L' '' "$deplib" + func_resolve_sysroot "$func_stripname_result";; + *) func_resolve_sysroot "$deplib" ;; + esac + if $opt_preserve_dup_deps; then + case "$tmp_libs " in + *" $func_resolve_sysroot_result "*) + func_append specialdeplibs " $func_resolve_sysroot_result" ;; + esac + fi + func_append tmp_libs " $func_resolve_sysroot_result" + done + + if test no != "$link_all_deplibs"; then + # Add the search paths of all dependency libraries + for deplib in $dependency_libs; do + path= + case $deplib in + -L*) path=$deplib ;; + *.la) + func_resolve_sysroot "$deplib" + deplib=$func_resolve_sysroot_result + func_dirname "$deplib" "" "." + dir=$func_dirname_result + # We need an absolute path. 
+ case $dir in + [\\/]* | [A-Za-z]:[\\/]*) absdir=$dir ;; + *) + absdir=`cd "$dir" && pwd` + if test -z "$absdir"; then + func_warning "cannot determine absolute directory name of '$dir'" + absdir=$dir + fi + ;; + esac + if $GREP "^installed=no" $deplib > /dev/null; then + case $host in + *-*-darwin*) + depdepl= + eval deplibrary_names=`$SED -n -e 's/^library_names=\(.*\)$/\1/p' $deplib` + if test -n "$deplibrary_names"; then + for tmp in $deplibrary_names; do + depdepl=$tmp + done + if test -f "$absdir/$objdir/$depdepl"; then + depdepl=$absdir/$objdir/$depdepl + darwin_install_name=`$OTOOL -L $depdepl | awk '{if (NR == 2) {print $1;exit}}'` + if test -z "$darwin_install_name"; then + darwin_install_name=`$OTOOL64 -L $depdepl | awk '{if (NR == 2) {print $1;exit}}'` + fi + func_append compiler_flags " $wl-dylib_file $wl$darwin_install_name:$depdepl" + func_append linker_flags " -dylib_file $darwin_install_name:$depdepl" + path= + fi + fi + ;; + *) + path=-L$absdir/$objdir + ;; + esac + else + eval libdir=`$SED -n -e 's/^libdir=\(.*\)$/\1/p' $deplib` + test -z "$libdir" && \ + func_fatal_error "'$deplib' is not a valid libtool archive" + test "$absdir" != "$libdir" && \ + func_warning "'$deplib' seems to be moved" + + path=-L$absdir + fi + ;; + esac + case " $deplibs " in + *" $path "*) ;; + *) deplibs="$path $deplibs" ;; + esac + done + fi # link_all_deplibs != no + fi # linkmode = lib + done # for deplib in $libs + if test link = "$pass"; then + if test prog = "$linkmode"; then + compile_deplibs="$new_inherited_linker_flags $compile_deplibs" + finalize_deplibs="$new_inherited_linker_flags $finalize_deplibs" + else + compiler_flags="$compiler_flags "`$ECHO " $new_inherited_linker_flags" | $SED 's% \([^ $]*\).ltframework% -framework \1%g'` + fi + fi + dependency_libs=$newdependency_libs + if test dlpreopen = "$pass"; then + # Link the dlpreopened libraries before other libraries + for deplib in $save_deplibs; do + deplibs="$deplib $deplibs" + done + fi + if test dlopen != "$pass"; then + test conv = "$pass" || { + # Make sure lib_search_path contains only unique directories. + lib_search_path= + for dir in $newlib_search_path; do + case "$lib_search_path " in + *" $dir "*) ;; + *) func_append lib_search_path " $dir" ;; + esac + done + newlib_search_path= + } + + if test prog,link = "$linkmode,$pass"; then + vars="compile_deplibs finalize_deplibs" + else + vars=deplibs + fi + for var in $vars dependency_libs; do + # Add libraries to $var in reverse order + eval tmp_libs=\"\$$var\" + new_libs= + for deplib in $tmp_libs; do + # FIXME: Pedantically, this is the right thing to do, so + # that some nasty dependency loop isn't accidentally + # broken: + #new_libs="$deplib $new_libs" + # Pragmatically, this seems to cause very few problems in + # practice: + case $deplib in + -L*) new_libs="$deplib $new_libs" ;; + -R*) ;; + *) + # And here is the reason: when a library appears more + # than once as an explicit dependence of a library, or + # is implicitly linked in more than once by the + # compiler, it is considered special, and multiple + # occurrences thereof are not removed. Compare this + # with having the same library being listed as a + # dependency of multiple other libraries: in this case, + # we know (pedantically, we assume) the library does not + # need to be listed more than once, so we keep only the + # last copy. 
This is not always right, but it is rare + # enough that we require users that really mean to play + # such unportable linking tricks to link the library + # using -Wl,-lname, so that libtool does not consider it + # for duplicate removal. + case " $specialdeplibs " in + *" $deplib "*) new_libs="$deplib $new_libs" ;; + *) + case " $new_libs " in + *" $deplib "*) ;; + *) new_libs="$deplib $new_libs" ;; + esac + ;; + esac + ;; + esac + done + tmp_libs= + for deplib in $new_libs; do + case $deplib in + -L*) + case " $tmp_libs " in + *" $deplib "*) ;; + *) func_append tmp_libs " $deplib" ;; + esac + ;; + *) func_append tmp_libs " $deplib" ;; + esac + done + eval $var=\"$tmp_libs\" + done # for var + fi + + # Add Sun CC postdeps if required: + test CXX = "$tagname" && { + case $host_os in + linux*) + case `$CC -V 2>&1 | sed 5q` in + *Sun\ C*) # Sun C++ 5.9 + func_suncc_cstd_abi + + if test no != "$suncc_use_cstd_abi"; then + func_append postdeps ' -library=Cstd -library=Crun' + fi + ;; + esac + ;; + + solaris*) + func_cc_basename "$CC" + case $func_cc_basename_result in + CC* | sunCC*) + func_suncc_cstd_abi + + if test no != "$suncc_use_cstd_abi"; then + func_append postdeps ' -library=Cstd -library=Crun' + fi + ;; + esac + ;; + esac + } + + # Last step: remove runtime libs from dependency_libs + # (they stay in deplibs) + tmp_libs= + for i in $dependency_libs; do + case " $predeps $postdeps $compiler_lib_search_path " in + *" $i "*) + i= + ;; + esac + if test -n "$i"; then + func_append tmp_libs " $i" + fi + done + dependency_libs=$tmp_libs + done # for pass + if test prog = "$linkmode"; then + dlfiles=$newdlfiles + fi + if test prog = "$linkmode" || test lib = "$linkmode"; then + dlprefiles=$newdlprefiles + fi + + case $linkmode in + oldlib) + if test -n "$dlfiles$dlprefiles" || test no != "$dlself"; then + func_warning "'-dlopen' is ignored for archives" + fi + + case " $deplibs" in + *\ -l* | *\ -L*) + func_warning "'-l' and '-L' are ignored for archives" ;; + esac + + test -n "$rpath" && \ + func_warning "'-rpath' is ignored for archives" + + test -n "$xrpath" && \ + func_warning "'-R' is ignored for archives" + + test -n "$vinfo" && \ + func_warning "'-version-info/-version-number' is ignored for archives" + + test -n "$release" && \ + func_warning "'-release' is ignored for archives" + + test -n "$export_symbols$export_symbols_regex" && \ + func_warning "'-export-symbols' is ignored for archives" + + # Now set the variables for building old libraries. + build_libtool_libs=no + oldlibs=$output + func_append objs "$old_deplibs" + ;; + + lib) + # Make sure we only generate libraries of the form 'libNAME.la'. 
+ case $outputname in + lib*) + func_stripname 'lib' '.la' "$outputname" + name=$func_stripname_result + eval shared_ext=\"$shrext_cmds\" + eval libname=\"$libname_spec\" + ;; + *) + test no = "$module" \ + && func_fatal_help "libtool library '$output' must begin with 'lib'" + + if test no != "$need_lib_prefix"; then + # Add the "lib" prefix for modules if required + func_stripname '' '.la' "$outputname" + name=$func_stripname_result + eval shared_ext=\"$shrext_cmds\" + eval libname=\"$libname_spec\" + else + func_stripname '' '.la' "$outputname" + libname=$func_stripname_result + fi + ;; + esac + + if test -n "$objs"; then + if test pass_all != "$deplibs_check_method"; then + func_fatal_error "cannot build libtool library '$output' from non-libtool objects on this host:$objs" + else + echo + $ECHO "*** Warning: Linking the shared library $output against the non-libtool" + $ECHO "*** objects $objs is not portable!" + func_append libobjs " $objs" + fi + fi + + test no = "$dlself" \ + || func_warning "'-dlopen self' is ignored for libtool libraries" + + set dummy $rpath + shift + test 1 -lt "$#" \ + && func_warning "ignoring multiple '-rpath's for a libtool library" + + install_libdir=$1 + + oldlibs= + if test -z "$rpath"; then + if test yes = "$build_libtool_libs"; then + # Building a libtool convenience library. + # Some compilers have problems with a '.al' extension so + # convenience libraries should have the same extension an + # archive normally would. + oldlibs="$output_objdir/$libname.$libext $oldlibs" + build_libtool_libs=convenience + build_old_libs=yes + fi + + test -n "$vinfo" && \ + func_warning "'-version-info/-version-number' is ignored for convenience libraries" + + test -n "$release" && \ + func_warning "'-release' is ignored for convenience libraries" + else + + # Parse the version information argument. + save_ifs=$IFS; IFS=: + set dummy $vinfo 0 0 0 + shift + IFS=$save_ifs + + test -n "$7" && \ + func_fatal_help "too many parameters to '-version-info'" + + # convert absolute version numbers to libtool ages + # this retains compatibility with .la files and attempts + # to make the code below a bit more comprehensible + + case $vinfo_number in + yes) + number_major=$1 + number_minor=$2 + number_revision=$3 + # + # There are really only two kinds -- those that + # use the current revision as the major version + # and those that subtract age and use age as + # a minor version. But, then there is irix + # that has an extra 1 added just for fun + # + case $version_type in + # correct linux to gnu/linux during the next big refactor + darwin|freebsd-elf|linux|osf|windows|none) + func_arith $number_major + $number_minor + current=$func_arith_result + age=$number_minor + revision=$number_revision + ;; + freebsd-aout|qnx|sunos) + current=$number_major + revision=$number_minor + age=0 + ;; + irix|nonstopux) + func_arith $number_major + $number_minor + current=$func_arith_result + age=$number_minor + revision=$number_minor + lt_irix_increment=no + ;; + *) + func_fatal_configuration "$modename: unknown library version type '$version_type'" + ;; + esac + ;; + no) + current=$1 + revision=$2 + age=$3 + ;; + esac + + # Check that each of the things are valid numbers. 
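+ # Note (illustrative; not part of the upstream ltmain.sh): with
+ # vinfo_number=no, '-version-info 3:0:1' sets current=3, revision=0, age=1.
+ # On a linux/gnu host the arithmetic below then gives major=.2 (current - age)
+ # and versuffix=.2.1.0, so the library is typically installed as
+ # libNAME.so.2.1.0 with libNAME.so.2 as its soname.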
+ case $current in + 0|[1-9]|[1-9][0-9]|[1-9][0-9][0-9]|[1-9][0-9][0-9][0-9]|[1-9][0-9][0-9][0-9][0-9]) ;; + *) + func_error "CURRENT '$current' must be a nonnegative integer" + func_fatal_error "'$vinfo' is not valid version information" + ;; + esac + + case $revision in + 0|[1-9]|[1-9][0-9]|[1-9][0-9][0-9]|[1-9][0-9][0-9][0-9]|[1-9][0-9][0-9][0-9][0-9]) ;; + *) + func_error "REVISION '$revision' must be a nonnegative integer" + func_fatal_error "'$vinfo' is not valid version information" + ;; + esac + + case $age in + 0|[1-9]|[1-9][0-9]|[1-9][0-9][0-9]|[1-9][0-9][0-9][0-9]|[1-9][0-9][0-9][0-9][0-9]) ;; + *) + func_error "AGE '$age' must be a nonnegative integer" + func_fatal_error "'$vinfo' is not valid version information" + ;; + esac + + if test "$age" -gt "$current"; then + func_error "AGE '$age' is greater than the current interface number '$current'" + func_fatal_error "'$vinfo' is not valid version information" + fi + + # Calculate the version variables. + major= + versuffix= + verstring= + case $version_type in + none) ;; + + darwin) + # Like Linux, but with the current version available in + # verstring for coding it into the library header + func_arith $current - $age + major=.$func_arith_result + versuffix=$major.$age.$revision + # Darwin ld doesn't like 0 for these options... + func_arith $current + 1 + minor_current=$func_arith_result + xlcverstring="$wl-compatibility_version $wl$minor_current $wl-current_version $wl$minor_current.$revision" + verstring="-compatibility_version $minor_current -current_version $minor_current.$revision" + # On Darwin other compilers + case $CC in + nagfor*) + verstring="$wl-compatibility_version $wl$minor_current $wl-current_version $wl$minor_current.$revision" + ;; + *) + verstring="-compatibility_version $minor_current -current_version $minor_current.$revision" + ;; + esac + ;; + + freebsd-aout) + major=.$current + versuffix=.$current.$revision + ;; + + freebsd-elf) + func_arith $current - $age + major=.$func_arith_result + versuffix=$major.$age.$revision + ;; + + irix | nonstopux) + if test no = "$lt_irix_increment"; then + func_arith $current - $age + else + func_arith $current - $age + 1 + fi + major=$func_arith_result + + case $version_type in + nonstopux) verstring_prefix=nonstopux ;; + *) verstring_prefix=sgi ;; + esac + verstring=$verstring_prefix$major.$revision + + # Add in all the interfaces that we are compatible with. + loop=$revision + while test 0 -ne "$loop"; do + func_arith $revision - $loop + iface=$func_arith_result + func_arith $loop - 1 + loop=$func_arith_result + verstring=$verstring_prefix$major.$iface:$verstring + done + + # Before this point, $major must not contain '.'. + major=.$major + versuffix=$major.$revision + ;; + + linux) # correct to gnu/linux during the next big refactor + func_arith $current - $age + major=.$func_arith_result + versuffix=$major.$age.$revision + ;; + + osf) + func_arith $current - $age + major=.$func_arith_result + versuffix=.$current.$age.$revision + verstring=$current.$age.$revision + + # Add in all the interfaces that we are compatible with. + loop=$age + while test 0 -ne "$loop"; do + func_arith $current - $loop + iface=$func_arith_result + func_arith $loop - 1 + loop=$func_arith_result + verstring=$verstring:$iface.0 + done + + # Make executables depend on our current version. 
+ func_append verstring ":$current.0" + ;; + + qnx) + major=.$current + versuffix=.$current + ;; + + sco) + major=.$current + versuffix=.$current + ;; + + sunos) + major=.$current + versuffix=.$current.$revision + ;; + + windows) + # Use '-' rather than '.', since we only want one + # extension on DOS 8.3 file systems. + func_arith $current - $age + major=$func_arith_result + versuffix=-$major + ;; + + *) + func_fatal_configuration "unknown library version type '$version_type'" + ;; + esac + + # Clear the version info if we defaulted, and they specified a release. + if test -z "$vinfo" && test -n "$release"; then + major= + case $version_type in + darwin) + # we can't check for "0.0" in archive_cmds due to quoting + # problems, so we reset it completely + verstring= + ;; + *) + verstring=0.0 + ;; + esac + if test no = "$need_version"; then + versuffix= + else + versuffix=.0.0 + fi + fi + + # Remove version info from name if versioning should be avoided + if test yes,no = "$avoid_version,$need_version"; then + major= + versuffix= + verstring= + fi + + # Check to see if the archive will have undefined symbols. + if test yes = "$allow_undefined"; then + if test unsupported = "$allow_undefined_flag"; then + if test yes = "$build_old_libs"; then + func_warning "undefined symbols not allowed in $host shared libraries; building static only" + build_libtool_libs=no + else + func_fatal_error "can't build $host shared library unless -no-undefined is specified" + fi + fi + else + # Don't allow undefined symbols. + allow_undefined_flag=$no_undefined_flag + fi + + fi + + func_generate_dlsyms "$libname" "$libname" : + func_append libobjs " $symfileobj" + test " " = "$libobjs" && libobjs= + + if test relink != "$opt_mode"; then + # Remove our outputs, but don't remove object files since they + # may have been created when compiling PIC objects. + removelist= + tempremovelist=`$ECHO "$output_objdir/*"` + for p in $tempremovelist; do + case $p in + *.$objext | *.gcno) + ;; + $output_objdir/$outputname | $output_objdir/$libname.* | $output_objdir/$libname$release.*) + if test -n "$precious_files_regex"; then + if $ECHO "$p" | $EGREP -e "$precious_files_regex" >/dev/null 2>&1 + then + continue + fi + fi + func_append removelist " $p" + ;; + *) ;; + esac + done + test -n "$removelist" && \ + func_show_eval "${RM}r \$removelist" + fi + + # Now set the variables for building old libraries. + if test yes = "$build_old_libs" && test convenience != "$build_libtool_libs"; then + func_append oldlibs " $output_objdir/$libname.$libext" + + # Transform .lo files to .o files. + oldobjs="$objs "`$ECHO "$libobjs" | $SP2NL | $SED "/\.$libext$/d; $lo2o" | $NL2SP` + fi + + # Eliminate all temporary directories. + #for path in $notinst_path; do + # lib_search_path=`$ECHO "$lib_search_path " | $SED "s% $path % %g"` + # deplibs=`$ECHO "$deplibs " | $SED "s% -L$path % %g"` + # dependency_libs=`$ECHO "$dependency_libs " | $SED "s% -L$path % %g"` + #done + + if test -n "$xrpath"; then + # If the user specified any rpath flags, then add them. 
+ temp_xrpath= + for libdir in $xrpath; do + func_replace_sysroot "$libdir" + func_append temp_xrpath " -R$func_replace_sysroot_result" + case "$finalize_rpath " in + *" $libdir "*) ;; + *) func_append finalize_rpath " $libdir" ;; + esac + done + if test yes != "$hardcode_into_libs" || test yes = "$build_old_libs"; then + dependency_libs="$temp_xrpath $dependency_libs" + fi + fi + + # Make sure dlfiles contains only unique files that won't be dlpreopened + old_dlfiles=$dlfiles + dlfiles= + for lib in $old_dlfiles; do + case " $dlprefiles $dlfiles " in + *" $lib "*) ;; + *) func_append dlfiles " $lib" ;; + esac + done + + # Make sure dlprefiles contains only unique files + old_dlprefiles=$dlprefiles + dlprefiles= + for lib in $old_dlprefiles; do + case "$dlprefiles " in + *" $lib "*) ;; + *) func_append dlprefiles " $lib" ;; + esac + done + + if test yes = "$build_libtool_libs"; then + if test -n "$rpath"; then + case $host in + *-*-cygwin* | *-*-mingw* | *-*-pw32* | *-*-os2* | *-*-beos* | *-cegcc* | *-*-haiku*) + # these systems don't actually have a c library (as such)! + ;; + *-*-rhapsody* | *-*-darwin1.[012]) + # Rhapsody C library is in the System framework + func_append deplibs " System.ltframework" + ;; + *-*-netbsd*) + # Don't link with libc until the a.out ld.so is fixed. + ;; + *-*-openbsd* | *-*-freebsd* | *-*-dragonfly*) + # Do not include libc due to us having libc/libc_r. + ;; + *-*-sco3.2v5* | *-*-sco5v6*) + # Causes problems with __ctype + ;; + *-*-sysv4.2uw2* | *-*-sysv5* | *-*-unixware* | *-*-OpenUNIX*) + # Compiler inserts libc in the correct place for threads to work + ;; + *) + # Add libc to deplibs on all other systems if necessary. + if test yes = "$build_libtool_need_lc"; then + func_append deplibs " -lc" + fi + ;; + esac + fi + + # Transform deplibs into only deplibs that can be linked in shared. + name_save=$name + libname_save=$libname + release_save=$release + versuffix_save=$versuffix + major_save=$major + # I'm not sure if I'm treating the release correctly. I think + # release should show up in the -l (ie -lgmp5) so we don't want to + # add it in twice. Is that correct? + release= + versuffix= + major= + newdeplibs= + droppeddeps=no + case $deplibs_check_method in + pass_all) + # Don't check for shared/static. Everything works. + # This might be a little naive. We might want to check + # whether the library exists or not. But this is on + # osf3 & osf4 and I'm not really sure... Just + # implementing what was already the behavior. + newdeplibs=$deplibs + ;; + test_compile) + # This code stresses the "libraries are programs" paradigm to its + # limits. Maybe even breaks it. We compile a program, linking it + # against the deplibs as a proxy for the library. Then we can check + # whether they linked in statically or dynamically with ldd. + $opt_dry_run || $RM conftest.c + cat > conftest.c </dev/null` + $nocaseglob + else + potential_libs=`ls $i/$libnameglob[.-]* 2>/dev/null` + fi + for potent_lib in $potential_libs; do + # Follow soft links. + if ls -lLd "$potent_lib" 2>/dev/null | + $GREP " -> " >/dev/null; then + continue + fi + # The statement above tries to avoid entering an + # endless loop below, in case of cyclic links. + # We might still enter an endless loop, since a link + # loop can be closed while we follow links, + # but so what? 
+ potlib=$potent_lib + while test -h "$potlib" 2>/dev/null; do + potliblink=`ls -ld $potlib | $SED 's/.* -> //'` + case $potliblink in + [\\/]* | [A-Za-z]:[\\/]*) potlib=$potliblink;; + *) potlib=`$ECHO "$potlib" | $SED 's|[^/]*$||'`"$potliblink";; + esac + done + if eval $file_magic_cmd \"\$potlib\" 2>/dev/null | + $SED -e 10q | + $EGREP "$file_magic_regex" > /dev/null; then + func_append newdeplibs " $a_deplib" + a_deplib= + break 2 + fi + done + done + fi + if test -n "$a_deplib"; then + droppeddeps=yes + echo + $ECHO "*** Warning: linker path does not have real file for library $a_deplib." + echo "*** I have the capability to make that library automatically link in when" + echo "*** you link to this library. But I can only do this if you have a" + echo "*** shared version of the library, which you do not appear to have" + echo "*** because I did check the linker path looking for a file starting" + if test -z "$potlib"; then + $ECHO "*** with $libname but no candidates were found. (...for file magic test)" + else + $ECHO "*** with $libname and none of the candidates passed a file format test" + $ECHO "*** using a file magic. Last file checked: $potlib" + fi + fi + ;; + *) + # Add a -L argument. + func_append newdeplibs " $a_deplib" + ;; + esac + done # Gone through all deplibs. + ;; + match_pattern*) + set dummy $deplibs_check_method; shift + match_pattern_regex=`expr "$deplibs_check_method" : "$1 \(.*\)"` + for a_deplib in $deplibs; do + case $a_deplib in + -l*) + func_stripname -l '' "$a_deplib" + name=$func_stripname_result + if test yes = "$allow_libtool_libs_with_static_runtimes"; then + case " $predeps $postdeps " in + *" $a_deplib "*) + func_append newdeplibs " $a_deplib" + a_deplib= + ;; + esac + fi + if test -n "$a_deplib"; then + libname=`eval "\\$ECHO \"$libname_spec\""` + for i in $lib_search_path $sys_lib_search_path $shlib_search_path; do + potential_libs=`ls $i/$libname[.-]* 2>/dev/null` + for potent_lib in $potential_libs; do + potlib=$potent_lib # see symlink-check above in file_magic test + if eval "\$ECHO \"$potent_lib\"" 2>/dev/null | $SED 10q | \ + $EGREP "$match_pattern_regex" > /dev/null; then + func_append newdeplibs " $a_deplib" + a_deplib= + break 2 + fi + done + done + fi + if test -n "$a_deplib"; then + droppeddeps=yes + echo + $ECHO "*** Warning: linker path does not have real file for library $a_deplib." + echo "*** I have the capability to make that library automatically link in when" + echo "*** you link to this library. But I can only do this if you have a" + echo "*** shared version of the library, which you do not appear to have" + echo "*** because I did check the linker path looking for a file starting" + if test -z "$potlib"; then + $ECHO "*** with $libname but no candidates were found. (...for regex pattern test)" + else + $ECHO "*** with $libname and none of the candidates passed a file format test" + $ECHO "*** using a regex pattern. Last file checked: $potlib" + fi + fi + ;; + *) + # Add a -L argument. + func_append newdeplibs " $a_deplib" + ;; + esac + done # Gone through all deplibs. 
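+ # Note (illustrative; not part of the upstream ltmain.sh): with a
+ # deplibs_check_method of the form 'match_pattern REGEX', a '-lfoo' dependency
+ # is kept in $newdeplibs only when some candidate libfoo.* file on the search
+ # path has a name matching REGEX; otherwise it is dropped and the warning
+ # above is printed.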
+ ;; + none | unknown | *) + newdeplibs= + tmp_deplibs=`$ECHO " $deplibs" | $SED 's/ -lc$//; s/ -[LR][^ ]*//g'` + if test yes = "$allow_libtool_libs_with_static_runtimes"; then + for i in $predeps $postdeps; do + # can't use Xsed below, because $i might contain '/' + tmp_deplibs=`$ECHO " $tmp_deplibs" | $SED "s|$i||"` + done + fi + case $tmp_deplibs in + *[!\ \ ]*) + echo + if test none = "$deplibs_check_method"; then + echo "*** Warning: inter-library dependencies are not supported in this platform." + else + echo "*** Warning: inter-library dependencies are not known to be supported." + fi + echo "*** All declared inter-library dependencies are being dropped." + droppeddeps=yes + ;; + esac + ;; + esac + versuffix=$versuffix_save + major=$major_save + release=$release_save + libname=$libname_save + name=$name_save + + case $host in + *-*-rhapsody* | *-*-darwin1.[012]) + # On Rhapsody replace the C library with the System framework + newdeplibs=`$ECHO " $newdeplibs" | $SED 's/ -lc / System.ltframework /'` + ;; + esac + + if test yes = "$droppeddeps"; then + if test yes = "$module"; then + echo + echo "*** Warning: libtool could not satisfy all declared inter-library" + $ECHO "*** dependencies of module $libname. Therefore, libtool will create" + echo "*** a static module, that should work as long as the dlopening" + echo "*** application is linked with the -dlopen flag." + if test -z "$global_symbol_pipe"; then + echo + echo "*** However, this would only work if libtool was able to extract symbol" + echo "*** lists from a program, using 'nm' or equivalent, but libtool could" + echo "*** not find such a program. So, this module is probably useless." + echo "*** 'nm' from GNU binutils and a full rebuild may help." + fi + if test no = "$build_old_libs"; then + oldlibs=$output_objdir/$libname.$libext + build_libtool_libs=module + build_old_libs=yes + else + build_libtool_libs=no + fi + else + echo "*** The inter-library dependencies that have been dropped here will be" + echo "*** automatically added whenever a program is linked with this library" + echo "*** or is declared to -dlopen it." + + if test no = "$allow_undefined"; then + echo + echo "*** Since this library must not contain undefined symbols," + echo "*** because either the platform does not support them or" + echo "*** it was explicitly requested with -no-undefined," + echo "*** libtool will only create a static version of it." + if test no = "$build_old_libs"; then + oldlibs=$output_objdir/$libname.$libext + build_libtool_libs=module + build_old_libs=yes + else + build_libtool_libs=no + fi + fi + fi + fi + # Done checking deplibs! 
+ deplibs=$newdeplibs + fi + # Time to change all our "foo.ltframework" stuff back to "-framework foo" + case $host in + *-*-darwin*) + newdeplibs=`$ECHO " $newdeplibs" | $SED 's% \([^ $]*\).ltframework% -framework \1%g'` + new_inherited_linker_flags=`$ECHO " $new_inherited_linker_flags" | $SED 's% \([^ $]*\).ltframework% -framework \1%g'` + deplibs=`$ECHO " $deplibs" | $SED 's% \([^ $]*\).ltframework% -framework \1%g'` + ;; + esac + + # move library search paths that coincide with paths to not yet + # installed libraries to the beginning of the library search list + new_libs= + for path in $notinst_path; do + case " $new_libs " in + *" -L$path/$objdir "*) ;; + *) + case " $deplibs " in + *" -L$path/$objdir "*) + func_append new_libs " -L$path/$objdir" ;; + esac + ;; + esac + done + for deplib in $deplibs; do + case $deplib in + -L*) + case " $new_libs " in + *" $deplib "*) ;; + *) func_append new_libs " $deplib" ;; + esac + ;; + *) func_append new_libs " $deplib" ;; + esac + done + deplibs=$new_libs + + # All the library-specific variables (install_libdir is set above). + library_names= + old_library= + dlname= + + # Test again, we may have decided not to build it any more + if test yes = "$build_libtool_libs"; then + # Remove $wl instances when linking with ld. + # FIXME: should test the right _cmds variable. + case $archive_cmds in + *\$LD\ *) wl= ;; + esac + if test yes = "$hardcode_into_libs"; then + # Hardcode the library paths + hardcode_libdirs= + dep_rpath= + rpath=$finalize_rpath + test relink = "$opt_mode" || rpath=$compile_rpath$rpath + for libdir in $rpath; do + if test -n "$hardcode_libdir_flag_spec"; then + if test -n "$hardcode_libdir_separator"; then + func_replace_sysroot "$libdir" + libdir=$func_replace_sysroot_result + if test -z "$hardcode_libdirs"; then + hardcode_libdirs=$libdir + else + # Just accumulate the unique libdirs. + case $hardcode_libdir_separator$hardcode_libdirs$hardcode_libdir_separator in + *"$hardcode_libdir_separator$libdir$hardcode_libdir_separator"*) + ;; + *) + func_append hardcode_libdirs "$hardcode_libdir_separator$libdir" + ;; + esac + fi + else + eval flag=\"$hardcode_libdir_flag_spec\" + func_append dep_rpath " $flag" + fi + elif test -n "$runpath_var"; then + case "$perm_rpath " in + *" $libdir "*) ;; + *) func_append perm_rpath " $libdir" ;; + esac + fi + done + # Substitute the hardcoded libdirs into the rpath. + if test -n "$hardcode_libdir_separator" && + test -n "$hardcode_libdirs"; then + libdir=$hardcode_libdirs + eval "dep_rpath=\"$hardcode_libdir_flag_spec\"" + fi + if test -n "$runpath_var" && test -n "$perm_rpath"; then + # We should set the runpath_var. + rpath= + for dir in $perm_rpath; do + func_append rpath "$dir:" + done + eval "$runpath_var='$rpath\$$runpath_var'; export $runpath_var" + fi + test -n "$dep_rpath" && deplibs="$dep_rpath $deplibs" + fi + + shlibpath=$finalize_shlibpath + test relink = "$opt_mode" || shlibpath=$compile_shlibpath$shlibpath + if test -n "$shlibpath"; then + eval "$shlibpath_var='$shlibpath\$$shlibpath_var'; export $shlibpath_var" + fi + + # Get the real and link names of the library. 
+ eval shared_ext=\"$shrext_cmds\" + eval library_names=\"$library_names_spec\" + set dummy $library_names + shift + realname=$1 + shift + + if test -n "$soname_spec"; then + eval soname=\"$soname_spec\" + else + soname=$realname + fi + if test -z "$dlname"; then + dlname=$soname + fi + + lib=$output_objdir/$realname + linknames= + for link + do + func_append linknames " $link" + done + + # Use standard objects if they are pic + test -z "$pic_flag" && libobjs=`$ECHO "$libobjs" | $SP2NL | $SED "$lo2o" | $NL2SP` + test "X$libobjs" = "X " && libobjs= + + delfiles= + if test -n "$export_symbols" && test -n "$include_expsyms"; then + $opt_dry_run || cp "$export_symbols" "$output_objdir/$libname.uexp" + export_symbols=$output_objdir/$libname.uexp + func_append delfiles " $export_symbols" + fi + + orig_export_symbols= + case $host_os in + cygwin* | mingw* | cegcc*) + if test -n "$export_symbols" && test -z "$export_symbols_regex"; then + # exporting using user supplied symfile + func_dll_def_p "$export_symbols" || { + # and it's NOT already a .def file. Must figure out + # which of the given symbols are data symbols and tag + # them as such. So, trigger use of export_symbols_cmds. + # export_symbols gets reassigned inside the "prepare + # the list of exported symbols" if statement, so the + # include_expsyms logic still works. + orig_export_symbols=$export_symbols + export_symbols= + always_export_symbols=yes + } + fi + ;; + esac + + # Prepare the list of exported symbols + if test -z "$export_symbols"; then + if test yes = "$always_export_symbols" || test -n "$export_symbols_regex"; then + func_verbose "generating symbol list for '$libname.la'" + export_symbols=$output_objdir/$libname.exp + $opt_dry_run || $RM $export_symbols + cmds=$export_symbols_cmds + save_ifs=$IFS; IFS='~' + for cmd1 in $cmds; do + IFS=$save_ifs + # Take the normal branch if the nm_file_list_spec branch + # doesn't work or if tool conversion is not needed. + case $nm_file_list_spec~$to_tool_file_cmd in + *~func_convert_file_noop | *~func_convert_file_msys_to_w32 | ~*) + try_normal_branch=yes + eval cmd=\"$cmd1\" + func_len " $cmd" + len=$func_len_result + ;; + *) + try_normal_branch=no + ;; + esac + if test yes = "$try_normal_branch" \ + && { test "$len" -lt "$max_cmd_len" \ + || test "$max_cmd_len" -le -1; } + then + func_show_eval "$cmd" 'exit $?' + skipped_export=false + elif test -n "$nm_file_list_spec"; then + func_basename "$output" + output_la=$func_basename_result + save_libobjs=$libobjs + save_output=$output + output=$output_objdir/$output_la.nm + func_to_tool_file "$output" + libobjs=$nm_file_list_spec$func_to_tool_file_result + func_append delfiles " $output" + func_verbose "creating $NM input file list: $output" + for obj in $save_libobjs; do + func_to_tool_file "$obj" + $ECHO "$func_to_tool_file_result" + done > "$output" + eval cmd=\"$cmd1\" + func_show_eval "$cmd" 'exit $?' + output=$save_output + libobjs=$save_libobjs + skipped_export=false + else + # The command line is too long to execute in one step. + func_verbose "using reloadable object file for export list..." + skipped_export=: + # Break out early, otherwise skipped_export may be + # set to false by a later but shorter cmd. 
+ break + fi + done + IFS=$save_ifs + if test -n "$export_symbols_regex" && test : != "$skipped_export"; then + func_show_eval '$EGREP -e "$export_symbols_regex" "$export_symbols" > "${export_symbols}T"' + func_show_eval '$MV "${export_symbols}T" "$export_symbols"' + fi + fi + fi + + if test -n "$export_symbols" && test -n "$include_expsyms"; then + tmp_export_symbols=$export_symbols + test -n "$orig_export_symbols" && tmp_export_symbols=$orig_export_symbols + $opt_dry_run || eval '$ECHO "$include_expsyms" | $SP2NL >> "$tmp_export_symbols"' + fi + + if test : != "$skipped_export" && test -n "$orig_export_symbols"; then + # The given exports_symbols file has to be filtered, so filter it. + func_verbose "filter symbol list for '$libname.la' to tag DATA exports" + # FIXME: $output_objdir/$libname.filter potentially contains lots of + # 's' commands, which not all seds can handle. GNU sed should be fine + # though. Also, the filter scales superlinearly with the number of + # global variables. join(1) would be nice here, but unfortunately + # isn't a blessed tool. + $opt_dry_run || $SED -e '/[ ,]DATA/!d;s,\(.*\)\([ \,].*\),s|^\1$|\1\2|,' < $export_symbols > $output_objdir/$libname.filter + func_append delfiles " $export_symbols $output_objdir/$libname.filter" + export_symbols=$output_objdir/$libname.def + $opt_dry_run || $SED -f $output_objdir/$libname.filter < $orig_export_symbols > $export_symbols + fi + + tmp_deplibs= + for test_deplib in $deplibs; do + case " $convenience " in + *" $test_deplib "*) ;; + *) + func_append tmp_deplibs " $test_deplib" + ;; + esac + done + deplibs=$tmp_deplibs + + if test -n "$convenience"; then + if test -n "$whole_archive_flag_spec" && + test yes = "$compiler_needs_object" && + test -z "$libobjs"; then + # extract the archives, so we have objects to list. + # TODO: could optimize this to just extract one archive. + whole_archive_flag_spec= + fi + if test -n "$whole_archive_flag_spec"; then + save_libobjs=$libobjs + eval libobjs=\"\$libobjs $whole_archive_flag_spec\" + test "X$libobjs" = "X " && libobjs= + else + gentop=$output_objdir/${outputname}x + func_append generated " $gentop" + + func_extract_archives $gentop $convenience + func_append libobjs " $func_extract_archives_result" + test "X$libobjs" = "X " && libobjs= + fi + fi + + if test yes = "$thread_safe" && test -n "$thread_safe_flag_spec"; then + eval flag=\"$thread_safe_flag_spec\" + func_append linker_flags " $flag" + fi + + # Make a backup of the uninstalled library when relinking + if test relink = "$opt_mode"; then + $opt_dry_run || eval '(cd $output_objdir && $RM ${realname}U && $MV $realname ${realname}U)' || exit $? + fi + + # Do each of the archive commands. + if test yes = "$module" && test -n "$module_cmds"; then + if test -n "$export_symbols" && test -n "$module_expsym_cmds"; then + eval test_cmds=\"$module_expsym_cmds\" + cmds=$module_expsym_cmds + else + eval test_cmds=\"$module_cmds\" + cmds=$module_cmds + fi + else + if test -n "$export_symbols" && test -n "$archive_expsym_cmds"; then + eval test_cmds=\"$archive_expsym_cmds\" + cmds=$archive_expsym_cmds + else + eval test_cmds=\"$archive_cmds\" + cmds=$archive_cmds + fi + fi + + if test : != "$skipped_export" && + func_len " $test_cmds" && + len=$func_len_result && + test "$len" -lt "$max_cmd_len" || test "$max_cmd_len" -le -1; then + : + else + # The command line is too long to link in one step, link piecewise + # or, if using GNU ld and skipped_export is not :, use a linker + # script. 
+ + # Save the value of $output and $libobjs because we want to + # use them later. If we have whole_archive_flag_spec, we + # want to use save_libobjs as it was before + # whole_archive_flag_spec was expanded, because we can't + # assume the linker understands whole_archive_flag_spec. + # This may have to be revisited, in case too many + # convenience libraries get linked in and end up exceeding + # the spec. + if test -z "$convenience" || test -z "$whole_archive_flag_spec"; then + save_libobjs=$libobjs + fi + save_output=$output + func_basename "$output" + output_la=$func_basename_result + + # Clear the reloadable object creation command queue and + # initialize k to one. + test_cmds= + concat_cmds= + objlist= + last_robj= + k=1 + + if test -n "$save_libobjs" && test : != "$skipped_export" && test yes = "$with_gnu_ld"; then + output=$output_objdir/$output_la.lnkscript + func_verbose "creating GNU ld script: $output" + echo 'INPUT (' > $output + for obj in $save_libobjs + do + func_to_tool_file "$obj" + $ECHO "$func_to_tool_file_result" >> $output + done + echo ')' >> $output + func_append delfiles " $output" + func_to_tool_file "$output" + output=$func_to_tool_file_result + elif test -n "$save_libobjs" && test : != "$skipped_export" && test -n "$file_list_spec"; then + output=$output_objdir/$output_la.lnk + func_verbose "creating linker input file list: $output" + : > $output + set x $save_libobjs + shift + firstobj= + if test yes = "$compiler_needs_object"; then + firstobj="$1 " + shift + fi + for obj + do + func_to_tool_file "$obj" + $ECHO "$func_to_tool_file_result" >> $output + done + func_append delfiles " $output" + func_to_tool_file "$output" + output=$firstobj\"$file_list_spec$func_to_tool_file_result\" + else + if test -n "$save_libobjs"; then + func_verbose "creating reloadable object files..." + output=$output_objdir/$output_la-$k.$objext + eval test_cmds=\"$reload_cmds\" + func_len " $test_cmds" + len0=$func_len_result + len=$len0 + + # Loop over the list of objects to be linked. + for obj in $save_libobjs + do + func_len " $obj" + func_arith $len + $func_len_result + len=$func_arith_result + if test -z "$objlist" || + test "$len" -lt "$max_cmd_len"; then + func_append objlist " $obj" + else + # The command $test_cmds is almost too long, add a + # command to the queue. + if test 1 -eq "$k"; then + # The first file doesn't have a previous command to add. + reload_objs=$objlist + eval concat_cmds=\"$reload_cmds\" + else + # All subsequent reloadable object files will link in + # the last one created. + reload_objs="$objlist $last_robj" + eval concat_cmds=\"\$concat_cmds~$reload_cmds~\$RM $last_robj\" + fi + last_robj=$output_objdir/$output_la-$k.$objext + func_arith $k + 1 + k=$func_arith_result + output=$output_objdir/$output_la-$k.$objext + objlist=" $obj" + func_len " $last_robj" + func_arith $len0 + $func_len_result + len=$func_arith_result + fi + done + # Handle the remaining objects by creating one last + # reloadable object file. All subsequent reloadable object + # files will link in the last one created. 
+ test -z "$concat_cmds" || concat_cmds=$concat_cmds~ + reload_objs="$objlist $last_robj" + eval concat_cmds=\"\$concat_cmds$reload_cmds\" + if test -n "$last_robj"; then + eval concat_cmds=\"\$concat_cmds~\$RM $last_robj\" + fi + func_append delfiles " $output" + + else + output= + fi + + ${skipped_export-false} && { + func_verbose "generating symbol list for '$libname.la'" + export_symbols=$output_objdir/$libname.exp + $opt_dry_run || $RM $export_symbols + libobjs=$output + # Append the command to create the export file. + test -z "$concat_cmds" || concat_cmds=$concat_cmds~ + eval concat_cmds=\"\$concat_cmds$export_symbols_cmds\" + if test -n "$last_robj"; then + eval concat_cmds=\"\$concat_cmds~\$RM $last_robj\" + fi + } + + test -n "$save_libobjs" && + func_verbose "creating a temporary reloadable object file: $output" + + # Loop through the commands generated above and execute them. + save_ifs=$IFS; IFS='~' + for cmd in $concat_cmds; do + IFS=$save_ifs + $opt_quiet || { + func_quote_for_expand "$cmd" + eval "func_echo $func_quote_for_expand_result" + } + $opt_dry_run || eval "$cmd" || { + lt_exit=$? + + # Restore the uninstalled library and exit + if test relink = "$opt_mode"; then + ( cd "$output_objdir" && \ + $RM "${realname}T" && \ + $MV "${realname}U" "$realname" ) + fi + + exit $lt_exit + } + done + IFS=$save_ifs + + if test -n "$export_symbols_regex" && ${skipped_export-false}; then + func_show_eval '$EGREP -e "$export_symbols_regex" "$export_symbols" > "${export_symbols}T"' + func_show_eval '$MV "${export_symbols}T" "$export_symbols"' + fi + fi + + ${skipped_export-false} && { + if test -n "$export_symbols" && test -n "$include_expsyms"; then + tmp_export_symbols=$export_symbols + test -n "$orig_export_symbols" && tmp_export_symbols=$orig_export_symbols + $opt_dry_run || eval '$ECHO "$include_expsyms" | $SP2NL >> "$tmp_export_symbols"' + fi + + if test -n "$orig_export_symbols"; then + # The given exports_symbols file has to be filtered, so filter it. + func_verbose "filter symbol list for '$libname.la' to tag DATA exports" + # FIXME: $output_objdir/$libname.filter potentially contains lots of + # 's' commands, which not all seds can handle. GNU sed should be fine + # though. Also, the filter scales superlinearly with the number of + # global variables. join(1) would be nice here, but unfortunately + # isn't a blessed tool. + $opt_dry_run || $SED -e '/[ ,]DATA/!d;s,\(.*\)\([ \,].*\),s|^\1$|\1\2|,' < $export_symbols > $output_objdir/$libname.filter + func_append delfiles " $export_symbols $output_objdir/$libname.filter" + export_symbols=$output_objdir/$libname.def + $opt_dry_run || $SED -f $output_objdir/$libname.filter < $orig_export_symbols > $export_symbols + fi + } + + libobjs=$output + # Restore the value of output. + output=$save_output + + if test -n "$convenience" && test -n "$whole_archive_flag_spec"; then + eval libobjs=\"\$libobjs $whole_archive_flag_spec\" + test "X$libobjs" = "X " && libobjs= + fi + # Expand the library linking commands again to reset the + # value of $libobjs for piecewise linking. + + # Do each of the archive commands. + if test yes = "$module" && test -n "$module_cmds"; then + if test -n "$export_symbols" && test -n "$module_expsym_cmds"; then + cmds=$module_expsym_cmds + else + cmds=$module_cmds + fi + else + if test -n "$export_symbols" && test -n "$archive_expsym_cmds"; then + cmds=$archive_expsym_cmds + else + cmds=$archive_cmds + fi + fi + fi + + if test -n "$delfiles"; then + # Append the command to remove temporary files to $cmds. 
+ eval cmds=\"\$cmds~\$RM $delfiles\" + fi + + # Add any objects from preloaded convenience libraries + if test -n "$dlprefiles"; then + gentop=$output_objdir/${outputname}x + func_append generated " $gentop" + + func_extract_archives $gentop $dlprefiles + func_append libobjs " $func_extract_archives_result" + test "X$libobjs" = "X " && libobjs= + fi + + save_ifs=$IFS; IFS='~' + for cmd in $cmds; do + IFS=$sp$nl + eval cmd=\"$cmd\" + IFS=$save_ifs + $opt_quiet || { + func_quote_for_expand "$cmd" + eval "func_echo $func_quote_for_expand_result" + } + $opt_dry_run || eval "$cmd" || { + lt_exit=$? + + # Restore the uninstalled library and exit + if test relink = "$opt_mode"; then + ( cd "$output_objdir" && \ + $RM "${realname}T" && \ + $MV "${realname}U" "$realname" ) + fi + + exit $lt_exit + } + done + IFS=$save_ifs + + # Restore the uninstalled library and exit + if test relink = "$opt_mode"; then + $opt_dry_run || eval '(cd $output_objdir && $RM ${realname}T && $MV $realname ${realname}T && $MV ${realname}U $realname)' || exit $? + + if test -n "$convenience"; then + if test -z "$whole_archive_flag_spec"; then + func_show_eval '${RM}r "$gentop"' + fi + fi + + exit $EXIT_SUCCESS + fi + + # Create links to the real library. + for linkname in $linknames; do + if test "$realname" != "$linkname"; then + func_show_eval '(cd "$output_objdir" && $RM "$linkname" && $LN_S "$realname" "$linkname")' 'exit $?' + fi + done + + # If -module or -export-dynamic was specified, set the dlname. + if test yes = "$module" || test yes = "$export_dynamic"; then + # On all known operating systems, these are identical. + dlname=$soname + fi + fi + ;; + + obj) + if test -n "$dlfiles$dlprefiles" || test no != "$dlself"; then + func_warning "'-dlopen' is ignored for objects" + fi + + case " $deplibs" in + *\ -l* | *\ -L*) + func_warning "'-l' and '-L' are ignored for objects" ;; + esac + + test -n "$rpath" && \ + func_warning "'-rpath' is ignored for objects" + + test -n "$xrpath" && \ + func_warning "'-R' is ignored for objects" + + test -n "$vinfo" && \ + func_warning "'-version-info' is ignored for objects" + + test -n "$release" && \ + func_warning "'-release' is ignored for objects" + + case $output in + *.lo) + test -n "$objs$old_deplibs" && \ + func_fatal_error "cannot build library object '$output' from non-libtool objects" + + libobj=$output + func_lo2o "$libobj" + obj=$func_lo2o_result + ;; + *) + libobj= + obj=$output + ;; + esac + + # Delete the old objects. + $opt_dry_run || $RM $obj $libobj + + # Objects from convenience libraries. This assumes + # single-version convenience libraries. Whenever we create + # different ones for PIC/non-PIC, this we'll have to duplicate + # the extraction. + reload_conv_objs= + gentop= + # if reload_cmds runs $LD directly, get rid of -Wl from + # whole_archive_flag_spec and hope we can get by with turning comma + # into space. 
+ case $reload_cmds in + *\$LD[\ \$]*) wl= ;; + esac + if test -n "$convenience"; then + if test -n "$whole_archive_flag_spec"; then + eval tmp_whole_archive_flags=\"$whole_archive_flag_spec\" + test -n "$wl" || tmp_whole_archive_flags=`$ECHO "$tmp_whole_archive_flags" | $SED 's|,| |g'` + reload_conv_objs=$reload_objs\ $tmp_whole_archive_flags + else + gentop=$output_objdir/${obj}x + func_append generated " $gentop" + + func_extract_archives $gentop $convenience + reload_conv_objs="$reload_objs $func_extract_archives_result" + fi + fi + + # If we're not building shared, we need to use non_pic_objs + test yes = "$build_libtool_libs" || libobjs=$non_pic_objects + + # Create the old-style object. + reload_objs=$objs$old_deplibs' '`$ECHO "$libobjs" | $SP2NL | $SED "/\.$libext$/d; /\.lib$/d; $lo2o" | $NL2SP`' '$reload_conv_objs + + output=$obj + func_execute_cmds "$reload_cmds" 'exit $?' + + # Exit if we aren't doing a library object file. + if test -z "$libobj"; then + if test -n "$gentop"; then + func_show_eval '${RM}r "$gentop"' + fi + + exit $EXIT_SUCCESS + fi + + test yes = "$build_libtool_libs" || { + if test -n "$gentop"; then + func_show_eval '${RM}r "$gentop"' + fi + + # Create an invalid libtool object if no PIC, so that we don't + # accidentally link it into a program. + # $show "echo timestamp > $libobj" + # $opt_dry_run || eval "echo timestamp > $libobj" || exit $? + exit $EXIT_SUCCESS + } + + if test -n "$pic_flag" || test default != "$pic_mode"; then + # Only do commands if we really have different PIC objects. + reload_objs="$libobjs $reload_conv_objs" + output=$libobj + func_execute_cmds "$reload_cmds" 'exit $?' + fi + + if test -n "$gentop"; then + func_show_eval '${RM}r "$gentop"' + fi + + exit $EXIT_SUCCESS + ;; + + prog) + case $host in + *cygwin*) func_stripname '' '.exe' "$output" + output=$func_stripname_result.exe;; + esac + test -n "$vinfo" && \ + func_warning "'-version-info' is ignored for programs" + + test -n "$release" && \ + func_warning "'-release' is ignored for programs" + + $preload \ + && test unknown,unknown,unknown = "$dlopen_support,$dlopen_self,$dlopen_self_static" \ + && func_warning "'LT_INIT([dlopen])' not used. Assuming no dlopen support." + + case $host in + *-*-rhapsody* | *-*-darwin1.[012]) + # On Rhapsody replace the C library is the System framework + compile_deplibs=`$ECHO " $compile_deplibs" | $SED 's/ -lc / System.ltframework /'` + finalize_deplibs=`$ECHO " $finalize_deplibs" | $SED 's/ -lc / System.ltframework /'` + ;; + esac + + case $host in + *-*-darwin*) + # Don't allow lazy linking, it breaks C++ global constructors + # But is supposedly fixed on 10.4 or later (yay!). 
+ if test CXX = "$tagname"; then + case ${MACOSX_DEPLOYMENT_TARGET-10.0} in + 10.[0123]) + func_append compile_command " $wl-bind_at_load" + func_append finalize_command " $wl-bind_at_load" + ;; + esac + fi + # Time to change all our "foo.ltframework" stuff back to "-framework foo" + compile_deplibs=`$ECHO " $compile_deplibs" | $SED 's% \([^ $]*\).ltframework% -framework \1%g'` + finalize_deplibs=`$ECHO " $finalize_deplibs" | $SED 's% \([^ $]*\).ltframework% -framework \1%g'` + ;; + esac + + + # move library search paths that coincide with paths to not yet + # installed libraries to the beginning of the library search list + new_libs= + for path in $notinst_path; do + case " $new_libs " in + *" -L$path/$objdir "*) ;; + *) + case " $compile_deplibs " in + *" -L$path/$objdir "*) + func_append new_libs " -L$path/$objdir" ;; + esac + ;; + esac + done + for deplib in $compile_deplibs; do + case $deplib in + -L*) + case " $new_libs " in + *" $deplib "*) ;; + *) func_append new_libs " $deplib" ;; + esac + ;; + *) func_append new_libs " $deplib" ;; + esac + done + compile_deplibs=$new_libs + + + func_append compile_command " $compile_deplibs" + func_append finalize_command " $finalize_deplibs" + + if test -n "$rpath$xrpath"; then + # If the user specified any rpath flags, then add them. + for libdir in $rpath $xrpath; do + # This is the magic to use -rpath. + case "$finalize_rpath " in + *" $libdir "*) ;; + *) func_append finalize_rpath " $libdir" ;; + esac + done + fi + + # Now hardcode the library paths + rpath= + hardcode_libdirs= + for libdir in $compile_rpath $finalize_rpath; do + if test -n "$hardcode_libdir_flag_spec"; then + if test -n "$hardcode_libdir_separator"; then + if test -z "$hardcode_libdirs"; then + hardcode_libdirs=$libdir + else + # Just accumulate the unique libdirs. + case $hardcode_libdir_separator$hardcode_libdirs$hardcode_libdir_separator in + *"$hardcode_libdir_separator$libdir$hardcode_libdir_separator"*) + ;; + *) + func_append hardcode_libdirs "$hardcode_libdir_separator$libdir" + ;; + esac + fi + else + eval flag=\"$hardcode_libdir_flag_spec\" + func_append rpath " $flag" + fi + elif test -n "$runpath_var"; then + case "$perm_rpath " in + *" $libdir "*) ;; + *) func_append perm_rpath " $libdir" ;; + esac + fi + case $host in + *-*-cygwin* | *-*-mingw* | *-*-pw32* | *-*-os2* | *-cegcc*) + testbindir=`$ECHO "$libdir" | $SED -e 's*/lib$*/bin*'` + case :$dllsearchpath: in + *":$libdir:"*) ;; + ::) dllsearchpath=$libdir;; + *) func_append dllsearchpath ":$libdir";; + esac + case :$dllsearchpath: in + *":$testbindir:"*) ;; + ::) dllsearchpath=$testbindir;; + *) func_append dllsearchpath ":$testbindir";; + esac + ;; + esac + done + # Substitute the hardcoded libdirs into the rpath. + if test -n "$hardcode_libdir_separator" && + test -n "$hardcode_libdirs"; then + libdir=$hardcode_libdirs + eval rpath=\" $hardcode_libdir_flag_spec\" + fi + compile_rpath=$rpath + + rpath= + hardcode_libdirs= + for libdir in $finalize_rpath; do + if test -n "$hardcode_libdir_flag_spec"; then + if test -n "$hardcode_libdir_separator"; then + if test -z "$hardcode_libdirs"; then + hardcode_libdirs=$libdir + else + # Just accumulate the unique libdirs. 
+ case $hardcode_libdir_separator$hardcode_libdirs$hardcode_libdir_separator in + *"$hardcode_libdir_separator$libdir$hardcode_libdir_separator"*) + ;; + *) + func_append hardcode_libdirs "$hardcode_libdir_separator$libdir" + ;; + esac + fi + else + eval flag=\"$hardcode_libdir_flag_spec\" + func_append rpath " $flag" + fi + elif test -n "$runpath_var"; then + case "$finalize_perm_rpath " in + *" $libdir "*) ;; + *) func_append finalize_perm_rpath " $libdir" ;; + esac + fi + done + # Substitute the hardcoded libdirs into the rpath. + if test -n "$hardcode_libdir_separator" && + test -n "$hardcode_libdirs"; then + libdir=$hardcode_libdirs + eval rpath=\" $hardcode_libdir_flag_spec\" + fi + finalize_rpath=$rpath + + if test -n "$libobjs" && test yes = "$build_old_libs"; then + # Transform all the library objects into standard objects. + compile_command=`$ECHO "$compile_command" | $SP2NL | $SED "$lo2o" | $NL2SP` + finalize_command=`$ECHO "$finalize_command" | $SP2NL | $SED "$lo2o" | $NL2SP` + fi + + func_generate_dlsyms "$outputname" "@PROGRAM@" false + + # template prelinking step + if test -n "$prelink_cmds"; then + func_execute_cmds "$prelink_cmds" 'exit $?' + fi + + wrappers_required=: + case $host in + *cegcc* | *mingw32ce*) + # Disable wrappers for cegcc and mingw32ce hosts, we are cross compiling anyway. + wrappers_required=false + ;; + *cygwin* | *mingw* ) + test yes = "$build_libtool_libs" || wrappers_required=false + ;; + *) + if test no = "$need_relink" || test yes != "$build_libtool_libs"; then + wrappers_required=false + fi + ;; + esac + $wrappers_required || { + # Replace the output file specification. + compile_command=`$ECHO "$compile_command" | $SED 's%@OUTPUT@%'"$output"'%g'` + link_command=$compile_command$compile_rpath + + # We have no uninstalled library dependencies, so finalize right now. + exit_status=0 + func_show_eval "$link_command" 'exit_status=$?' + + if test -n "$postlink_cmds"; then + func_to_tool_file "$output" + postlink_cmds=`func_echo_all "$postlink_cmds" | $SED -e 's%@OUTPUT@%'"$output"'%g' -e 's%@TOOL_OUTPUT@%'"$func_to_tool_file_result"'%g'` + func_execute_cmds "$postlink_cmds" 'exit $?' + fi + + # Delete the generated files. + if test -f "$output_objdir/${outputname}S.$objext"; then + func_show_eval '$RM "$output_objdir/${outputname}S.$objext"' + fi + + exit $exit_status + } + + if test -n "$compile_shlibpath$finalize_shlibpath"; then + compile_command="$shlibpath_var=\"$compile_shlibpath$finalize_shlibpath\$$shlibpath_var\" $compile_command" + fi + if test -n "$finalize_shlibpath"; then + finalize_command="$shlibpath_var=\"$finalize_shlibpath\$$shlibpath_var\" $finalize_command" + fi + + compile_var= + finalize_var= + if test -n "$runpath_var"; then + if test -n "$perm_rpath"; then + # We should set the runpath_var. + rpath= + for dir in $perm_rpath; do + func_append rpath "$dir:" + done + compile_var="$runpath_var=\"$rpath\$$runpath_var\" " + fi + if test -n "$finalize_perm_rpath"; then + # We should set the runpath_var. + rpath= + for dir in $finalize_perm_rpath; do + func_append rpath "$dir:" + done + finalize_var="$runpath_var=\"$rpath\$$runpath_var\" " + fi + fi + + if test yes = "$no_install"; then + # We don't need to create a wrapper script. + link_command=$compile_var$compile_command$compile_rpath + # Replace the output file specification. + link_command=`$ECHO "$link_command" | $SED 's%@OUTPUT@%'"$output"'%g'` + # Delete the old output file. 
+ $opt_dry_run || $RM $output + # Link the executable and exit + func_show_eval "$link_command" 'exit $?' + + if test -n "$postlink_cmds"; then + func_to_tool_file "$output" + postlink_cmds=`func_echo_all "$postlink_cmds" | $SED -e 's%@OUTPUT@%'"$output"'%g' -e 's%@TOOL_OUTPUT@%'"$func_to_tool_file_result"'%g'` + func_execute_cmds "$postlink_cmds" 'exit $?' + fi + + exit $EXIT_SUCCESS + fi + + case $hardcode_action,$fast_install in + relink,*) + # Fast installation is not supported + link_command=$compile_var$compile_command$compile_rpath + relink_command=$finalize_var$finalize_command$finalize_rpath + + func_warning "this platform does not like uninstalled shared libraries" + func_warning "'$output' will be relinked during installation" + ;; + *,yes) + link_command=$finalize_var$compile_command$finalize_rpath + relink_command=`$ECHO "$compile_var$compile_command$compile_rpath" | $SED 's%@OUTPUT@%\$progdir/\$file%g'` + ;; + *,no) + link_command=$compile_var$compile_command$compile_rpath + relink_command=$finalize_var$finalize_command$finalize_rpath + ;; + *,needless) + link_command=$finalize_var$compile_command$finalize_rpath + relink_command= + ;; + esac + + # Replace the output file specification. + link_command=`$ECHO "$link_command" | $SED 's%@OUTPUT@%'"$output_objdir/$outputname"'%g'` + + # Delete the old output files. + $opt_dry_run || $RM $output $output_objdir/$outputname $output_objdir/lt-$outputname + + func_show_eval "$link_command" 'exit $?' + + if test -n "$postlink_cmds"; then + func_to_tool_file "$output_objdir/$outputname" + postlink_cmds=`func_echo_all "$postlink_cmds" | $SED -e 's%@OUTPUT@%'"$output_objdir/$outputname"'%g' -e 's%@TOOL_OUTPUT@%'"$func_to_tool_file_result"'%g'` + func_execute_cmds "$postlink_cmds" 'exit $?' + fi + + # Now create the wrapper script. + func_verbose "creating $output" + + # Quote the relink command for shipping. + if test -n "$relink_command"; then + # Preserve any variables that may affect compiler behavior + for var in $variables_saved_for_relink; do + if eval test -z \"\${$var+set}\"; then + relink_command="{ test -z \"\${$var+set}\" || $lt_unset $var || { $var=; export $var; }; }; $relink_command" + elif eval var_value=\$$var; test -z "$var_value"; then + relink_command="$var=; export $var; $relink_command" + else + func_quote_for_eval "$var_value" + relink_command="$var=$func_quote_for_eval_result; export $var; $relink_command" + fi + done + relink_command="(cd `pwd`; $relink_command)" + relink_command=`$ECHO "$relink_command" | $SED "$sed_quote_subst"` + fi + + # Only actually do things if not in dry run mode. + $opt_dry_run || { + # win32 will think the script is a binary if it has + # a .exe suffix, so we strip it off here. + case $output in + *.exe) func_stripname '' '.exe' "$output" + output=$func_stripname_result ;; + esac + # test for cygwin because mv fails w/o .exe extensions + case $host in + *cygwin*) + exeext=.exe + func_stripname '' '.exe' "$outputname" + outputname=$func_stripname_result ;; + *) exeext= ;; + esac + case $host in + *cygwin* | *mingw* ) + func_dirname_and_basename "$output" "" "." + output_name=$func_basename_result + output_path=$func_dirname_result + cwrappersource=$output_path/$objdir/lt-$output_name.c + cwrapper=$output_path/$output_name.exe + $RM $cwrappersource $cwrapper + trap "$RM $cwrappersource $cwrapper; exit $EXIT_FAILURE" 1 2 15 + + func_emit_cwrapperexe_src > $cwrappersource + + # The wrapper executable is built using the $host compiler, + # because it contains $host paths and files. 
If cross- + # compiling, it, like the target executable, must be + # executed on the $host or under an emulation environment. + $opt_dry_run || { + $LTCC $LTCFLAGS -o $cwrapper $cwrappersource + $STRIP $cwrapper + } + + # Now, create the wrapper script for func_source use: + func_ltwrapper_scriptname $cwrapper + $RM $func_ltwrapper_scriptname_result + trap "$RM $func_ltwrapper_scriptname_result; exit $EXIT_FAILURE" 1 2 15 + $opt_dry_run || { + # note: this script will not be executed, so do not chmod. + if test "x$build" = "x$host"; then + $cwrapper --lt-dump-script > $func_ltwrapper_scriptname_result + else + func_emit_wrapper no > $func_ltwrapper_scriptname_result + fi + } + ;; + * ) + $RM $output + trap "$RM $output; exit $EXIT_FAILURE" 1 2 15 + + func_emit_wrapper no > $output + chmod +x $output + ;; + esac + } + exit $EXIT_SUCCESS + ;; + esac + + # See if we need to build an old-fashioned archive. + for oldlib in $oldlibs; do + + case $build_libtool_libs in + convenience) + oldobjs="$libobjs_save $symfileobj" + addlibs=$convenience + build_libtool_libs=no + ;; + module) + oldobjs=$libobjs_save + addlibs=$old_convenience + build_libtool_libs=no + ;; + *) + oldobjs="$old_deplibs $non_pic_objects" + $preload && test -f "$symfileobj" \ + && func_append oldobjs " $symfileobj" + addlibs=$old_convenience + ;; + esac + + if test -n "$addlibs"; then + gentop=$output_objdir/${outputname}x + func_append generated " $gentop" + + func_extract_archives $gentop $addlibs + func_append oldobjs " $func_extract_archives_result" + fi + + # Do each command in the archive commands. + if test -n "$old_archive_from_new_cmds" && test yes = "$build_libtool_libs"; then + cmds=$old_archive_from_new_cmds + else + + # Add any objects from preloaded convenience libraries + if test -n "$dlprefiles"; then + gentop=$output_objdir/${outputname}x + func_append generated " $gentop" + + func_extract_archives $gentop $dlprefiles + func_append oldobjs " $func_extract_archives_result" + fi + + # POSIX demands no paths to be encoded in archives. We have + # to avoid creating archives with duplicate basenames if we + # might have to extract them afterwards, e.g., when creating a + # static archive out of a convenience library, or when linking + # the entirety of a libtool archive into another (currently + # not supported by libtool). + if (for obj in $oldobjs + do + func_basename "$obj" + $ECHO "$func_basename_result" + done | sort | sort -uc >/dev/null 2>&1); then + : + else + echo "copying selected object files to avoid basename conflicts..." + gentop=$output_objdir/${outputname}x + func_append generated " $gentop" + func_mkdir_p "$gentop" + save_oldobjs=$oldobjs + oldobjs= + counter=1 + for obj in $save_oldobjs + do + func_basename "$obj" + objbase=$func_basename_result + case " $oldobjs " in + " ") oldobjs=$obj ;; + *[\ /]"$objbase "*) + while :; do + # Make sure we don't pick an alternate name that also + # overlaps. + newobj=lt$counter-$objbase + func_arith $counter + 1 + counter=$func_arith_result + case " $oldobjs " in + *[\ /]"$newobj "*) ;; + *) if test ! 
-f "$gentop/$newobj"; then break; fi ;; + esac + done + func_show_eval "ln $obj $gentop/$newobj || cp $obj $gentop/$newobj" + func_append oldobjs " $gentop/$newobj" + ;; + *) func_append oldobjs " $obj" ;; + esac + done + fi + func_to_tool_file "$oldlib" func_convert_file_msys_to_w32 + tool_oldlib=$func_to_tool_file_result + eval cmds=\"$old_archive_cmds\" + + func_len " $cmds" + len=$func_len_result + if test "$len" -lt "$max_cmd_len" || test "$max_cmd_len" -le -1; then + cmds=$old_archive_cmds + elif test -n "$archiver_list_spec"; then + func_verbose "using command file archive linking..." + for obj in $oldobjs + do + func_to_tool_file "$obj" + $ECHO "$func_to_tool_file_result" + done > $output_objdir/$libname.libcmd + func_to_tool_file "$output_objdir/$libname.libcmd" + oldobjs=" $archiver_list_spec$func_to_tool_file_result" + cmds=$old_archive_cmds + else + # the command line is too long to link in one step, link in parts + func_verbose "using piecewise archive linking..." + save_RANLIB=$RANLIB + RANLIB=: + objlist= + concat_cmds= + save_oldobjs=$oldobjs + oldobjs= + # Is there a better way of finding the last object in the list? + for obj in $save_oldobjs + do + last_oldobj=$obj + done + eval test_cmds=\"$old_archive_cmds\" + func_len " $test_cmds" + len0=$func_len_result + len=$len0 + for obj in $save_oldobjs + do + func_len " $obj" + func_arith $len + $func_len_result + len=$func_arith_result + func_append objlist " $obj" + if test "$len" -lt "$max_cmd_len"; then + : + else + # the above command should be used before it gets too long + oldobjs=$objlist + if test "$obj" = "$last_oldobj"; then + RANLIB=$save_RANLIB + fi + test -z "$concat_cmds" || concat_cmds=$concat_cmds~ + eval concat_cmds=\"\$concat_cmds$old_archive_cmds\" + objlist= + len=$len0 + fi + done + RANLIB=$save_RANLIB + oldobjs=$objlist + if test -z "$oldobjs"; then + eval cmds=\"\$concat_cmds\" + else + eval cmds=\"\$concat_cmds~\$old_archive_cmds\" + fi + fi + fi + func_execute_cmds "$cmds" 'exit $?' + done + + test -n "$generated" && \ + func_show_eval "${RM}r$generated" + + # Now create the libtool archive. + case $output in + *.la) + old_library= + test yes = "$build_old_libs" && old_library=$libname.$libext + func_verbose "creating $output" + + # Preserve any variables that may affect compiler behavior + for var in $variables_saved_for_relink; do + if eval test -z \"\${$var+set}\"; then + relink_command="{ test -z \"\${$var+set}\" || $lt_unset $var || { $var=; export $var; }; }; $relink_command" + elif eval var_value=\$$var; test -z "$var_value"; then + relink_command="$var=; export $var; $relink_command" + else + func_quote_for_eval "$var_value" + relink_command="$var=$func_quote_for_eval_result; export $var; $relink_command" + fi + done + # Quote the link command for shipping. + relink_command="(cd `pwd`; $SHELL \"$progpath\" $preserve_args --mode=relink $libtool_args @inst_prefix_dir@)" + relink_command=`$ECHO "$relink_command" | $SED "$sed_quote_subst"` + if test yes = "$hardcode_automatic"; then + relink_command= + fi + + # Only create the output if not a dry run. 
+ $opt_dry_run || { + for installed in no yes; do + if test yes = "$installed"; then + if test -z "$install_libdir"; then + break + fi + output=$output_objdir/${outputname}i + # Replace all uninstalled libtool libraries with the installed ones + newdependency_libs= + for deplib in $dependency_libs; do + case $deplib in + *.la) + func_basename "$deplib" + name=$func_basename_result + func_resolve_sysroot "$deplib" + eval libdir=`$SED -n -e 's/^libdir=\(.*\)$/\1/p' $func_resolve_sysroot_result` + test -z "$libdir" && \ + func_fatal_error "'$deplib' is not a valid libtool archive" + func_append newdependency_libs " ${lt_sysroot:+=}$libdir/$name" + ;; + -L*) + func_stripname -L '' "$deplib" + func_replace_sysroot "$func_stripname_result" + func_append newdependency_libs " -L$func_replace_sysroot_result" + ;; + -R*) + func_stripname -R '' "$deplib" + func_replace_sysroot "$func_stripname_result" + func_append newdependency_libs " -R$func_replace_sysroot_result" + ;; + *) func_append newdependency_libs " $deplib" ;; + esac + done + dependency_libs=$newdependency_libs + newdlfiles= + + for lib in $dlfiles; do + case $lib in + *.la) + func_basename "$lib" + name=$func_basename_result + eval libdir=`$SED -n -e 's/^libdir=\(.*\)$/\1/p' $lib` + test -z "$libdir" && \ + func_fatal_error "'$lib' is not a valid libtool archive" + func_append newdlfiles " ${lt_sysroot:+=}$libdir/$name" + ;; + *) func_append newdlfiles " $lib" ;; + esac + done + dlfiles=$newdlfiles + newdlprefiles= + for lib in $dlprefiles; do + case $lib in + *.la) + # Only pass preopened files to the pseudo-archive (for + # eventual linking with the app. that links it) if we + # didn't already link the preopened objects directly into + # the library: + func_basename "$lib" + name=$func_basename_result + eval libdir=`$SED -n -e 's/^libdir=\(.*\)$/\1/p' $lib` + test -z "$libdir" && \ + func_fatal_error "'$lib' is not a valid libtool archive" + func_append newdlprefiles " ${lt_sysroot:+=}$libdir/$name" + ;; + esac + done + dlprefiles=$newdlprefiles + else + newdlfiles= + for lib in $dlfiles; do + case $lib in + [\\/]* | [A-Za-z]:[\\/]*) abs=$lib ;; + *) abs=`pwd`"/$lib" ;; + esac + func_append newdlfiles " $abs" + done + dlfiles=$newdlfiles + newdlprefiles= + for lib in $dlprefiles; do + case $lib in + [\\/]* | [A-Za-z]:[\\/]*) abs=$lib ;; + *) abs=`pwd`"/$lib" ;; + esac + func_append newdlprefiles " $abs" + done + dlprefiles=$newdlprefiles + fi + $RM $output + # place dlname in correct position for cygwin + # In fact, it would be nice if we could use this code for all target + # systems that can't hard-code library paths into their executables + # and that have no shared library path variable independent of PATH, + # but it turns out we can't easily determine that from inspecting + # libtool variables, so we have to hard-code the OSs to which it + # applies here; at the moment, that means platforms that use the PE + # object format with DLL files. See the long comment at the top of + # tests/bindir.at for full details. + tdlname=$dlname + case $host,$output,$installed,$module,$dlname in + *cygwin*,*lai,yes,no,*.dll | *mingw*,*lai,yes,no,*.dll | *cegcc*,*lai,yes,no,*.dll) + # If a -bindir argument was supplied, place the dll there. + if test -n "$bindir"; then + func_relative_path "$install_libdir" "$bindir" + tdlname=$func_relative_path_result/$dlname + else + # Otherwise fall back on heuristic. 
+ tdlname=../bin/$dlname + fi + ;; + esac + $ECHO > $output "\ +# $outputname - a libtool library file +# Generated by $PROGRAM (GNU $PACKAGE) $VERSION +# +# Please DO NOT delete this file! +# It is necessary for linking the library. + +# The name that we can dlopen(3). +dlname='$tdlname' + +# Names of this library. +library_names='$library_names' + +# The name of the static archive. +old_library='$old_library' + +# Linker flags that cannot go in dependency_libs. +inherited_linker_flags='$new_inherited_linker_flags' + +# Libraries that this one depends upon. +dependency_libs='$dependency_libs' + +# Names of additional weak libraries provided by this library +weak_library_names='$weak_libs' + +# Version information for $libname. +current=$current +age=$age +revision=$revision + +# Is this an already installed library? +installed=$installed + +# Should we warn about portability when linking against -modules? +shouldnotlink=$module + +# Files to dlopen/dlpreopen +dlopen='$dlfiles' +dlpreopen='$dlprefiles' + +# Directory that this library needs to be installed in: +libdir='$install_libdir'" + if test no,yes = "$installed,$need_relink"; then + $ECHO >> $output "\ +relink_command=\"$relink_command\"" + fi + done + } + + # Do a symbolic link so that the libtool archive can be found in + # LD_LIBRARY_PATH before the program is installed. + func_show_eval '( cd "$output_objdir" && $RM "$outputname" && $LN_S "../$outputname" "$outputname" )' 'exit $?' + ;; + esac + exit $EXIT_SUCCESS +} + +if test link = "$opt_mode" || test relink = "$opt_mode"; then + func_mode_link ${1+"$@"} +fi + + +# func_mode_uninstall arg... +func_mode_uninstall () +{ + $debug_cmd + + RM=$nonopt + files= + rmforce=false + exit_status=0 + + # This variable tells wrapper scripts just to set variables rather + # than running their programs. + libtool_install_magic=$magic + + for arg + do + case $arg in + -f) func_append RM " $arg"; rmforce=: ;; + -*) func_append RM " $arg" ;; + *) func_append files " $arg" ;; + esac + done + + test -z "$RM" && \ + func_fatal_help "you must specify an RM program" + + rmdirs= + + for file in $files; do + func_dirname "$file" "" "." + dir=$func_dirname_result + if test . = "$dir"; then + odir=$objdir + else + odir=$dir/$objdir + fi + func_basename "$file" + name=$func_basename_result + test uninstall = "$opt_mode" && odir=$dir + + # Remember odir for removal later, being careful to avoid duplicates + if test clean = "$opt_mode"; then + case " $rmdirs " in + *" $odir "*) ;; + *) func_append rmdirs " $odir" ;; + esac + fi + + # Don't error if the file doesn't exist and rm -f was used. + if { test -L "$file"; } >/dev/null 2>&1 || + { test -h "$file"; } >/dev/null 2>&1 || + test -f "$file"; then + : + elif test -d "$file"; then + exit_status=1 + continue + elif $rmforce; then + continue + fi + + rmfiles=$file + + case $name in + *.la) + # Possibly a libtool archive, so verify it. + if func_lalib_p "$file"; then + func_source $dir/$name + + # Delete the libtool libraries and symlinks. + for n in $library_names; do + func_append rmfiles " $odir/$n" + done + test -n "$old_library" && func_append rmfiles " $odir/$old_library" + + case $opt_mode in + clean) + case " $library_names " in + *" $dlname "*) ;; + *) test -n "$dlname" && func_append rmfiles " $odir/$dlname" ;; + esac + test -n "$libdir" && func_append rmfiles " $odir/$name $odir/${name}i" + ;; + uninstall) + if test -n "$library_names"; then + # Do each command in the postuninstall commands. 
+ func_execute_cmds "$postuninstall_cmds" '$rmforce || exit_status=1' + fi + + if test -n "$old_library"; then + # Do each command in the old_postuninstall commands. + func_execute_cmds "$old_postuninstall_cmds" '$rmforce || exit_status=1' + fi + # FIXME: should reinstall the best remaining shared library. + ;; + esac + fi + ;; + + *.lo) + # Possibly a libtool object, so verify it. + if func_lalib_p "$file"; then + + # Read the .lo file + func_source $dir/$name + + # Add PIC object to the list of files to remove. + if test -n "$pic_object" && test none != "$pic_object"; then + func_append rmfiles " $dir/$pic_object" + fi + + # Add non-PIC object to the list of files to remove. + if test -n "$non_pic_object" && test none != "$non_pic_object"; then + func_append rmfiles " $dir/$non_pic_object" + fi + fi + ;; + + *) + if test clean = "$opt_mode"; then + noexename=$name + case $file in + *.exe) + func_stripname '' '.exe' "$file" + file=$func_stripname_result + func_stripname '' '.exe' "$name" + noexename=$func_stripname_result + # $file with .exe has already been added to rmfiles, + # add $file without .exe + func_append rmfiles " $file" + ;; + esac + # Do a test to see if this is a libtool program. + if func_ltwrapper_p "$file"; then + if func_ltwrapper_executable_p "$file"; then + func_ltwrapper_scriptname "$file" + relink_command= + func_source $func_ltwrapper_scriptname_result + func_append rmfiles " $func_ltwrapper_scriptname_result" + else + relink_command= + func_source $dir/$noexename + fi + + # note $name still contains .exe if it was in $file originally + # as does the version of $file that was added into $rmfiles + func_append rmfiles " $odir/$name $odir/${name}S.$objext" + if test yes = "$fast_install" && test -n "$relink_command"; then + func_append rmfiles " $odir/lt-$name" + fi + if test "X$noexename" != "X$name"; then + func_append rmfiles " $odir/lt-$noexename.c" + fi + fi + fi + ;; + esac + func_show_eval "$RM $rmfiles" 'exit_status=1' + done + + # Try to remove the $objdir's in the directories where we deleted files + for dir in $rmdirs; do + if test -d "$dir"; then + func_show_eval "rmdir $dir >/dev/null 2>&1" + fi + done + + exit $exit_status +} + +if test uninstall = "$opt_mode" || test clean = "$opt_mode"; then + func_mode_uninstall ${1+"$@"} +fi + +test -z "$opt_mode" && { + help=$generic_help + func_fatal_help "you must specify a MODE" +} + +test -z "$exec_cmd" && \ + func_fatal_help "invalid operation mode '$opt_mode'" + +if test -n "$exec_cmd"; then + eval exec "$exec_cmd" + exit $EXIT_FAILURE +fi + +exit $exit_status + + +# The TAGs below are defined such that we never get into a situation +# where we disable both kinds of libraries. Given conflicting +# choices, we go for a static library, that is the most portable, +# since we can't tell whether shared libraries were disabled because +# the user asked for that or because the platform doesn't support +# them. This is particularly important on AIX, because we don't +# support having both static and shared libraries enabled at the same +# time on that platform, so we default to a shared-only configuration. +# If a disable-shared tag is given, we'll fallback to a static-only +# configuration. But we'll never go from static-only to shared-only. 
+ +# ### BEGIN LIBTOOL TAG CONFIG: disable-shared +build_libtool_libs=no +build_old_libs=yes +# ### END LIBTOOL TAG CONFIG: disable-shared + +# ### BEGIN LIBTOOL TAG CONFIG: disable-static +build_old_libs=`case $build_libtool_libs in yes) echo no;; *) echo yes;; esac` +# ### END LIBTOOL TAG CONFIG: disable-static + +# Local Variables: +# mode:shell-script +# sh-indentation:2 +# End: diff --git a/m4/libtool.m4 b/m4/libtool.m4 new file mode 100644 index 0000000..10ab284 --- /dev/null +++ b/m4/libtool.m4 @@ -0,0 +1,8388 @@ +# libtool.m4 - Configure libtool for the host system. -*-Autoconf-*- +# +# Copyright (C) 1996-2001, 2003-2015 Free Software Foundation, Inc. +# Written by Gordon Matzigkeit, 1996 +# +# This file is free software; the Free Software Foundation gives +# unlimited permission to copy and/or distribute it, with or without +# modifications, as long as this notice is preserved. + +m4_define([_LT_COPYING], [dnl +# Copyright (C) 2014 Free Software Foundation, Inc. +# This is free software; see the source for copying conditions. There is NO +# warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. + +# GNU Libtool is free software; you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 2 of of the License, or +# (at your option) any later version. +# +# As a special exception to the GNU General Public License, if you +# distribute this file as part of a program or library that is built +# using GNU Libtool, you may include this file under the same +# distribution terms that you use for the rest of that program. +# +# GNU Libtool is distributed in the hope that it will be useful, but +# WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . +]) + +# serial 58 LT_INIT + + +# LT_PREREQ(VERSION) +# ------------------ +# Complain and exit if this libtool version is less that VERSION. 
+m4_defun([LT_PREREQ], +[m4_if(m4_version_compare(m4_defn([LT_PACKAGE_VERSION]), [$1]), -1, + [m4_default([$3], + [m4_fatal([Libtool version $1 or higher is required], + 63)])], + [$2])]) + + +# _LT_CHECK_BUILDDIR +# ------------------ +# Complain if the absolute build directory name contains unusual characters +m4_defun([_LT_CHECK_BUILDDIR], +[case `pwd` in + *\ * | *\ *) + AC_MSG_WARN([Libtool does not cope well with whitespace in `pwd`]) ;; +esac +]) + + +# LT_INIT([OPTIONS]) +# ------------------ +AC_DEFUN([LT_INIT], +[AC_PREREQ([2.62])dnl We use AC_PATH_PROGS_FEATURE_CHECK +AC_REQUIRE([AC_CONFIG_AUX_DIR_DEFAULT])dnl +AC_BEFORE([$0], [LT_LANG])dnl +AC_BEFORE([$0], [LT_OUTPUT])dnl +AC_BEFORE([$0], [LTDL_INIT])dnl +m4_require([_LT_CHECK_BUILDDIR])dnl + +dnl Autoconf doesn't catch unexpanded LT_ macros by default: +m4_pattern_forbid([^_?LT_[A-Z_]+$])dnl +m4_pattern_allow([^(_LT_EOF|LT_DLGLOBAL|LT_DLLAZY_OR_NOW|LT_MULTI_MODULE)$])dnl +dnl aclocal doesn't pull ltoptions.m4, ltsugar.m4, or ltversion.m4 +dnl unless we require an AC_DEFUNed macro: +AC_REQUIRE([LTOPTIONS_VERSION])dnl +AC_REQUIRE([LTSUGAR_VERSION])dnl +AC_REQUIRE([LTVERSION_VERSION])dnl +AC_REQUIRE([LTOBSOLETE_VERSION])dnl +m4_require([_LT_PROG_LTMAIN])dnl + +_LT_SHELL_INIT([SHELL=${CONFIG_SHELL-/bin/sh}]) + +dnl Parse OPTIONS +_LT_SET_OPTIONS([$0], [$1]) + +# This can be used to rebuild libtool when needed +LIBTOOL_DEPS=$ltmain + +# Always use our own libtool. +LIBTOOL='$(SHELL) $(top_builddir)/libtool' +AC_SUBST(LIBTOOL)dnl + +_LT_SETUP + +# Only expand once: +m4_define([LT_INIT]) +])# LT_INIT + +# Old names: +AU_ALIAS([AC_PROG_LIBTOOL], [LT_INIT]) +AU_ALIAS([AM_PROG_LIBTOOL], [LT_INIT]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_PROG_LIBTOOL], []) +dnl AC_DEFUN([AM_PROG_LIBTOOL], []) + + +# _LT_PREPARE_CC_BASENAME +# ----------------------- +m4_defun([_LT_PREPARE_CC_BASENAME], [ +# Calculate cc_basename. Skip known compiler wrappers and cross-prefix. +func_cc_basename () +{ + for cc_temp in @S|@*""; do + case $cc_temp in + compile | *[[\\/]]compile | ccache | *[[\\/]]ccache ) ;; + distcc | *[[\\/]]distcc | purify | *[[\\/]]purify ) ;; + \-*) ;; + *) break;; + esac + done + func_cc_basename_result=`$ECHO "$cc_temp" | $SED "s%.*/%%; s%^$host_alias-%%"` +} +])# _LT_PREPARE_CC_BASENAME + + +# _LT_CC_BASENAME(CC) +# ------------------- +# It would be clearer to call AC_REQUIREs from _LT_PREPARE_CC_BASENAME, +# but that macro is also expanded into generated libtool script, which +# arranges for $SED and $ECHO to be set by different means. +m4_defun([_LT_CC_BASENAME], +[m4_require([_LT_PREPARE_CC_BASENAME])dnl +AC_REQUIRE([_LT_DECL_SED])dnl +AC_REQUIRE([_LT_PROG_ECHO_BACKSLASH])dnl +func_cc_basename $1 +cc_basename=$func_cc_basename_result +]) + + +# _LT_FILEUTILS_DEFAULTS +# ---------------------- +# It is okay to use these file commands and assume they have been set +# sensibly after 'm4_require([_LT_FILEUTILS_DEFAULTS])'. 
+m4_defun([_LT_FILEUTILS_DEFAULTS], +[: ${CP="cp -f"} +: ${MV="mv -f"} +: ${RM="rm -f"} +])# _LT_FILEUTILS_DEFAULTS + + +# _LT_SETUP +# --------- +m4_defun([_LT_SETUP], +[AC_REQUIRE([AC_CANONICAL_HOST])dnl +AC_REQUIRE([AC_CANONICAL_BUILD])dnl +AC_REQUIRE([_LT_PREPARE_SED_QUOTE_VARS])dnl +AC_REQUIRE([_LT_PROG_ECHO_BACKSLASH])dnl + +_LT_DECL([], [PATH_SEPARATOR], [1], [The PATH separator for the build system])dnl +dnl +_LT_DECL([], [host_alias], [0], [The host system])dnl +_LT_DECL([], [host], [0])dnl +_LT_DECL([], [host_os], [0])dnl +dnl +_LT_DECL([], [build_alias], [0], [The build system])dnl +_LT_DECL([], [build], [0])dnl +_LT_DECL([], [build_os], [0])dnl +dnl +AC_REQUIRE([AC_PROG_CC])dnl +AC_REQUIRE([LT_PATH_LD])dnl +AC_REQUIRE([LT_PATH_NM])dnl +dnl +AC_REQUIRE([AC_PROG_LN_S])dnl +test -z "$LN_S" && LN_S="ln -s" +_LT_DECL([], [LN_S], [1], [Whether we need soft or hard links])dnl +dnl +AC_REQUIRE([LT_CMD_MAX_LEN])dnl +_LT_DECL([objext], [ac_objext], [0], [Object file suffix (normally "o")])dnl +_LT_DECL([], [exeext], [0], [Executable file suffix (normally "")])dnl +dnl +m4_require([_LT_FILEUTILS_DEFAULTS])dnl +m4_require([_LT_CHECK_SHELL_FEATURES])dnl +m4_require([_LT_PATH_CONVERSION_FUNCTIONS])dnl +m4_require([_LT_CMD_RELOAD])dnl +m4_require([_LT_CHECK_MAGIC_METHOD])dnl +m4_require([_LT_CHECK_SHAREDLIB_FROM_LINKLIB])dnl +m4_require([_LT_CMD_OLD_ARCHIVE])dnl +m4_require([_LT_CMD_GLOBAL_SYMBOLS])dnl +m4_require([_LT_WITH_SYSROOT])dnl +m4_require([_LT_CMD_TRUNCATE])dnl + +_LT_CONFIG_LIBTOOL_INIT([ +# See if we are running on zsh, and set the options that allow our +# commands through without removal of \ escapes INIT. +if test -n "\${ZSH_VERSION+set}"; then + setopt NO_GLOB_SUBST +fi +]) +if test -n "${ZSH_VERSION+set}"; then + setopt NO_GLOB_SUBST +fi + +_LT_CHECK_OBJDIR + +m4_require([_LT_TAG_COMPILER])dnl + +case $host_os in +aix3*) + # AIX sometimes has problems with the GCC collect2 program. For some + # reason, if we set the COLLECT_NAMES environment variable, the problems + # vanish in a puff of smoke. + if test set != "${COLLECT_NAMES+set}"; then + COLLECT_NAMES= + export COLLECT_NAMES + fi + ;; +esac + +# Global variables: +ofile=libtool +can_build_shared=yes + +# All known linkers require a '.a' archive for static linking (except MSVC, +# which needs '.lib'). +libext=a + +with_gnu_ld=$lt_cv_prog_gnu_ld + +old_CC=$CC +old_CFLAGS=$CFLAGS + +# Set sane defaults for various variables +test -z "$CC" && CC=cc +test -z "$LTCC" && LTCC=$CC +test -z "$LTCFLAGS" && LTCFLAGS=$CFLAGS +test -z "$LD" && LD=ld +test -z "$ac_objext" && ac_objext=o + +_LT_CC_BASENAME([$compiler]) + +# Only perform the check for file, if the check method requires it +test -z "$MAGIC_CMD" && MAGIC_CMD=file +case $deplibs_check_method in +file_magic*) + if test "$file_magic_cmd" = '$MAGIC_CMD'; then + _LT_PATH_MAGIC + fi + ;; +esac + +# Use C for the default configuration in the libtool script +LT_SUPPORTED_TAG([CC]) +_LT_LANG_C_CONFIG +_LT_LANG_DEFAULT_CONFIG +_LT_CONFIG_COMMANDS +])# _LT_SETUP + + +# _LT_PREPARE_SED_QUOTE_VARS +# -------------------------- +# Define a few sed substitution that help us do robust quoting. +m4_defun([_LT_PREPARE_SED_QUOTE_VARS], +[# Backslashify metacharacters that are still active within +# double-quoted strings. +sed_quote_subst='s/\([["`$\\]]\)/\\\1/g' + +# Same as above, but do not quote variable references. +double_quote_subst='s/\([["`\\]]\)/\\\1/g' + +# Sed substitution to delay expansion of an escaped shell variable in a +# double_quote_subst'ed string. 
+delay_variable_subst='s/\\\\\\\\\\\$/\\\\\\$/g' + +# Sed substitution to delay expansion of an escaped single quote. +delay_single_quote_subst='s/'\''/'\'\\\\\\\'\''/g' + +# Sed substitution to avoid accidental globbing in evaled expressions +no_glob_subst='s/\*/\\\*/g' +]) + +# _LT_PROG_LTMAIN +# --------------- +# Note that this code is called both from 'configure', and 'config.status' +# now that we use AC_CONFIG_COMMANDS to generate libtool. Notably, +# 'config.status' has no value for ac_aux_dir unless we are using Automake, +# so we pass a copy along to make sure it has a sensible value anyway. +m4_defun([_LT_PROG_LTMAIN], +[m4_ifdef([AC_REQUIRE_AUX_FILE], [AC_REQUIRE_AUX_FILE([ltmain.sh])])dnl +_LT_CONFIG_LIBTOOL_INIT([ac_aux_dir='$ac_aux_dir']) +ltmain=$ac_aux_dir/ltmain.sh +])# _LT_PROG_LTMAIN + + +## ------------------------------------- ## +## Accumulate code for creating libtool. ## +## ------------------------------------- ## + +# So that we can recreate a full libtool script including additional +# tags, we accumulate the chunks of code to send to AC_CONFIG_COMMANDS +# in macros and then make a single call at the end using the 'libtool' +# label. + + +# _LT_CONFIG_LIBTOOL_INIT([INIT-COMMANDS]) +# ---------------------------------------- +# Register INIT-COMMANDS to be passed to AC_CONFIG_COMMANDS later. +m4_define([_LT_CONFIG_LIBTOOL_INIT], +[m4_ifval([$1], + [m4_append([_LT_OUTPUT_LIBTOOL_INIT], + [$1 +])])]) + +# Initialize. +m4_define([_LT_OUTPUT_LIBTOOL_INIT]) + + +# _LT_CONFIG_LIBTOOL([COMMANDS]) +# ------------------------------ +# Register COMMANDS to be passed to AC_CONFIG_COMMANDS later. +m4_define([_LT_CONFIG_LIBTOOL], +[m4_ifval([$1], + [m4_append([_LT_OUTPUT_LIBTOOL_COMMANDS], + [$1 +])])]) + +# Initialize. +m4_define([_LT_OUTPUT_LIBTOOL_COMMANDS]) + + +# _LT_CONFIG_SAVE_COMMANDS([COMMANDS], [INIT_COMMANDS]) +# ----------------------------------------------------- +m4_defun([_LT_CONFIG_SAVE_COMMANDS], +[_LT_CONFIG_LIBTOOL([$1]) +_LT_CONFIG_LIBTOOL_INIT([$2]) +]) + + +# _LT_FORMAT_COMMENT([COMMENT]) +# ----------------------------- +# Add leading comment marks to the start of each line, and a trailing +# full-stop to the whole comment if one is not present already. +m4_define([_LT_FORMAT_COMMENT], +[m4_ifval([$1], [ +m4_bpatsubst([m4_bpatsubst([$1], [^ *], [# ])], + [['`$\]], [\\\&])]m4_bmatch([$1], [[!?.]$], [], [.]) +)]) + + + +## ------------------------ ## +## FIXME: Eliminate VARNAME ## +## ------------------------ ## + + +# _LT_DECL([CONFIGNAME], VARNAME, VALUE, [DESCRIPTION], [IS-TAGGED?]) +# ------------------------------------------------------------------- +# CONFIGNAME is the name given to the value in the libtool script. +# VARNAME is the (base) name used in the configure script. +# VALUE may be 0, 1 or 2 for a computed quote escaped value based on +# VARNAME. Any other value will be used directly. 
+m4_define([_LT_DECL], +[lt_if_append_uniq([lt_decl_varnames], [$2], [, ], + [lt_dict_add_subkey([lt_decl_dict], [$2], [libtool_name], + [m4_ifval([$1], [$1], [$2])]) + lt_dict_add_subkey([lt_decl_dict], [$2], [value], [$3]) + m4_ifval([$4], + [lt_dict_add_subkey([lt_decl_dict], [$2], [description], [$4])]) + lt_dict_add_subkey([lt_decl_dict], [$2], + [tagged?], [m4_ifval([$5], [yes], [no])])]) +]) + + +# _LT_TAGDECL([CONFIGNAME], VARNAME, VALUE, [DESCRIPTION]) +# -------------------------------------------------------- +m4_define([_LT_TAGDECL], [_LT_DECL([$1], [$2], [$3], [$4], [yes])]) + + +# lt_decl_tag_varnames([SEPARATOR], [VARNAME1...]) +# ------------------------------------------------ +m4_define([lt_decl_tag_varnames], +[_lt_decl_filter([tagged?], [yes], $@)]) + + +# _lt_decl_filter(SUBKEY, VALUE, [SEPARATOR], [VARNAME1..]) +# --------------------------------------------------------- +m4_define([_lt_decl_filter], +[m4_case([$#], + [0], [m4_fatal([$0: too few arguments: $#])], + [1], [m4_fatal([$0: too few arguments: $#: $1])], + [2], [lt_dict_filter([lt_decl_dict], [$1], [$2], [], lt_decl_varnames)], + [3], [lt_dict_filter([lt_decl_dict], [$1], [$2], [$3], lt_decl_varnames)], + [lt_dict_filter([lt_decl_dict], $@)])[]dnl +]) + + +# lt_decl_quote_varnames([SEPARATOR], [VARNAME1...]) +# -------------------------------------------------- +m4_define([lt_decl_quote_varnames], +[_lt_decl_filter([value], [1], $@)]) + + +# lt_decl_dquote_varnames([SEPARATOR], [VARNAME1...]) +# --------------------------------------------------- +m4_define([lt_decl_dquote_varnames], +[_lt_decl_filter([value], [2], $@)]) + + +# lt_decl_varnames_tagged([SEPARATOR], [VARNAME1...]) +# --------------------------------------------------- +m4_define([lt_decl_varnames_tagged], +[m4_assert([$# <= 2])dnl +_$0(m4_quote(m4_default([$1], [[, ]])), + m4_ifval([$2], [[$2]], [m4_dquote(lt_decl_tag_varnames)]), + m4_split(m4_normalize(m4_quote(_LT_TAGS)), [ ]))]) +m4_define([_lt_decl_varnames_tagged], +[m4_ifval([$3], [lt_combine([$1], [$2], [_], $3)])]) + + +# lt_decl_all_varnames([SEPARATOR], [VARNAME1...]) +# ------------------------------------------------ +m4_define([lt_decl_all_varnames], +[_$0(m4_quote(m4_default([$1], [[, ]])), + m4_if([$2], [], + m4_quote(lt_decl_varnames), + m4_quote(m4_shift($@))))[]dnl +]) +m4_define([_lt_decl_all_varnames], +[lt_join($@, lt_decl_varnames_tagged([$1], + lt_decl_tag_varnames([[, ]], m4_shift($@))))dnl +]) + + +# _LT_CONFIG_STATUS_DECLARE([VARNAME]) +# ------------------------------------ +# Quote a variable value, and forward it to 'config.status' so that its +# declaration there will have the same value as in 'configure'. VARNAME +# must have a single quote delimited value for this to work. +m4_define([_LT_CONFIG_STATUS_DECLARE], +[$1='`$ECHO "$][$1" | $SED "$delay_single_quote_subst"`']) + + +# _LT_CONFIG_STATUS_DECLARATIONS +# ------------------------------ +# We delimit libtool config variables with single quotes, so when +# we write them to config.status, we have to be sure to quote all +# embedded single quotes properly. 
In configure, this macro expands +# each variable declared with _LT_DECL (and _LT_TAGDECL) into: +# +# ='`$ECHO "$" | $SED "$delay_single_quote_subst"`' +m4_defun([_LT_CONFIG_STATUS_DECLARATIONS], +[m4_foreach([_lt_var], m4_quote(lt_decl_all_varnames), + [m4_n([_LT_CONFIG_STATUS_DECLARE(_lt_var)])])]) + + +# _LT_LIBTOOL_TAGS +# ---------------- +# Output comment and list of tags supported by the script +m4_defun([_LT_LIBTOOL_TAGS], +[_LT_FORMAT_COMMENT([The names of the tagged configurations supported by this script])dnl +available_tags='_LT_TAGS'dnl +]) + + +# _LT_LIBTOOL_DECLARE(VARNAME, [TAG]) +# ----------------------------------- +# Extract the dictionary values for VARNAME (optionally with TAG) and +# expand to a commented shell variable setting: +# +# # Some comment about what VAR is for. +# visible_name=$lt_internal_name +m4_define([_LT_LIBTOOL_DECLARE], +[_LT_FORMAT_COMMENT(m4_quote(lt_dict_fetch([lt_decl_dict], [$1], + [description])))[]dnl +m4_pushdef([_libtool_name], + m4_quote(lt_dict_fetch([lt_decl_dict], [$1], [libtool_name])))[]dnl +m4_case(m4_quote(lt_dict_fetch([lt_decl_dict], [$1], [value])), + [0], [_libtool_name=[$]$1], + [1], [_libtool_name=$lt_[]$1], + [2], [_libtool_name=$lt_[]$1], + [_libtool_name=lt_dict_fetch([lt_decl_dict], [$1], [value])])[]dnl +m4_ifval([$2], [_$2])[]m4_popdef([_libtool_name])[]dnl +]) + + +# _LT_LIBTOOL_CONFIG_VARS +# ----------------------- +# Produce commented declarations of non-tagged libtool config variables +# suitable for insertion in the LIBTOOL CONFIG section of the 'libtool' +# script. Tagged libtool config variables (even for the LIBTOOL CONFIG +# section) are produced by _LT_LIBTOOL_TAG_VARS. +m4_defun([_LT_LIBTOOL_CONFIG_VARS], +[m4_foreach([_lt_var], + m4_quote(_lt_decl_filter([tagged?], [no], [], lt_decl_varnames)), + [m4_n([_LT_LIBTOOL_DECLARE(_lt_var)])])]) + + +# _LT_LIBTOOL_TAG_VARS(TAG) +# ------------------------- +m4_define([_LT_LIBTOOL_TAG_VARS], +[m4_foreach([_lt_var], m4_quote(lt_decl_tag_varnames), + [m4_n([_LT_LIBTOOL_DECLARE(_lt_var, [$1])])])]) + + +# _LT_TAGVAR(VARNAME, [TAGNAME]) +# ------------------------------ +m4_define([_LT_TAGVAR], [m4_ifval([$2], [$1_$2], [$1])]) + + +# _LT_CONFIG_COMMANDS +# ------------------- +# Send accumulated output to $CONFIG_STATUS. Thanks to the lists of +# variables for single and double quote escaping we saved from calls +# to _LT_DECL, we can put quote escaped variables declarations +# into 'config.status', and then the shell code to quote escape them in +# for loops in 'config.status'. Finally, any additional code accumulated +# from calls to _LT_CONFIG_LIBTOOL_INIT is expanded. +m4_defun([_LT_CONFIG_COMMANDS], +[AC_PROVIDE_IFELSE([LT_OUTPUT], + dnl If the libtool generation code has been placed in $CONFIG_LT, + dnl instead of duplicating it all over again into config.status, + dnl then we will have config.status run $CONFIG_LT later, so it + dnl needs to know what name is stored there: + [AC_CONFIG_COMMANDS([libtool], + [$SHELL $CONFIG_LT || AS_EXIT(1)], [CONFIG_LT='$CONFIG_LT'])], + dnl If the libtool generation code is destined for config.status, + dnl expand the accumulated commands and init code now: + [AC_CONFIG_COMMANDS([libtool], + [_LT_OUTPUT_LIBTOOL_COMMANDS], [_LT_OUTPUT_LIBTOOL_COMMANDS_INIT])]) +])#_LT_CONFIG_COMMANDS + + +# Initialize. +m4_define([_LT_OUTPUT_LIBTOOL_COMMANDS_INIT], +[ + +# The HP-UX ksh and POSIX shell print the target directory to stdout +# if CDPATH is set. 
+(unset CDPATH) >/dev/null 2>&1 && unset CDPATH + +sed_quote_subst='$sed_quote_subst' +double_quote_subst='$double_quote_subst' +delay_variable_subst='$delay_variable_subst' +_LT_CONFIG_STATUS_DECLARATIONS +LTCC='$LTCC' +LTCFLAGS='$LTCFLAGS' +compiler='$compiler_DEFAULT' + +# A function that is used when there is no print builtin or printf. +func_fallback_echo () +{ + eval 'cat <<_LTECHO_EOF +\$[]1 +_LTECHO_EOF' +} + +# Quote evaled strings. +for var in lt_decl_all_varnames([[ \ +]], lt_decl_quote_varnames); do + case \`eval \\\\\$ECHO \\\\""\\\\\$\$var"\\\\"\` in + *[[\\\\\\\`\\"\\\$]]*) + eval "lt_\$var=\\\\\\"\\\`\\\$ECHO \\"\\\$\$var\\" | \\\$SED \\"\\\$sed_quote_subst\\"\\\`\\\\\\"" ## exclude from sc_prohibit_nested_quotes + ;; + *) + eval "lt_\$var=\\\\\\"\\\$\$var\\\\\\"" + ;; + esac +done + +# Double-quote double-evaled strings. +for var in lt_decl_all_varnames([[ \ +]], lt_decl_dquote_varnames); do + case \`eval \\\\\$ECHO \\\\""\\\\\$\$var"\\\\"\` in + *[[\\\\\\\`\\"\\\$]]*) + eval "lt_\$var=\\\\\\"\\\`\\\$ECHO \\"\\\$\$var\\" | \\\$SED -e \\"\\\$double_quote_subst\\" -e \\"\\\$sed_quote_subst\\" -e \\"\\\$delay_variable_subst\\"\\\`\\\\\\"" ## exclude from sc_prohibit_nested_quotes + ;; + *) + eval "lt_\$var=\\\\\\"\\\$\$var\\\\\\"" + ;; + esac +done + +_LT_OUTPUT_LIBTOOL_INIT +]) + +# _LT_GENERATED_FILE_INIT(FILE, [COMMENT]) +# ------------------------------------ +# Generate a child script FILE with all initialization necessary to +# reuse the environment learned by the parent script, and make the +# file executable. If COMMENT is supplied, it is inserted after the +# '#!' sequence but before initialization text begins. After this +# macro, additional text can be appended to FILE to form the body of +# the child script. The macro ends with non-zero status if the +# file could not be fully written (such as if the disk is full). +m4_ifdef([AS_INIT_GENERATED], +[m4_defun([_LT_GENERATED_FILE_INIT],[AS_INIT_GENERATED($@)])], +[m4_defun([_LT_GENERATED_FILE_INIT], +[m4_require([AS_PREPARE])]dnl +[m4_pushdef([AS_MESSAGE_LOG_FD])]dnl +[lt_write_fail=0 +cat >$1 <<_ASEOF || lt_write_fail=1 +#! $SHELL +# Generated by $as_me. +$2 +SHELL=\${CONFIG_SHELL-$SHELL} +export SHELL +_ASEOF +cat >>$1 <<\_ASEOF || lt_write_fail=1 +AS_SHELL_SANITIZE +_AS_PREPARE +exec AS_MESSAGE_FD>&1 +_ASEOF +test 0 = "$lt_write_fail" && chmod +x $1[]dnl +m4_popdef([AS_MESSAGE_LOG_FD])])])# _LT_GENERATED_FILE_INIT + +# LT_OUTPUT +# --------- +# This macro allows early generation of the libtool script (before +# AC_OUTPUT is called), incase it is used in configure for compilation +# tests. +AC_DEFUN([LT_OUTPUT], +[: ${CONFIG_LT=./config.lt} +AC_MSG_NOTICE([creating $CONFIG_LT]) +_LT_GENERATED_FILE_INIT(["$CONFIG_LT"], +[# Run this file to recreate a libtool stub with the current configuration.]) + +cat >>"$CONFIG_LT" <<\_LTEOF +lt_cl_silent=false +exec AS_MESSAGE_LOG_FD>>config.log +{ + echo + AS_BOX([Running $as_me.]) +} >&AS_MESSAGE_LOG_FD + +lt_cl_help="\ +'$as_me' creates a local libtool stub from the current configuration, +for use in further configure time tests before the real libtool is +generated. + +Usage: $[0] [[OPTIONS]] + + -h, --help print this help, then exit + -V, --version print version number, then exit + -q, --quiet do not print progress messages + -d, --debug don't remove temporary files + +Report bugs to ." + +lt_cl_version="\ +m4_ifset([AC_PACKAGE_NAME], [AC_PACKAGE_NAME ])config.lt[]dnl +m4_ifset([AC_PACKAGE_VERSION], [ AC_PACKAGE_VERSION]) +configured by $[0], generated by m4_PACKAGE_STRING. 
+ +Copyright (C) 2011 Free Software Foundation, Inc. +This config.lt script is free software; the Free Software Foundation +gives unlimited permision to copy, distribute and modify it." + +while test 0 != $[#] +do + case $[1] in + --version | --v* | -V ) + echo "$lt_cl_version"; exit 0 ;; + --help | --h* | -h ) + echo "$lt_cl_help"; exit 0 ;; + --debug | --d* | -d ) + debug=: ;; + --quiet | --q* | --silent | --s* | -q ) + lt_cl_silent=: ;; + + -*) AC_MSG_ERROR([unrecognized option: $[1] +Try '$[0] --help' for more information.]) ;; + + *) AC_MSG_ERROR([unrecognized argument: $[1] +Try '$[0] --help' for more information.]) ;; + esac + shift +done + +if $lt_cl_silent; then + exec AS_MESSAGE_FD>/dev/null +fi +_LTEOF + +cat >>"$CONFIG_LT" <<_LTEOF +_LT_OUTPUT_LIBTOOL_COMMANDS_INIT +_LTEOF + +cat >>"$CONFIG_LT" <<\_LTEOF +AC_MSG_NOTICE([creating $ofile]) +_LT_OUTPUT_LIBTOOL_COMMANDS +AS_EXIT(0) +_LTEOF +chmod +x "$CONFIG_LT" + +# configure is writing to config.log, but config.lt does its own redirection, +# appending to config.log, which fails on DOS, as config.log is still kept +# open by configure. Here we exec the FD to /dev/null, effectively closing +# config.log, so it can be properly (re)opened and appended to by config.lt. +lt_cl_success=: +test yes = "$silent" && + lt_config_lt_args="$lt_config_lt_args --quiet" +exec AS_MESSAGE_LOG_FD>/dev/null +$SHELL "$CONFIG_LT" $lt_config_lt_args || lt_cl_success=false +exec AS_MESSAGE_LOG_FD>>config.log +$lt_cl_success || AS_EXIT(1) +])# LT_OUTPUT + + +# _LT_CONFIG(TAG) +# --------------- +# If TAG is the built-in tag, create an initial libtool script with a +# default configuration from the untagged config vars. Otherwise add code +# to config.status for appending the configuration named by TAG from the +# matching tagged config vars. +m4_defun([_LT_CONFIG], +[m4_require([_LT_FILEUTILS_DEFAULTS])dnl +_LT_CONFIG_SAVE_COMMANDS([ + m4_define([_LT_TAG], m4_if([$1], [], [C], [$1]))dnl + m4_if(_LT_TAG, [C], [ + # See if we are running on zsh, and set the options that allow our + # commands through without removal of \ escapes. + if test -n "${ZSH_VERSION+set}"; then + setopt NO_GLOB_SUBST + fi + + cfgfile=${ofile}T + trap "$RM \"$cfgfile\"; exit 1" 1 2 15 + $RM "$cfgfile" + + cat <<_LT_EOF >> "$cfgfile" +#! $SHELL +# Generated automatically by $as_me ($PACKAGE) $VERSION +# Libtool was configured on host `(hostname || uname -n) 2>/dev/null | sed 1q`: +# NOTE: Changes made to this file will be lost: look at ltmain.sh. + +# Provide generalized library-building support services. +# Written by Gordon Matzigkeit, 1996 + +_LT_COPYING +_LT_LIBTOOL_TAGS + +# Configured defaults for sys_lib_dlsearch_path munging. +: \${LT_SYS_LIBRARY_PATH="$configure_time_lt_sys_library_path"} + +# ### BEGIN LIBTOOL CONFIG +_LT_LIBTOOL_CONFIG_VARS +_LT_LIBTOOL_TAG_VARS +# ### END LIBTOOL CONFIG + +_LT_EOF + + cat <<'_LT_EOF' >> "$cfgfile" + +# ### BEGIN FUNCTIONS SHARED WITH CONFIGURE + +_LT_PREPARE_MUNGE_PATH_LIST +_LT_PREPARE_CC_BASENAME + +# ### END FUNCTIONS SHARED WITH CONFIGURE + +_LT_EOF + + case $host_os in + aix3*) + cat <<\_LT_EOF >> "$cfgfile" +# AIX sometimes has problems with the GCC collect2 program. For some +# reason, if we set the COLLECT_NAMES environment variable, the problems +# vanish in a puff of smoke. +if test set != "${COLLECT_NAMES+set}"; then + COLLECT_NAMES= + export COLLECT_NAMES +fi +_LT_EOF + ;; + esac + + _LT_PROG_LTMAIN + + # We use sed instead of cat because bash on DJGPP gets confused if + # if finds mixed CR/LF and LF-only lines. 
Since sed operates in + # text mode, it properly converts lines to CR/LF. This bash problem + # is reportedly fixed, but why not run on old versions too? + sed '$q' "$ltmain" >> "$cfgfile" \ + || (rm -f "$cfgfile"; exit 1) + + mv -f "$cfgfile" "$ofile" || + (rm -f "$ofile" && cp "$cfgfile" "$ofile" && rm -f "$cfgfile") + chmod +x "$ofile" +], +[cat <<_LT_EOF >> "$ofile" + +dnl Unfortunately we have to use $1 here, since _LT_TAG is not expanded +dnl in a comment (ie after a #). +# ### BEGIN LIBTOOL TAG CONFIG: $1 +_LT_LIBTOOL_TAG_VARS(_LT_TAG) +# ### END LIBTOOL TAG CONFIG: $1 +_LT_EOF +])dnl /m4_if +], +[m4_if([$1], [], [ + PACKAGE='$PACKAGE' + VERSION='$VERSION' + RM='$RM' + ofile='$ofile'], []) +])dnl /_LT_CONFIG_SAVE_COMMANDS +])# _LT_CONFIG + + +# LT_SUPPORTED_TAG(TAG) +# --------------------- +# Trace this macro to discover what tags are supported by the libtool +# --tag option, using: +# autoconf --trace 'LT_SUPPORTED_TAG:$1' +AC_DEFUN([LT_SUPPORTED_TAG], []) + + +# C support is built-in for now +m4_define([_LT_LANG_C_enabled], []) +m4_define([_LT_TAGS], []) + + +# LT_LANG(LANG) +# ------------- +# Enable libtool support for the given language if not already enabled. +AC_DEFUN([LT_LANG], +[AC_BEFORE([$0], [LT_OUTPUT])dnl +m4_case([$1], + [C], [_LT_LANG(C)], + [C++], [_LT_LANG(CXX)], + [Go], [_LT_LANG(GO)], + [Java], [_LT_LANG(GCJ)], + [Fortran 77], [_LT_LANG(F77)], + [Fortran], [_LT_LANG(FC)], + [Windows Resource], [_LT_LANG(RC)], + [m4_ifdef([_LT_LANG_]$1[_CONFIG], + [_LT_LANG($1)], + [m4_fatal([$0: unsupported language: "$1"])])])dnl +])# LT_LANG + + +# _LT_LANG(LANGNAME) +# ------------------ +m4_defun([_LT_LANG], +[m4_ifdef([_LT_LANG_]$1[_enabled], [], + [LT_SUPPORTED_TAG([$1])dnl + m4_append([_LT_TAGS], [$1 ])dnl + m4_define([_LT_LANG_]$1[_enabled], [])dnl + _LT_LANG_$1_CONFIG($1)])dnl +])# _LT_LANG + + +m4_ifndef([AC_PROG_GO], [ +############################################################ +# NOTE: This macro has been submitted for inclusion into # +# GNU Autoconf as AC_PROG_GO. When it is available in # +# a released version of Autoconf we should remove this # +# macro and use it instead. # +############################################################ +m4_defun([AC_PROG_GO], +[AC_LANG_PUSH(Go)dnl +AC_ARG_VAR([GOC], [Go compiler command])dnl +AC_ARG_VAR([GOFLAGS], [Go compiler flags])dnl +_AC_ARG_VAR_LDFLAGS()dnl +AC_CHECK_TOOL(GOC, gccgo) +if test -z "$GOC"; then + if test -n "$ac_tool_prefix"; then + AC_CHECK_PROG(GOC, [${ac_tool_prefix}gccgo], [${ac_tool_prefix}gccgo]) + fi +fi +if test -z "$GOC"; then + AC_CHECK_PROG(GOC, gccgo, gccgo, false) +fi +])#m4_defun +])#m4_ifndef + + +# _LT_LANG_DEFAULT_CONFIG +# ----------------------- +m4_defun([_LT_LANG_DEFAULT_CONFIG], +[AC_PROVIDE_IFELSE([AC_PROG_CXX], + [LT_LANG(CXX)], + [m4_define([AC_PROG_CXX], defn([AC_PROG_CXX])[LT_LANG(CXX)])]) + +AC_PROVIDE_IFELSE([AC_PROG_F77], + [LT_LANG(F77)], + [m4_define([AC_PROG_F77], defn([AC_PROG_F77])[LT_LANG(F77)])]) + +AC_PROVIDE_IFELSE([AC_PROG_FC], + [LT_LANG(FC)], + [m4_define([AC_PROG_FC], defn([AC_PROG_FC])[LT_LANG(FC)])]) + +dnl The call to [A][M_PROG_GCJ] is quoted like that to stop aclocal +dnl pulling things in needlessly. 
+AC_PROVIDE_IFELSE([AC_PROG_GCJ], + [LT_LANG(GCJ)], + [AC_PROVIDE_IFELSE([A][M_PROG_GCJ], + [LT_LANG(GCJ)], + [AC_PROVIDE_IFELSE([LT_PROG_GCJ], + [LT_LANG(GCJ)], + [m4_ifdef([AC_PROG_GCJ], + [m4_define([AC_PROG_GCJ], defn([AC_PROG_GCJ])[LT_LANG(GCJ)])]) + m4_ifdef([A][M_PROG_GCJ], + [m4_define([A][M_PROG_GCJ], defn([A][M_PROG_GCJ])[LT_LANG(GCJ)])]) + m4_ifdef([LT_PROG_GCJ], + [m4_define([LT_PROG_GCJ], defn([LT_PROG_GCJ])[LT_LANG(GCJ)])])])])]) + +AC_PROVIDE_IFELSE([AC_PROG_GO], + [LT_LANG(GO)], + [m4_define([AC_PROG_GO], defn([AC_PROG_GO])[LT_LANG(GO)])]) + +AC_PROVIDE_IFELSE([LT_PROG_RC], + [LT_LANG(RC)], + [m4_define([LT_PROG_RC], defn([LT_PROG_RC])[LT_LANG(RC)])]) +])# _LT_LANG_DEFAULT_CONFIG + +# Obsolete macros: +AU_DEFUN([AC_LIBTOOL_CXX], [LT_LANG(C++)]) +AU_DEFUN([AC_LIBTOOL_F77], [LT_LANG(Fortran 77)]) +AU_DEFUN([AC_LIBTOOL_FC], [LT_LANG(Fortran)]) +AU_DEFUN([AC_LIBTOOL_GCJ], [LT_LANG(Java)]) +AU_DEFUN([AC_LIBTOOL_RC], [LT_LANG(Windows Resource)]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_CXX], []) +dnl AC_DEFUN([AC_LIBTOOL_F77], []) +dnl AC_DEFUN([AC_LIBTOOL_FC], []) +dnl AC_DEFUN([AC_LIBTOOL_GCJ], []) +dnl AC_DEFUN([AC_LIBTOOL_RC], []) + + +# _LT_TAG_COMPILER +# ---------------- +m4_defun([_LT_TAG_COMPILER], +[AC_REQUIRE([AC_PROG_CC])dnl + +_LT_DECL([LTCC], [CC], [1], [A C compiler])dnl +_LT_DECL([LTCFLAGS], [CFLAGS], [1], [LTCC compiler flags])dnl +_LT_TAGDECL([CC], [compiler], [1], [A language specific compiler])dnl +_LT_TAGDECL([with_gcc], [GCC], [0], [Is the compiler the GNU compiler?])dnl + +# If no C compiler was specified, use CC. +LTCC=${LTCC-"$CC"} + +# If no C compiler flags were specified, use CFLAGS. +LTCFLAGS=${LTCFLAGS-"$CFLAGS"} + +# Allow CC to be a program name with arguments. +compiler=$CC +])# _LT_TAG_COMPILER + + +# _LT_COMPILER_BOILERPLATE +# ------------------------ +# Check for compiler boilerplate output or warnings with +# the simple compiler test code. +m4_defun([_LT_COMPILER_BOILERPLATE], +[m4_require([_LT_DECL_SED])dnl +ac_outfile=conftest.$ac_objext +echo "$lt_simple_compile_test_code" >conftest.$ac_ext +eval "$ac_compile" 2>&1 >/dev/null | $SED '/^$/d; /^ *+/d' >conftest.err +_lt_compiler_boilerplate=`cat conftest.err` +$RM conftest* +])# _LT_COMPILER_BOILERPLATE + + +# _LT_LINKER_BOILERPLATE +# ---------------------- +# Check for linker boilerplate output or warnings with +# the simple link test code. 
+m4_defun([_LT_LINKER_BOILERPLATE], +[m4_require([_LT_DECL_SED])dnl +ac_outfile=conftest.$ac_objext +echo "$lt_simple_link_test_code" >conftest.$ac_ext +eval "$ac_link" 2>&1 >/dev/null | $SED '/^$/d; /^ *+/d' >conftest.err +_lt_linker_boilerplate=`cat conftest.err` +$RM -r conftest* +])# _LT_LINKER_BOILERPLATE + +# _LT_REQUIRED_DARWIN_CHECKS +# ------------------------- +m4_defun_once([_LT_REQUIRED_DARWIN_CHECKS],[ + case $host_os in + rhapsody* | darwin*) + AC_CHECK_TOOL([DSYMUTIL], [dsymutil], [:]) + AC_CHECK_TOOL([NMEDIT], [nmedit], [:]) + AC_CHECK_TOOL([LIPO], [lipo], [:]) + AC_CHECK_TOOL([OTOOL], [otool], [:]) + AC_CHECK_TOOL([OTOOL64], [otool64], [:]) + _LT_DECL([], [DSYMUTIL], [1], + [Tool to manipulate archived DWARF debug symbol files on Mac OS X]) + _LT_DECL([], [NMEDIT], [1], + [Tool to change global to local symbols on Mac OS X]) + _LT_DECL([], [LIPO], [1], + [Tool to manipulate fat objects and archives on Mac OS X]) + _LT_DECL([], [OTOOL], [1], + [ldd/readelf like tool for Mach-O binaries on Mac OS X]) + _LT_DECL([], [OTOOL64], [1], + [ldd/readelf like tool for 64 bit Mach-O binaries on Mac OS X 10.4]) + + AC_CACHE_CHECK([for -single_module linker flag],[lt_cv_apple_cc_single_mod], + [lt_cv_apple_cc_single_mod=no + if test -z "$LT_MULTI_MODULE"; then + # By default we will add the -single_module flag. You can override + # by either setting the environment variable LT_MULTI_MODULE + # non-empty at configure time, or by adding -multi_module to the + # link flags. + rm -rf libconftest.dylib* + echo "int foo(void){return 1;}" > conftest.c + echo "$LTCC $LTCFLAGS $LDFLAGS -o libconftest.dylib \ +-dynamiclib -Wl,-single_module conftest.c" >&AS_MESSAGE_LOG_FD + $LTCC $LTCFLAGS $LDFLAGS -o libconftest.dylib \ + -dynamiclib -Wl,-single_module conftest.c 2>conftest.err + _lt_result=$? + # If there is a non-empty error log, and "single_module" + # appears in it, assume the flag caused a linker warning + if test -s conftest.err && $GREP single_module conftest.err; then + cat conftest.err >&AS_MESSAGE_LOG_FD + # Otherwise, if the output was created with a 0 exit code from + # the compiler, it worked. + elif test -f libconftest.dylib && test 0 = "$_lt_result"; then + lt_cv_apple_cc_single_mod=yes + else + cat conftest.err >&AS_MESSAGE_LOG_FD + fi + rm -rf libconftest.dylib* + rm -f conftest.* + fi]) + + AC_CACHE_CHECK([for -exported_symbols_list linker flag], + [lt_cv_ld_exported_symbols_list], + [lt_cv_ld_exported_symbols_list=no + save_LDFLAGS=$LDFLAGS + echo "_main" > conftest.sym + LDFLAGS="$LDFLAGS -Wl,-exported_symbols_list,conftest.sym" + AC_LINK_IFELSE([AC_LANG_PROGRAM([],[])], + [lt_cv_ld_exported_symbols_list=yes], + [lt_cv_ld_exported_symbols_list=no]) + LDFLAGS=$save_LDFLAGS + ]) + + AC_CACHE_CHECK([for -force_load linker flag],[lt_cv_ld_force_load], + [lt_cv_ld_force_load=no + cat > conftest.c << _LT_EOF +int forced_loaded() { return 2;} +_LT_EOF + echo "$LTCC $LTCFLAGS -c -o conftest.o conftest.c" >&AS_MESSAGE_LOG_FD + $LTCC $LTCFLAGS -c -o conftest.o conftest.c 2>&AS_MESSAGE_LOG_FD + echo "$AR cru libconftest.a conftest.o" >&AS_MESSAGE_LOG_FD + $AR cru libconftest.a conftest.o 2>&AS_MESSAGE_LOG_FD + echo "$RANLIB libconftest.a" >&AS_MESSAGE_LOG_FD + $RANLIB libconftest.a 2>&AS_MESSAGE_LOG_FD + cat > conftest.c << _LT_EOF +int main() { return 0;} +_LT_EOF + echo "$LTCC $LTCFLAGS $LDFLAGS -o conftest conftest.c -Wl,-force_load,./libconftest.a" >&AS_MESSAGE_LOG_FD + $LTCC $LTCFLAGS $LDFLAGS -o conftest conftest.c -Wl,-force_load,./libconftest.a 2>conftest.err + _lt_result=$? 
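+    # (Editorial note, not upstream libtool, mirroring the single_module check
+    #  above: if the error log mentions force_load, the linker only warned
+    #  about the flag; a successful link whose output still contains the
+    #  symbol from libconftest.a means the flag really works.)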
+ if test -s conftest.err && $GREP force_load conftest.err; then + cat conftest.err >&AS_MESSAGE_LOG_FD + elif test -f conftest && test 0 = "$_lt_result" && $GREP forced_load conftest >/dev/null 2>&1; then + lt_cv_ld_force_load=yes + else + cat conftest.err >&AS_MESSAGE_LOG_FD + fi + rm -f conftest.err libconftest.a conftest conftest.c + rm -rf conftest.dSYM + ]) + case $host_os in + rhapsody* | darwin1.[[012]]) + _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;; + darwin1.*) + _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;; + darwin*) # darwin 5.x on + # if running on 10.5 or later, the deployment target defaults + # to the OS version, if on x86, and 10.4, the deployment + # target defaults to 10.4. Don't you love it? + case ${MACOSX_DEPLOYMENT_TARGET-10.0},$host in + 10.0,*86*-darwin8*|10.0,*-darwin[[91]]*) + _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;; + 10.[[012]][[,.]]*) + _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;; + 10.*) + _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;; + esac + ;; + esac + if test yes = "$lt_cv_apple_cc_single_mod"; then + _lt_dar_single_mod='$single_module' + fi + if test yes = "$lt_cv_ld_exported_symbols_list"; then + _lt_dar_export_syms=' $wl-exported_symbols_list,$output_objdir/$libname-symbols.expsym' + else + _lt_dar_export_syms='~$NMEDIT -s $output_objdir/$libname-symbols.expsym $lib' + fi + if test : != "$DSYMUTIL" && test no = "$lt_cv_ld_force_load"; then + _lt_dsymutil='~$DSYMUTIL $lib || :' + else + _lt_dsymutil= + fi + ;; + esac +]) + + +# _LT_DARWIN_LINKER_FEATURES([TAG]) +# --------------------------------- +# Checks for linker and compiler features on darwin +m4_defun([_LT_DARWIN_LINKER_FEATURES], +[ + m4_require([_LT_REQUIRED_DARWIN_CHECKS]) + _LT_TAGVAR(archive_cmds_need_lc, $1)=no + _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_automatic, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=unsupported + if test yes = "$lt_cv_ld_force_load"; then + _LT_TAGVAR(whole_archive_flag_spec, $1)='`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience $wl-force_load,$conv\"; done; func_echo_all \"$new_convenience\"`' + m4_case([$1], [F77], [_LT_TAGVAR(compiler_needs_object, $1)=yes], + [FC], [_LT_TAGVAR(compiler_needs_object, $1)=yes]) + else + _LT_TAGVAR(whole_archive_flag_spec, $1)='' + fi + _LT_TAGVAR(link_all_deplibs, $1)=yes + _LT_TAGVAR(allow_undefined_flag, $1)=$_lt_dar_allow_undefined + case $cc_basename in + ifort*|nagfor*) _lt_dar_can_shared=yes ;; + *) _lt_dar_can_shared=$GCC ;; + esac + if test yes = "$_lt_dar_can_shared"; then + output_verbose_link_cmd=func_echo_all + _LT_TAGVAR(archive_cmds, $1)="\$CC -dynamiclib \$allow_undefined_flag -o \$lib \$libobjs \$deplibs \$compiler_flags -install_name \$rpath/\$soname \$verstring $_lt_dar_single_mod$_lt_dsymutil" + _LT_TAGVAR(module_cmds, $1)="\$CC \$allow_undefined_flag -o \$lib -bundle \$libobjs \$deplibs \$compiler_flags$_lt_dsymutil" + _LT_TAGVAR(archive_expsym_cmds, $1)="sed 's|^|_|' < \$export_symbols > \$output_objdir/\$libname-symbols.expsym~\$CC -dynamiclib \$allow_undefined_flag -o \$lib \$libobjs \$deplibs \$compiler_flags -install_name \$rpath/\$soname \$verstring $_lt_dar_single_mod$_lt_dar_export_syms$_lt_dsymutil" + _LT_TAGVAR(module_expsym_cmds, $1)="sed -e 's|^|_|' < \$export_symbols > \$output_objdir/\$libname-symbols.expsym~\$CC \$allow_undefined_flag -o \$lib -bundle \$libobjs \$deplibs \$compiler_flags$_lt_dar_export_syms$_lt_dsymutil" + 
m4_if([$1], [CXX], +[ if test yes != "$lt_cv_apple_cc_single_mod"; then + _LT_TAGVAR(archive_cmds, $1)="\$CC -r -keep_private_externs -nostdlib -o \$lib-master.o \$libobjs~\$CC -dynamiclib \$allow_undefined_flag -o \$lib \$lib-master.o \$deplibs \$compiler_flags -install_name \$rpath/\$soname \$verstring$_lt_dsymutil" + _LT_TAGVAR(archive_expsym_cmds, $1)="sed 's|^|_|' < \$export_symbols > \$output_objdir/\$libname-symbols.expsym~\$CC -r -keep_private_externs -nostdlib -o \$lib-master.o \$libobjs~\$CC -dynamiclib \$allow_undefined_flag -o \$lib \$lib-master.o \$deplibs \$compiler_flags -install_name \$rpath/\$soname \$verstring$_lt_dar_export_syms$_lt_dsymutil" + fi +],[]) + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi +]) + +# _LT_SYS_MODULE_PATH_AIX([TAGNAME]) +# ---------------------------------- +# Links a minimal program and checks the executable +# for the system default hardcoded library path. In most cases, +# this is /usr/lib:/lib, but when the MPI compilers are used +# the location of the communication and MPI libs are included too. +# If we don't find anything, use the default library path according +# to the aix ld manual. +# Store the results from the different compilers for each TAGNAME. +# Allow to override them for all tags through lt_cv_aix_libpath. +m4_defun([_LT_SYS_MODULE_PATH_AIX], +[m4_require([_LT_DECL_SED])dnl +if test set = "${lt_cv_aix_libpath+set}"; then + aix_libpath=$lt_cv_aix_libpath +else + AC_CACHE_VAL([_LT_TAGVAR([lt_cv_aix_libpath_], [$1])], + [AC_LINK_IFELSE([AC_LANG_PROGRAM],[ + lt_aix_libpath_sed='[ + /Import File Strings/,/^$/ { + /^0/ { + s/^0 *\([^ ]*\) *$/\1/ + p + } + }]' + _LT_TAGVAR([lt_cv_aix_libpath_], [$1])=`dump -H conftest$ac_exeext 2>/dev/null | $SED -n -e "$lt_aix_libpath_sed"` + # Check for a 64-bit object if we didn't find anything. + if test -z "$_LT_TAGVAR([lt_cv_aix_libpath_], [$1])"; then + _LT_TAGVAR([lt_cv_aix_libpath_], [$1])=`dump -HX64 conftest$ac_exeext 2>/dev/null | $SED -n -e "$lt_aix_libpath_sed"` + fi],[]) + if test -z "$_LT_TAGVAR([lt_cv_aix_libpath_], [$1])"; then + _LT_TAGVAR([lt_cv_aix_libpath_], [$1])=/usr/lib:/lib + fi + ]) + aix_libpath=$_LT_TAGVAR([lt_cv_aix_libpath_], [$1]) +fi +])# _LT_SYS_MODULE_PATH_AIX + + +# _LT_SHELL_INIT(ARG) +# ------------------- +m4_define([_LT_SHELL_INIT], +[m4_divert_text([M4SH-INIT], [$1 +])])# _LT_SHELL_INIT + + + +# _LT_PROG_ECHO_BACKSLASH +# ----------------------- +# Find how we can fake an echo command that does not interpret backslash. +# In particular, with Autoconf 2.60 or later we add some code to the start +# of the generated configure script that will find a shell with a builtin +# printf (that we can use as an echo command). +m4_defun([_LT_PROG_ECHO_BACKSLASH], +[ECHO='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\' +ECHO=$ECHO$ECHO$ECHO$ECHO$ECHO +ECHO=$ECHO$ECHO$ECHO$ECHO$ECHO$ECHO + +AC_MSG_CHECKING([how to print strings]) +# Test print first, because it will be a builtin if present. +if test "X`( print -r -- -n ) 2>/dev/null`" = X-n && \ + test "X`print -r -- $ECHO 2>/dev/null`" = "X$ECHO"; then + ECHO='print -r --' +elif test "X`printf %s $ECHO 2>/dev/null`" = "X$ECHO"; then + ECHO='printf %s\n' +else + # Use this function as a fallback that always works. + func_fallback_echo () + { + eval 'cat <<_LTECHO_EOF +$[]1 +_LTECHO_EOF' + } + ECHO='func_fallback_echo' +fi + +# func_echo_all arg... +# Invoke $ECHO with all args, space-separated. 
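+# Illustrative usage (editorial sketch, not from upstream libtool; the flag
+# values shown are hypothetical):
+#   func_echo_all "-L/usr/local/lib" "-lestr"
+# prints the single line: -L/usr/local/lib -lestr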
+func_echo_all () +{ + $ECHO "$*" +} + +case $ECHO in + printf*) AC_MSG_RESULT([printf]) ;; + print*) AC_MSG_RESULT([print -r]) ;; + *) AC_MSG_RESULT([cat]) ;; +esac + +m4_ifdef([_AS_DETECT_SUGGESTED], +[_AS_DETECT_SUGGESTED([ + test -n "${ZSH_VERSION+set}${BASH_VERSION+set}" || ( + ECHO='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\' + ECHO=$ECHO$ECHO$ECHO$ECHO$ECHO + ECHO=$ECHO$ECHO$ECHO$ECHO$ECHO$ECHO + PATH=/empty FPATH=/empty; export PATH FPATH + test "X`printf %s $ECHO`" = "X$ECHO" \ + || test "X`print -r -- $ECHO`" = "X$ECHO" )])]) + +_LT_DECL([], [SHELL], [1], [Shell to use when invoking shell scripts]) +_LT_DECL([], [ECHO], [1], [An echo program that protects backslashes]) +])# _LT_PROG_ECHO_BACKSLASH + + +# _LT_WITH_SYSROOT +# ---------------- +AC_DEFUN([_LT_WITH_SYSROOT], +[AC_MSG_CHECKING([for sysroot]) +AC_ARG_WITH([sysroot], +[AS_HELP_STRING([--with-sysroot@<:@=DIR@:>@], + [Search for dependent libraries within DIR (or the compiler's sysroot + if not specified).])], +[], [with_sysroot=no]) + +dnl lt_sysroot will always be passed unquoted. We quote it here +dnl in case the user passed a directory name. +lt_sysroot= +case $with_sysroot in #( + yes) + if test yes = "$GCC"; then + lt_sysroot=`$CC --print-sysroot 2>/dev/null` + fi + ;; #( + /*) + lt_sysroot=`echo "$with_sysroot" | sed -e "$sed_quote_subst"` + ;; #( + no|'') + ;; #( + *) + AC_MSG_RESULT([$with_sysroot]) + AC_MSG_ERROR([The sysroot must be an absolute path.]) + ;; +esac + + AC_MSG_RESULT([${lt_sysroot:-no}]) +_LT_DECL([], [lt_sysroot], [0], [The root where to search for ]dnl +[dependent libraries, and where our libraries should be installed.])]) + +# _LT_ENABLE_LOCK +# --------------- +m4_defun([_LT_ENABLE_LOCK], +[AC_ARG_ENABLE([libtool-lock], + [AS_HELP_STRING([--disable-libtool-lock], + [avoid locking (might break parallel builds)])]) +test no = "$enable_libtool_lock" || enable_libtool_lock=yes + +# Some flags need to be propagated to the compiler or linker for good +# libtool support. +case $host in +ia64-*-hpux*) + # Find out what ABI is being produced by ac_compile, and set mode + # options accordingly. + echo 'int i;' > conftest.$ac_ext + if AC_TRY_EVAL(ac_compile); then + case `/usr/bin/file conftest.$ac_objext` in + *ELF-32*) + HPUX_IA64_MODE=32 + ;; + *ELF-64*) + HPUX_IA64_MODE=64 + ;; + esac + fi + rm -rf conftest* + ;; +*-*-irix6*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. + echo '[#]line '$LINENO' "configure"' > conftest.$ac_ext + if AC_TRY_EVAL(ac_compile); then + if test yes = "$lt_cv_prog_gnu_ld"; then + case `/usr/bin/file conftest.$ac_objext` in + *32-bit*) + LD="${LD-ld} -melf32bsmip" + ;; + *N32*) + LD="${LD-ld} -melf32bmipn32" + ;; + *64-bit*) + LD="${LD-ld} -melf64bmip" + ;; + esac + else + case `/usr/bin/file conftest.$ac_objext` in + *32-bit*) + LD="${LD-ld} -32" + ;; + *N32*) + LD="${LD-ld} -n32" + ;; + *64-bit*) + LD="${LD-ld} -64" + ;; + esac + fi + fi + rm -rf conftest* + ;; + +mips64*-*linux*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. 
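+  # (Editorial illustration, not upstream: the probe below builds a GNU ld
+  #  emulation name from the object's attributes; e.g. a hypothetical 64-bit
+  #  big-endian n64 object would set LD="${LD-ld} -m elf64btsmip".)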
+ echo '[#]line '$LINENO' "configure"' > conftest.$ac_ext + if AC_TRY_EVAL(ac_compile); then + emul=elf + case `/usr/bin/file conftest.$ac_objext` in + *32-bit*) + emul="${emul}32" + ;; + *64-bit*) + emul="${emul}64" + ;; + esac + case `/usr/bin/file conftest.$ac_objext` in + *MSB*) + emul="${emul}btsmip" + ;; + *LSB*) + emul="${emul}ltsmip" + ;; + esac + case `/usr/bin/file conftest.$ac_objext` in + *N32*) + emul="${emul}n32" + ;; + esac + LD="${LD-ld} -m $emul" + fi + rm -rf conftest* + ;; + +x86_64-*kfreebsd*-gnu|x86_64-*linux*|powerpc*-*linux*| \ +s390*-*linux*|s390*-*tpf*|sparc*-*linux*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. Note that the listed cases only cover the + # situations where additional linker options are needed (such as when + # doing 32-bit compilation for a host where ld defaults to 64-bit, or + # vice versa); the common cases where no linker options are needed do + # not appear in the list. + echo 'int i;' > conftest.$ac_ext + if AC_TRY_EVAL(ac_compile); then + case `/usr/bin/file conftest.o` in + *32-bit*) + case $host in + x86_64-*kfreebsd*-gnu) + LD="${LD-ld} -m elf_i386_fbsd" + ;; + x86_64-*linux*) + case `/usr/bin/file conftest.o` in + *x86-64*) + LD="${LD-ld} -m elf32_x86_64" + ;; + *) + LD="${LD-ld} -m elf_i386" + ;; + esac + ;; + powerpc64le-*linux*) + LD="${LD-ld} -m elf32lppclinux" + ;; + powerpc64-*linux*) + LD="${LD-ld} -m elf32ppclinux" + ;; + s390x-*linux*) + LD="${LD-ld} -m elf_s390" + ;; + sparc64-*linux*) + LD="${LD-ld} -m elf32_sparc" + ;; + esac + ;; + *64-bit*) + case $host in + x86_64-*kfreebsd*-gnu) + LD="${LD-ld} -m elf_x86_64_fbsd" + ;; + x86_64-*linux*) + LD="${LD-ld} -m elf_x86_64" + ;; + powerpcle-*linux*) + LD="${LD-ld} -m elf64lppc" + ;; + powerpc-*linux*) + LD="${LD-ld} -m elf64ppc" + ;; + s390*-*linux*|s390*-*tpf*) + LD="${LD-ld} -m elf64_s390" + ;; + sparc*-*linux*) + LD="${LD-ld} -m elf64_sparc" + ;; + esac + ;; + esac + fi + rm -rf conftest* + ;; + +*-*-sco3.2v5*) + # On SCO OpenServer 5, we need -belf to get full-featured binaries. + SAVE_CFLAGS=$CFLAGS + CFLAGS="$CFLAGS -belf" + AC_CACHE_CHECK([whether the C compiler needs -belf], lt_cv_cc_needs_belf, + [AC_LANG_PUSH(C) + AC_LINK_IFELSE([AC_LANG_PROGRAM([[]],[[]])],[lt_cv_cc_needs_belf=yes],[lt_cv_cc_needs_belf=no]) + AC_LANG_POP]) + if test yes != "$lt_cv_cc_needs_belf"; then + # this is probably gcc 2.8.0, egcs 1.0 or newer; no need for -belf + CFLAGS=$SAVE_CFLAGS + fi + ;; +*-*solaris*) + # Find out what ABI is being produced by ac_compile, and set linker + # options accordingly. + echo 'int i;' > conftest.$ac_ext + if AC_TRY_EVAL(ac_compile); then + case `/usr/bin/file conftest.o` in + *64-bit*) + case $lt_cv_prog_gnu_ld in + yes*) + case $host in + i?86-*-solaris*|x86_64-*-solaris*) + LD="${LD-ld} -m elf_x86_64" + ;; + sparc*-*-solaris*) + LD="${LD-ld} -m elf64_sparc" + ;; + esac + # GNU ld 2.21 introduced _sol2 emulations. Use them if available. 
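+        # (Editorial illustration, not upstream: on 64-bit x86 Solaris with a
+        #  new enough GNU ld this rewrites LD="ld -m elf_x86_64" to
+        #  LD="ld -m elf_x86_64_sol2".)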
+ if ${LD-ld} -V | grep _sol2 >/dev/null 2>&1; then + LD=${LD-ld}_sol2 + fi + ;; + *) + if ${LD-ld} -64 -r -o conftest2.o conftest.o >/dev/null 2>&1; then + LD="${LD-ld} -64" + fi + ;; + esac + ;; + esac + fi + rm -rf conftest* + ;; +esac + +need_locks=$enable_libtool_lock +])# _LT_ENABLE_LOCK + + +# _LT_PROG_AR +# ----------- +m4_defun([_LT_PROG_AR], +[AC_CHECK_TOOLS(AR, [ar], false) +: ${AR=ar} +: ${AR_FLAGS=cru} +_LT_DECL([], [AR], [1], [The archiver]) +_LT_DECL([], [AR_FLAGS], [1], [Flags to create an archive]) + +AC_CACHE_CHECK([for archiver @FILE support], [lt_cv_ar_at_file], + [lt_cv_ar_at_file=no + AC_COMPILE_IFELSE([AC_LANG_PROGRAM], + [echo conftest.$ac_objext > conftest.lst + lt_ar_try='$AR $AR_FLAGS libconftest.a @conftest.lst >&AS_MESSAGE_LOG_FD' + AC_TRY_EVAL([lt_ar_try]) + if test 0 -eq "$ac_status"; then + # Ensure the archiver fails upon bogus file names. + rm -f conftest.$ac_objext libconftest.a + AC_TRY_EVAL([lt_ar_try]) + if test 0 -ne "$ac_status"; then + lt_cv_ar_at_file=@ + fi + fi + rm -f conftest.* libconftest.a + ]) + ]) + +if test no = "$lt_cv_ar_at_file"; then + archiver_list_spec= +else + archiver_list_spec=$lt_cv_ar_at_file +fi +_LT_DECL([], [archiver_list_spec], [1], + [How to feed a file listing to the archiver]) +])# _LT_PROG_AR + + +# _LT_CMD_OLD_ARCHIVE +# ------------------- +m4_defun([_LT_CMD_OLD_ARCHIVE], +[_LT_PROG_AR + +AC_CHECK_TOOL(STRIP, strip, :) +test -z "$STRIP" && STRIP=: +_LT_DECL([], [STRIP], [1], [A symbol stripping program]) + +AC_CHECK_TOOL(RANLIB, ranlib, :) +test -z "$RANLIB" && RANLIB=: +_LT_DECL([], [RANLIB], [1], + [Commands used to install an old-style archive]) + +# Determine commands to create old-style static archives. +old_archive_cmds='$AR $AR_FLAGS $oldlib$oldobjs' +old_postinstall_cmds='chmod 644 $oldlib' +old_postuninstall_cmds= + +if test -n "$RANLIB"; then + case $host_os in + bitrig* | openbsd*) + old_postinstall_cmds="$old_postinstall_cmds~\$RANLIB -t \$tool_oldlib" + ;; + *) + old_postinstall_cmds="$old_postinstall_cmds~\$RANLIB \$tool_oldlib" + ;; + esac + old_archive_cmds="$old_archive_cmds~\$RANLIB \$tool_oldlib" +fi + +case $host_os in + darwin*) + lock_old_archive_extraction=yes ;; + *) + lock_old_archive_extraction=no ;; +esac +_LT_DECL([], [old_postinstall_cmds], [2]) +_LT_DECL([], [old_postuninstall_cmds], [2]) +_LT_TAGDECL([], [old_archive_cmds], [2], + [Commands used to build an old-style archive]) +_LT_DECL([], [lock_old_archive_extraction], [0], + [Whether to use a lock for old archive extraction]) +])# _LT_CMD_OLD_ARCHIVE + + +# _LT_COMPILER_OPTION(MESSAGE, VARIABLE-NAME, FLAGS, +# [OUTPUT-FILE], [ACTION-SUCCESS], [ACTION-FAILURE]) +# ---------------------------------------------------------------- +# Check whether the given compiler option works +AC_DEFUN([_LT_COMPILER_OPTION], +[m4_require([_LT_FILEUTILS_DEFAULTS])dnl +m4_require([_LT_DECL_SED])dnl +AC_CACHE_CHECK([$1], [$2], + [$2=no + m4_if([$4], , [ac_outfile=conftest.$ac_objext], [ac_outfile=$4]) + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + lt_compiler_flag="$3" ## exclude from sc_useless_quotes_in_assignment + # Insert the option either (1) after the last *FLAGS variable, or + # (2) before a word containing "conftest.", or (3) at the end. + # Note that $ac_compile itself does not contain backslashes and begins + # with a dollar sign (not a hyphen), so the echo should work correctly. + # The option is referenced via a variable to avoid confusing sed. 
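+   # (Editorial sketch, not upstream libtool: with autoconf's usual
+   #  ac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5', the sed
+   #  program below produces
+   #  '$CC -c $CFLAGS $CPPFLAGS $lt_compiler_flag conftest.$ac_ext >&5',
+   #  so the flag is expanded only when the command is evaluated.)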
+ lt_compile=`echo "$ac_compile" | $SED \ + -e 's:.*FLAGS}\{0,1\} :&$lt_compiler_flag :; t' \ + -e 's: [[^ ]]*conftest\.: $lt_compiler_flag&:; t' \ + -e 's:$: $lt_compiler_flag:'` + (eval echo "\"\$as_me:$LINENO: $lt_compile\"" >&AS_MESSAGE_LOG_FD) + (eval "$lt_compile" 2>conftest.err) + ac_status=$? + cat conftest.err >&AS_MESSAGE_LOG_FD + echo "$as_me:$LINENO: \$? = $ac_status" >&AS_MESSAGE_LOG_FD + if (exit $ac_status) && test -s "$ac_outfile"; then + # The compiler can only warn and ignore the option if not recognized + # So say no if there are warnings other than the usual output. + $ECHO "$_lt_compiler_boilerplate" | $SED '/^$/d' >conftest.exp + $SED '/^$/d; /^ *+/d' conftest.err >conftest.er2 + if test ! -s conftest.er2 || diff conftest.exp conftest.er2 >/dev/null; then + $2=yes + fi + fi + $RM conftest* +]) + +if test yes = "[$]$2"; then + m4_if([$5], , :, [$5]) +else + m4_if([$6], , :, [$6]) +fi +])# _LT_COMPILER_OPTION + +# Old name: +AU_ALIAS([AC_LIBTOOL_COMPILER_OPTION], [_LT_COMPILER_OPTION]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_COMPILER_OPTION], []) + + +# _LT_LINKER_OPTION(MESSAGE, VARIABLE-NAME, FLAGS, +# [ACTION-SUCCESS], [ACTION-FAILURE]) +# ---------------------------------------------------- +# Check whether the given linker option works +AC_DEFUN([_LT_LINKER_OPTION], +[m4_require([_LT_FILEUTILS_DEFAULTS])dnl +m4_require([_LT_DECL_SED])dnl +AC_CACHE_CHECK([$1], [$2], + [$2=no + save_LDFLAGS=$LDFLAGS + LDFLAGS="$LDFLAGS $3" + echo "$lt_simple_link_test_code" > conftest.$ac_ext + if (eval $ac_link 2>conftest.err) && test -s conftest$ac_exeext; then + # The linker can only warn and ignore the option if not recognized + # So say no if there are warnings + if test -s conftest.err; then + # Append any errors to the config.log. + cat conftest.err 1>&AS_MESSAGE_LOG_FD + $ECHO "$_lt_linker_boilerplate" | $SED '/^$/d' > conftest.exp + $SED '/^$/d; /^ *+/d' conftest.err >conftest.er2 + if diff conftest.exp conftest.er2 >/dev/null; then + $2=yes + fi + else + $2=yes + fi + fi + $RM -r conftest* + LDFLAGS=$save_LDFLAGS +]) + +if test yes = "[$]$2"; then + m4_if([$4], , :, [$4]) +else + m4_if([$5], , :, [$5]) +fi +])# _LT_LINKER_OPTION + +# Old name: +AU_ALIAS([AC_LIBTOOL_LINKER_OPTION], [_LT_LINKER_OPTION]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_LINKER_OPTION], []) + + +# LT_CMD_MAX_LEN +#--------------- +AC_DEFUN([LT_CMD_MAX_LEN], +[AC_REQUIRE([AC_CANONICAL_HOST])dnl +# find the maximum length of command line arguments +AC_MSG_CHECKING([the maximum length of command line arguments]) +AC_CACHE_VAL([lt_cv_sys_max_cmd_len], [dnl + i=0 + teststring=ABCD + + case $build_os in + msdosdjgpp*) + # On DJGPP, this test can blow up pretty badly due to problems in libc + # (any single argument exceeding 2000 bytes causes a buffer overrun + # during glob expansion). Even if it were fixed, the result of this + # check would be larger than it should be. + lt_cv_sys_max_cmd_len=12288; # 12K is about right + ;; + + gnu*) + # Under GNU Hurd, this test is not required because there is + # no limit to the length of command line arguments. + # Libtool will interpret -1 as no limit whatsoever + lt_cv_sys_max_cmd_len=-1; + ;; + + cygwin* | mingw* | cegcc*) + # On Win9x/ME, this test blows up -- it succeeds, but takes + # about 5 minutes as the teststring grows exponentially. 
+ # Worse, since 9x/ME are not pre-emptively multitasking, + # you end up with a "frozen" computer, even though with patience + # the test eventually succeeds (with a max line length of 256k). + # Instead, let's just punt: use the minimum linelength reported by + # all of the supported platforms: 8192 (on NT/2K/XP). + lt_cv_sys_max_cmd_len=8192; + ;; + + mint*) + # On MiNT this can take a long time and run out of memory. + lt_cv_sys_max_cmd_len=8192; + ;; + + amigaos*) + # On AmigaOS with pdksh, this test takes hours, literally. + # So we just punt and use a minimum line length of 8192. + lt_cv_sys_max_cmd_len=8192; + ;; + + bitrig* | darwin* | dragonfly* | freebsd* | netbsd* | openbsd*) + # This has been around since 386BSD, at least. Likely further. + if test -x /sbin/sysctl; then + lt_cv_sys_max_cmd_len=`/sbin/sysctl -n kern.argmax` + elif test -x /usr/sbin/sysctl; then + lt_cv_sys_max_cmd_len=`/usr/sbin/sysctl -n kern.argmax` + else + lt_cv_sys_max_cmd_len=65536 # usable default for all BSDs + fi + # And add a safety zone + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \/ 4` + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \* 3` + ;; + + interix*) + # We know the value 262144 and hardcode it with a safety zone (like BSD) + lt_cv_sys_max_cmd_len=196608 + ;; + + os2*) + # The test takes a long time on OS/2. + lt_cv_sys_max_cmd_len=8192 + ;; + + osf*) + # Dr. Hans Ekkehard Plesser reports seeing a kernel panic running configure + # due to this test when exec_disable_arg_limit is 1 on Tru64. It is not + # nice to cause kernel panics so lets avoid the loop below. + # First set a reasonable default. + lt_cv_sys_max_cmd_len=16384 + # + if test -x /sbin/sysconfig; then + case `/sbin/sysconfig -q proc exec_disable_arg_limit` in + *1*) lt_cv_sys_max_cmd_len=-1 ;; + esac + fi + ;; + sco3.2v5*) + lt_cv_sys_max_cmd_len=102400 + ;; + sysv5* | sco5v6* | sysv4.2uw2*) + kargmax=`grep ARG_MAX /etc/conf/cf.d/stune 2>/dev/null` + if test -n "$kargmax"; then + lt_cv_sys_max_cmd_len=`echo $kargmax | sed 's/.*[[ ]]//'` + else + lt_cv_sys_max_cmd_len=32768 + fi + ;; + *) + lt_cv_sys_max_cmd_len=`(getconf ARG_MAX) 2> /dev/null` + if test -n "$lt_cv_sys_max_cmd_len" && \ + test undefined != "$lt_cv_sys_max_cmd_len"; then + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \/ 4` + lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \* 3` + else + # Make teststring a little bigger before we do anything with it. + # a 1K string should be a reasonable start. + for i in 1 2 3 4 5 6 7 8; do + teststring=$teststring$teststring + done + SHELL=${SHELL-${CONFIG_SHELL-/bin/sh}} + # If test is not a shell built-in, we'll probably end up computing a + # maximum length that is only half of the actual maximum length, but + # we can't tell. + while { test X`env echo "$teststring$teststring" 2>/dev/null` \ + = "X$teststring$teststring"; } >/dev/null 2>&1 && + test 17 != "$i" # 1/2 MB should be enough + do + i=`expr $i + 1` + teststring=$teststring$teststring + done + # Only check the string length outside the loop. + lt_cv_sys_max_cmd_len=`expr "X$teststring" : ".*" 2>&1` + teststring= + # Add a significant safety factor because C++ compilers can tack on + # massive amounts of additional arguments before passing them to the + # linker. It appears as though 1/2 is a usable value. 
+ lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len \/ 2` + fi + ;; + esac +]) +if test -n "$lt_cv_sys_max_cmd_len"; then + AC_MSG_RESULT($lt_cv_sys_max_cmd_len) +else + AC_MSG_RESULT(none) +fi +max_cmd_len=$lt_cv_sys_max_cmd_len +_LT_DECL([], [max_cmd_len], [0], + [What is the maximum length of a command?]) +])# LT_CMD_MAX_LEN + +# Old name: +AU_ALIAS([AC_LIBTOOL_SYS_MAX_CMD_LEN], [LT_CMD_MAX_LEN]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_SYS_MAX_CMD_LEN], []) + + +# _LT_HEADER_DLFCN +# ---------------- +m4_defun([_LT_HEADER_DLFCN], +[AC_CHECK_HEADERS([dlfcn.h], [], [], [AC_INCLUDES_DEFAULT])dnl +])# _LT_HEADER_DLFCN + + +# _LT_TRY_DLOPEN_SELF (ACTION-IF-TRUE, ACTION-IF-TRUE-W-USCORE, +# ACTION-IF-FALSE, ACTION-IF-CROSS-COMPILING) +# ---------------------------------------------------------------- +m4_defun([_LT_TRY_DLOPEN_SELF], +[m4_require([_LT_HEADER_DLFCN])dnl +if test yes = "$cross_compiling"; then : + [$4] +else + lt_dlunknown=0; lt_dlno_uscore=1; lt_dlneed_uscore=2 + lt_status=$lt_dlunknown + cat > conftest.$ac_ext <<_LT_EOF +[#line $LINENO "configure" +#include "confdefs.h" + +#if HAVE_DLFCN_H +#include +#endif + +#include + +#ifdef RTLD_GLOBAL +# define LT_DLGLOBAL RTLD_GLOBAL +#else +# ifdef DL_GLOBAL +# define LT_DLGLOBAL DL_GLOBAL +# else +# define LT_DLGLOBAL 0 +# endif +#endif + +/* We may have to define LT_DLLAZY_OR_NOW in the command line if we + find out it does not work in some platform. */ +#ifndef LT_DLLAZY_OR_NOW +# ifdef RTLD_LAZY +# define LT_DLLAZY_OR_NOW RTLD_LAZY +# else +# ifdef DL_LAZY +# define LT_DLLAZY_OR_NOW DL_LAZY +# else +# ifdef RTLD_NOW +# define LT_DLLAZY_OR_NOW RTLD_NOW +# else +# ifdef DL_NOW +# define LT_DLLAZY_OR_NOW DL_NOW +# else +# define LT_DLLAZY_OR_NOW 0 +# endif +# endif +# endif +# endif +#endif + +/* When -fvisibility=hidden is used, assume the code has been annotated + correspondingly for the symbols needed. */ +#if defined __GNUC__ && (((__GNUC__ == 3) && (__GNUC_MINOR__ >= 3)) || (__GNUC__ > 3)) +int fnord () __attribute__((visibility("default"))); +#endif + +int fnord () { return 42; } +int main () +{ + void *self = dlopen (0, LT_DLGLOBAL|LT_DLLAZY_OR_NOW); + int status = $lt_dlunknown; + + if (self) + { + if (dlsym (self,"fnord")) status = $lt_dlno_uscore; + else + { + if (dlsym( self,"_fnord")) status = $lt_dlneed_uscore; + else puts (dlerror ()); + } + /* dlclose (self); */ + } + else + puts (dlerror ()); + + return status; +}] +_LT_EOF + if AC_TRY_EVAL(ac_link) && test -s "conftest$ac_exeext" 2>/dev/null; then + (./conftest; exit; ) >&AS_MESSAGE_LOG_FD 2>/dev/null + lt_status=$? 
+ case x$lt_status in + x$lt_dlno_uscore) $1 ;; + x$lt_dlneed_uscore) $2 ;; + x$lt_dlunknown|x*) $3 ;; + esac + else : + # compilation failed + $3 + fi +fi +rm -fr conftest* +])# _LT_TRY_DLOPEN_SELF + + +# LT_SYS_DLOPEN_SELF +# ------------------ +AC_DEFUN([LT_SYS_DLOPEN_SELF], +[m4_require([_LT_HEADER_DLFCN])dnl +if test yes != "$enable_dlopen"; then + enable_dlopen=unknown + enable_dlopen_self=unknown + enable_dlopen_self_static=unknown +else + lt_cv_dlopen=no + lt_cv_dlopen_libs= + + case $host_os in + beos*) + lt_cv_dlopen=load_add_on + lt_cv_dlopen_libs= + lt_cv_dlopen_self=yes + ;; + + mingw* | pw32* | cegcc*) + lt_cv_dlopen=LoadLibrary + lt_cv_dlopen_libs= + ;; + + cygwin*) + lt_cv_dlopen=dlopen + lt_cv_dlopen_libs= + ;; + + darwin*) + # if libdl is installed we need to link against it + AC_CHECK_LIB([dl], [dlopen], + [lt_cv_dlopen=dlopen lt_cv_dlopen_libs=-ldl],[ + lt_cv_dlopen=dyld + lt_cv_dlopen_libs= + lt_cv_dlopen_self=yes + ]) + ;; + + tpf*) + # Don't try to run any link tests for TPF. We know it's impossible + # because TPF is a cross-compiler, and we know how we open DSOs. + lt_cv_dlopen=dlopen + lt_cv_dlopen_libs= + lt_cv_dlopen_self=no + ;; + + *) + AC_CHECK_FUNC([shl_load], + [lt_cv_dlopen=shl_load], + [AC_CHECK_LIB([dld], [shl_load], + [lt_cv_dlopen=shl_load lt_cv_dlopen_libs=-ldld], + [AC_CHECK_FUNC([dlopen], + [lt_cv_dlopen=dlopen], + [AC_CHECK_LIB([dl], [dlopen], + [lt_cv_dlopen=dlopen lt_cv_dlopen_libs=-ldl], + [AC_CHECK_LIB([svld], [dlopen], + [lt_cv_dlopen=dlopen lt_cv_dlopen_libs=-lsvld], + [AC_CHECK_LIB([dld], [dld_link], + [lt_cv_dlopen=dld_link lt_cv_dlopen_libs=-ldld]) + ]) + ]) + ]) + ]) + ]) + ;; + esac + + if test no = "$lt_cv_dlopen"; then + enable_dlopen=no + else + enable_dlopen=yes + fi + + case $lt_cv_dlopen in + dlopen) + save_CPPFLAGS=$CPPFLAGS + test yes = "$ac_cv_header_dlfcn_h" && CPPFLAGS="$CPPFLAGS -DHAVE_DLFCN_H" + + save_LDFLAGS=$LDFLAGS + wl=$lt_prog_compiler_wl eval LDFLAGS=\"\$LDFLAGS $export_dynamic_flag_spec\" + + save_LIBS=$LIBS + LIBS="$lt_cv_dlopen_libs $LIBS" + + AC_CACHE_CHECK([whether a program can dlopen itself], + lt_cv_dlopen_self, [dnl + _LT_TRY_DLOPEN_SELF( + lt_cv_dlopen_self=yes, lt_cv_dlopen_self=yes, + lt_cv_dlopen_self=no, lt_cv_dlopen_self=cross) + ]) + + if test yes = "$lt_cv_dlopen_self"; then + wl=$lt_prog_compiler_wl eval LDFLAGS=\"\$LDFLAGS $lt_prog_compiler_static\" + AC_CACHE_CHECK([whether a statically linked program can dlopen itself], + lt_cv_dlopen_self_static, [dnl + _LT_TRY_DLOPEN_SELF( + lt_cv_dlopen_self_static=yes, lt_cv_dlopen_self_static=yes, + lt_cv_dlopen_self_static=no, lt_cv_dlopen_self_static=cross) + ]) + fi + + CPPFLAGS=$save_CPPFLAGS + LDFLAGS=$save_LDFLAGS + LIBS=$save_LIBS + ;; + esac + + case $lt_cv_dlopen_self in + yes|no) enable_dlopen_self=$lt_cv_dlopen_self ;; + *) enable_dlopen_self=unknown ;; + esac + + case $lt_cv_dlopen_self_static in + yes|no) enable_dlopen_self_static=$lt_cv_dlopen_self_static ;; + *) enable_dlopen_self_static=unknown ;; + esac +fi +_LT_DECL([dlopen_support], [enable_dlopen], [0], + [Whether dlopen is supported]) +_LT_DECL([dlopen_self], [enable_dlopen_self], [0], + [Whether dlopen of programs is supported]) +_LT_DECL([dlopen_self_static], [enable_dlopen_self_static], [0], + [Whether dlopen of statically linked programs is supported]) +])# LT_SYS_DLOPEN_SELF + +# Old name: +AU_ALIAS([AC_LIBTOOL_DLOPEN_SELF], [LT_SYS_DLOPEN_SELF]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_DLOPEN_SELF], []) + + +# _LT_COMPILER_C_O([TAGNAME]) +# 
--------------------------- +# Check to see if options -c and -o are simultaneously supported by compiler. +# This macro does not hard code the compiler like AC_PROG_CC_C_O. +m4_defun([_LT_COMPILER_C_O], +[m4_require([_LT_DECL_SED])dnl +m4_require([_LT_FILEUTILS_DEFAULTS])dnl +m4_require([_LT_TAG_COMPILER])dnl +AC_CACHE_CHECK([if $compiler supports -c -o file.$ac_objext], + [_LT_TAGVAR(lt_cv_prog_compiler_c_o, $1)], + [_LT_TAGVAR(lt_cv_prog_compiler_c_o, $1)=no + $RM -r conftest 2>/dev/null + mkdir conftest + cd conftest + mkdir out + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + + lt_compiler_flag="-o out/conftest2.$ac_objext" + # Insert the option either (1) after the last *FLAGS variable, or + # (2) before a word containing "conftest.", or (3) at the end. + # Note that $ac_compile itself does not contain backslashes and begins + # with a dollar sign (not a hyphen), so the echo should work correctly. + lt_compile=`echo "$ac_compile" | $SED \ + -e 's:.*FLAGS}\{0,1\} :&$lt_compiler_flag :; t' \ + -e 's: [[^ ]]*conftest\.: $lt_compiler_flag&:; t' \ + -e 's:$: $lt_compiler_flag:'` + (eval echo "\"\$as_me:$LINENO: $lt_compile\"" >&AS_MESSAGE_LOG_FD) + (eval "$lt_compile" 2>out/conftest.err) + ac_status=$? + cat out/conftest.err >&AS_MESSAGE_LOG_FD + echo "$as_me:$LINENO: \$? = $ac_status" >&AS_MESSAGE_LOG_FD + if (exit $ac_status) && test -s out/conftest2.$ac_objext + then + # The compiler can only warn and ignore the option if not recognized + # So say no if there are warnings + $ECHO "$_lt_compiler_boilerplate" | $SED '/^$/d' > out/conftest.exp + $SED '/^$/d; /^ *+/d' out/conftest.err >out/conftest.er2 + if test ! -s out/conftest.er2 || diff out/conftest.exp out/conftest.er2 >/dev/null; then + _LT_TAGVAR(lt_cv_prog_compiler_c_o, $1)=yes + fi + fi + chmod u+w . 2>&AS_MESSAGE_LOG_FD + $RM conftest* + # SGI C++ compiler will create directory out/ii_files/ for + # template instantiation + test -d out/ii_files && $RM out/ii_files/* && rmdir out/ii_files + $RM out/* && rmdir out + cd .. + $RM -r conftest + $RM conftest* +]) +_LT_TAGDECL([compiler_c_o], [lt_cv_prog_compiler_c_o], [1], + [Does compiler simultaneously support -c and -o options?]) +])# _LT_COMPILER_C_O + + +# _LT_COMPILER_FILE_LOCKS([TAGNAME]) +# ---------------------------------- +# Check to see if we can do hard links to lock some files if needed +m4_defun([_LT_COMPILER_FILE_LOCKS], +[m4_require([_LT_ENABLE_LOCK])dnl +m4_require([_LT_FILEUTILS_DEFAULTS])dnl +_LT_COMPILER_C_O([$1]) + +hard_links=nottested +if test no = "$_LT_TAGVAR(lt_cv_prog_compiler_c_o, $1)" && test no != "$need_locks"; then + # do not overwrite the value of need_locks provided by the user + AC_MSG_CHECKING([if we can lock with hard links]) + hard_links=yes + $RM conftest* + ln conftest.a conftest.b 2>/dev/null && hard_links=no + touch conftest.a + ln conftest.a conftest.b 2>&5 || hard_links=no + ln conftest.a conftest.b 2>/dev/null && hard_links=no + AC_MSG_RESULT([$hard_links]) + if test no = "$hard_links"; then + AC_MSG_WARN(['$CC' does not support '-c -o', so 'make -j' may be unsafe]) + need_locks=warn + fi +else + need_locks=no +fi +_LT_DECL([], [need_locks], [1], [Must we lock files when doing compilation?]) +])# _LT_COMPILER_FILE_LOCKS + + +# _LT_CHECK_OBJDIR +# ---------------- +m4_defun([_LT_CHECK_OBJDIR], +[AC_CACHE_CHECK([for objdir], [lt_cv_objdir], +[rm -f .libs 2>/dev/null +mkdir .libs 2>/dev/null +if test -d .libs; then + lt_cv_objdir=.libs +else + # MS-DOS does not allow filenames that begin with a dot. 
+ lt_cv_objdir=_libs +fi +rmdir .libs 2>/dev/null]) +objdir=$lt_cv_objdir +_LT_DECL([], [objdir], [0], + [The name of the directory that contains temporary libtool files])dnl +m4_pattern_allow([LT_OBJDIR])dnl +AC_DEFINE_UNQUOTED([LT_OBJDIR], "$lt_cv_objdir/", + [Define to the sub-directory where libtool stores uninstalled libraries.]) +])# _LT_CHECK_OBJDIR + + +# _LT_LINKER_HARDCODE_LIBPATH([TAGNAME]) +# -------------------------------------- +# Check hardcoding attributes. +m4_defun([_LT_LINKER_HARDCODE_LIBPATH], +[AC_MSG_CHECKING([how to hardcode library paths into programs]) +_LT_TAGVAR(hardcode_action, $1)= +if test -n "$_LT_TAGVAR(hardcode_libdir_flag_spec, $1)" || + test -n "$_LT_TAGVAR(runpath_var, $1)" || + test yes = "$_LT_TAGVAR(hardcode_automatic, $1)"; then + + # We can hardcode non-existent directories. + if test no != "$_LT_TAGVAR(hardcode_direct, $1)" && + # If the only mechanism to avoid hardcoding is shlibpath_var, we + # have to relink, otherwise we might link with an installed library + # when we should be linking with a yet-to-be-installed one + ## test no != "$_LT_TAGVAR(hardcode_shlibpath_var, $1)" && + test no != "$_LT_TAGVAR(hardcode_minus_L, $1)"; then + # Linking always hardcodes the temporary library directory. + _LT_TAGVAR(hardcode_action, $1)=relink + else + # We can link without hardcoding, and we can hardcode nonexisting dirs. + _LT_TAGVAR(hardcode_action, $1)=immediate + fi +else + # We cannot hardcode anything, or else we can only hardcode existing + # directories. + _LT_TAGVAR(hardcode_action, $1)=unsupported +fi +AC_MSG_RESULT([$_LT_TAGVAR(hardcode_action, $1)]) + +if test relink = "$_LT_TAGVAR(hardcode_action, $1)" || + test yes = "$_LT_TAGVAR(inherit_rpath, $1)"; then + # Fast installation is not supported + enable_fast_install=no +elif test yes = "$shlibpath_overrides_runpath" || + test no = "$enable_shared"; then + # Fast installation is not necessary + enable_fast_install=needless +fi +_LT_TAGDECL([], [hardcode_action], [0], + [How to hardcode a shared library path into an executable]) +])# _LT_LINKER_HARDCODE_LIBPATH + + +# _LT_CMD_STRIPLIB +# ---------------- +m4_defun([_LT_CMD_STRIPLIB], +[m4_require([_LT_DECL_EGREP]) +striplib= +old_striplib= +AC_MSG_CHECKING([whether stripping libraries is possible]) +if test -n "$STRIP" && $STRIP -V 2>&1 | $GREP "GNU strip" >/dev/null; then + test -z "$old_striplib" && old_striplib="$STRIP --strip-debug" + test -z "$striplib" && striplib="$STRIP --strip-unneeded" + AC_MSG_RESULT([yes]) +else +# FIXME - insert some real tests, host_os isn't really good enough + case $host_os in + darwin*) + if test -n "$STRIP"; then + striplib="$STRIP -x" + old_striplib="$STRIP -S" + AC_MSG_RESULT([yes]) + else + AC_MSG_RESULT([no]) + fi + ;; + *) + AC_MSG_RESULT([no]) + ;; + esac +fi +_LT_DECL([], [old_striplib], [1], [Commands to strip libraries]) +_LT_DECL([], [striplib], [1]) +])# _LT_CMD_STRIPLIB + + +# _LT_PREPARE_MUNGE_PATH_LIST +# --------------------------- +# Make sure func_munge_path_list() is defined correctly. 
+m4_defun([_LT_PREPARE_MUNGE_PATH_LIST], +[[# func_munge_path_list VARIABLE PATH +# ----------------------------------- +# VARIABLE is name of variable containing _space_ separated list of +# directories to be munged by the contents of PATH, which is string +# having a format: +# "DIR[:DIR]:" +# string "DIR[ DIR]" will be prepended to VARIABLE +# ":DIR[:DIR]" +# string "DIR[ DIR]" will be appended to VARIABLE +# "DIRP[:DIRP]::[DIRA:]DIRA" +# string "DIRP[ DIRP]" will be prepended to VARIABLE and string +# "DIRA[ DIRA]" will be appended to VARIABLE +# "DIR[:DIR]" +# VARIABLE will be replaced by "DIR[ DIR]" +func_munge_path_list () +{ + case x@S|@2 in + x) + ;; + *:) + eval @S|@1=\"`$ECHO @S|@2 | $SED 's/:/ /g'` \@S|@@S|@1\" + ;; + x:*) + eval @S|@1=\"\@S|@@S|@1 `$ECHO @S|@2 | $SED 's/:/ /g'`\" + ;; + *::*) + eval @S|@1=\"\@S|@@S|@1\ `$ECHO @S|@2 | $SED -e 's/.*:://' -e 's/:/ /g'`\" + eval @S|@1=\"`$ECHO @S|@2 | $SED -e 's/::.*//' -e 's/:/ /g'`\ \@S|@@S|@1\" + ;; + *) + eval @S|@1=\"`$ECHO @S|@2 | $SED 's/:/ /g'`\" + ;; + esac +} +]])# _LT_PREPARE_PATH_LIST + + +# _LT_SYS_DYNAMIC_LINKER([TAG]) +# ----------------------------- +# PORTME Fill in your ld.so characteristics +m4_defun([_LT_SYS_DYNAMIC_LINKER], +[AC_REQUIRE([AC_CANONICAL_HOST])dnl +m4_require([_LT_DECL_EGREP])dnl +m4_require([_LT_FILEUTILS_DEFAULTS])dnl +m4_require([_LT_DECL_OBJDUMP])dnl +m4_require([_LT_DECL_SED])dnl +m4_require([_LT_CHECK_SHELL_FEATURES])dnl +m4_require([_LT_PREPARE_MUNGE_PATH_LIST])dnl +AC_MSG_CHECKING([dynamic linker characteristics]) +m4_if([$1], + [], [ +if test yes = "$GCC"; then + case $host_os in + darwin*) lt_awk_arg='/^libraries:/,/LR/' ;; + *) lt_awk_arg='/^libraries:/' ;; + esac + case $host_os in + mingw* | cegcc*) lt_sed_strip_eq='s|=\([[A-Za-z]]:\)|\1|g' ;; + *) lt_sed_strip_eq='s|=/|/|g' ;; + esac + lt_search_path_spec=`$CC -print-search-dirs | awk $lt_awk_arg | $SED -e "s/^libraries://" -e $lt_sed_strip_eq` + case $lt_search_path_spec in + *\;*) + # if the path contains ";" then we assume it to be the separator + # otherwise default to the standard path separator (i.e. ":") - it is + # assumed that no part of a normal pathname contains ";" but that should + # okay in the real world where ";" in dirpaths is itself problematic. + lt_search_path_spec=`$ECHO "$lt_search_path_spec" | $SED 's/;/ /g'` + ;; + *) + lt_search_path_spec=`$ECHO "$lt_search_path_spec" | $SED "s/$PATH_SEPARATOR/ /g"` + ;; + esac + # Ok, now we have the path, separated by spaces, we can step through it + # and add multilib dir if necessary... + lt_tmp_lt_search_path_spec= + lt_multi_os_dir=/`$CC $CPPFLAGS $CFLAGS $LDFLAGS -print-multi-os-directory 2>/dev/null` + # ...but if some path component already ends with the multilib dir we assume + # that all is fine and trust -print-search-dirs as is (GCC 4.2? or newer). 
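+  # (Editorial illustration, not upstream: on a hypothetical x86_64 multilib
+  #  host where "$CC -print-multi-os-directory" prints "../lib64",
+  #  lt_multi_os_dir becomes "/../lib64", so "/usr/lib" is recorded as
+  #  "/usr/lib/../lib64" when that directory exists.)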
+ case "$lt_multi_os_dir; $lt_search_path_spec " in + "/; "* | "/.; "* | "/./; "* | *"$lt_multi_os_dir "* | *"$lt_multi_os_dir/ "*) + lt_multi_os_dir= + ;; + esac + for lt_sys_path in $lt_search_path_spec; do + if test -d "$lt_sys_path$lt_multi_os_dir"; then + lt_tmp_lt_search_path_spec="$lt_tmp_lt_search_path_spec $lt_sys_path$lt_multi_os_dir" + elif test -n "$lt_multi_os_dir"; then + test -d "$lt_sys_path" && \ + lt_tmp_lt_search_path_spec="$lt_tmp_lt_search_path_spec $lt_sys_path" + fi + done + lt_search_path_spec=`$ECHO "$lt_tmp_lt_search_path_spec" | awk ' +BEGIN {RS = " "; FS = "/|\n";} { + lt_foo = ""; + lt_count = 0; + for (lt_i = NF; lt_i > 0; lt_i--) { + if ($lt_i != "" && $lt_i != ".") { + if ($lt_i == "..") { + lt_count++; + } else { + if (lt_count == 0) { + lt_foo = "/" $lt_i lt_foo; + } else { + lt_count--; + } + } + } + } + if (lt_foo != "") { lt_freq[[lt_foo]]++; } + if (lt_freq[[lt_foo]] == 1) { print lt_foo; } +}'` + # AWK program above erroneously prepends '/' to C:/dos/paths + # for these hosts. + case $host_os in + mingw* | cegcc*) lt_search_path_spec=`$ECHO "$lt_search_path_spec" |\ + $SED 's|/\([[A-Za-z]]:\)|\1|g'` ;; + esac + sys_lib_search_path_spec=`$ECHO "$lt_search_path_spec" | $lt_NL2SP` +else + sys_lib_search_path_spec="/lib /usr/lib /usr/local/lib" +fi]) +library_names_spec= +libname_spec='lib$name' +soname_spec= +shrext_cmds=.so +postinstall_cmds= +postuninstall_cmds= +finish_cmds= +finish_eval= +shlibpath_var= +shlibpath_overrides_runpath=unknown +version_type=none +dynamic_linker="$host_os ld.so" +sys_lib_dlsearch_path_spec="/lib /usr/lib" +need_lib_prefix=unknown +hardcode_into_libs=no + +# when you set need_version to no, make sure it does not cause -set_version +# flags to be left without arguments +need_version=unknown + +AC_ARG_VAR([LT_SYS_LIBRARY_PATH], +[User-defined run-time library search path.]) + +case $host_os in +aix3*) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname.a' + shlibpath_var=LIBPATH + + # AIX 3 has no versioning support, so we append a major version to the name. + soname_spec='$libname$release$shared_ext$major' + ;; + +aix[[4-9]]*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + hardcode_into_libs=yes + if test ia64 = "$host_cpu"; then + # AIX 5 supports IA64 + library_names_spec='$libname$release$shared_ext$major $libname$release$shared_ext$versuffix $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + else + # With GCC up to 2.95.x, collect2 would create an import file + # for dependence libraries. The import file would start with + # the line '#! .'. This would cause the generated library to + # depend on '.', always an invalid library. This was fixed in + # development snapshots of GCC prior to 3.0. + case $host_os in + aix4 | aix4.[[01]] | aix4.[[01]].*) + if { echo '#if __GNUC__ > 2 || (__GNUC__ == 2 && __GNUC_MINOR__ >= 97)' + echo ' yes ' + echo '#endif'; } | $CC -E - | $GREP yes > /dev/null; then + : + else + can_build_shared=no + fi + ;; + esac + # Using Import Files as archive members, it is possible to support + # filename-based versioning of shared library archives on AIX. While + # this would work for both with and without runtime linking, it will + # prevent static linking of such archives. So we do filename-based + # shared library versioning with .so extension only, which is used + # when both runtime linking and shared linking is enabled. 
+ # Unfortunately, runtime linking may impact performance, so we do + # not want this to be the default eventually. Also, we use the + # versioned .so libs for executables only if there is the -brtl + # linker flag in LDFLAGS as well, or --with-aix-soname=svr4 only. + # To allow for filename-based versioning support, we need to create + # libNAME.so.V as an archive file, containing: + # *) an Import File, referring to the versioned filename of the + # archive as well as the shared archive member, telling the + # bitwidth (32 or 64) of that shared object, and providing the + # list of exported symbols of that shared object, eventually + # decorated with the 'weak' keyword + # *) the shared object with the F_LOADONLY flag set, to really avoid + # it being seen by the linker. + # At run time we better use the real file rather than another symlink, + # but for link time we create the symlink libNAME.so -> libNAME.so.V + + case $with_aix_soname,$aix_use_runtimelinking in + # AIX (on Power*) has no versioning support, so currently we cannot hardcode correct + # soname into executable. Probably we can add versioning support to + # collect2, so additional links can be useful in future. + aix,yes) # traditional libtool + dynamic_linker='AIX unversionable lib.so' + # If using run time linking (on AIX 4.2 or later) use lib.so + # instead of lib.a to let people know that these are not + # typical AIX shared libraries. + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + ;; + aix,no) # traditional AIX only + dynamic_linker='AIX lib.a[(]lib.so.V[)]' + # We preserve .a as extension for shared libraries through AIX4.2 + # and later when we are not doing run time linking. + library_names_spec='$libname$release.a $libname.a' + soname_spec='$libname$release$shared_ext$major' + ;; + svr4,*) # full svr4 only + dynamic_linker="AIX lib.so.V[(]$shared_archive_member_spec.o[)]" + library_names_spec='$libname$release$shared_ext$major $libname$shared_ext' + # We do not specify a path in Import Files, so LIBPATH fires. + shlibpath_overrides_runpath=yes + ;; + *,yes) # both, prefer svr4 + dynamic_linker="AIX lib.so.V[(]$shared_archive_member_spec.o[)], lib.a[(]lib.so.V[)]" + library_names_spec='$libname$release$shared_ext$major $libname$shared_ext' + # unpreferred sharedlib libNAME.a needs extra handling + postinstall_cmds='test -n "$linkname" || linkname="$realname"~func_stripname "" ".so" "$linkname"~$install_shared_prog "$dir/$func_stripname_result.$libext" "$destdir/$func_stripname_result.$libext"~test -z "$tstripme" || test -z "$striplib" || $striplib "$destdir/$func_stripname_result.$libext"' + postuninstall_cmds='for n in $library_names $old_library; do :; done~func_stripname "" ".so" "$n"~test "$func_stripname_result" = "$n" || func_append rmfiles " $odir/$func_stripname_result.$libext"' + # We do not specify a path in Import Files, so LIBPATH fires. 
+ shlibpath_overrides_runpath=yes + ;; + *,no) # both, prefer aix + dynamic_linker="AIX lib.a[(]lib.so.V[)], lib.so.V[(]$shared_archive_member_spec.o[)]" + library_names_spec='$libname$release.a $libname.a' + soname_spec='$libname$release$shared_ext$major' + # unpreferred sharedlib libNAME.so.V and symlink libNAME.so need extra handling + postinstall_cmds='test -z "$dlname" || $install_shared_prog $dir/$dlname $destdir/$dlname~test -z "$tstripme" || test -z "$striplib" || $striplib $destdir/$dlname~test -n "$linkname" || linkname=$realname~func_stripname "" ".a" "$linkname"~(cd "$destdir" && $LN_S -f $dlname $func_stripname_result.so)' + postuninstall_cmds='test -z "$dlname" || func_append rmfiles " $odir/$dlname"~for n in $old_library $library_names; do :; done~func_stripname "" ".a" "$n"~func_append rmfiles " $odir/$func_stripname_result.so"' + ;; + esac + shlibpath_var=LIBPATH + fi + ;; + +amigaos*) + case $host_cpu in + powerpc) + # Since July 2007 AmigaOS4 officially supports .so libraries. + # When compiling the executable, add -use-dynld -Lsobjs: to the compileline. + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + ;; + m68k) + library_names_spec='$libname.ixlibrary $libname.a' + # Create ${libname}_ixlibrary.a entries in /sys/libs. + finish_eval='for lib in `ls $libdir/*.ixlibrary 2>/dev/null`; do libname=`func_echo_all "$lib" | $SED '\''s%^.*/\([[^/]]*\)\.ixlibrary$%\1%'\''`; $RM /sys/libs/${libname}_ixlibrary.a; $show "cd /sys/libs && $LN_S $lib ${libname}_ixlibrary.a"; cd /sys/libs && $LN_S $lib ${libname}_ixlibrary.a || exit 1; done' + ;; + esac + ;; + +beos*) + library_names_spec='$libname$shared_ext' + dynamic_linker="$host_os ld.so" + shlibpath_var=LIBRARY_PATH + ;; + +bsdi[[45]]*) + version_type=linux # correct to gnu/linux during the next big refactor + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + finish_cmds='PATH="\$PATH:/sbin" ldconfig $libdir' + shlibpath_var=LD_LIBRARY_PATH + sys_lib_search_path_spec="/shlib /usr/lib /usr/X11/lib /usr/contrib/lib /lib /usr/local/lib" + sys_lib_dlsearch_path_spec="/shlib /usr/lib /usr/local/lib" + # the default ld.so.conf also contains /usr/contrib/lib and + # /usr/X11R6/lib (/usr/X11 is a link to /usr/X11R6), but let us allow + # libtool to hard-code these into programs + ;; + +cygwin* | mingw* | pw32* | cegcc*) + version_type=windows + shrext_cmds=.dll + need_version=no + need_lib_prefix=no + + case $GCC,$cc_basename in + yes,*) + # gcc + library_names_spec='$libname.dll.a' + # DLL is installed to $(libdir)/../bin by postinstall_cmds + postinstall_cmds='base_file=`basename \$file`~ + dlpath=`$SHELL 2>&1 -c '\''. $dir/'\''\$base_file'\''i; echo \$dlname'\''`~ + dldir=$destdir/`dirname \$dlpath`~ + test -d \$dldir || mkdir -p \$dldir~ + $install_prog $dir/$dlname \$dldir/$dlname~ + chmod a+x \$dldir/$dlname~ + if test -n '\''$stripme'\'' && test -n '\''$striplib'\''; then + eval '\''$striplib \$dldir/$dlname'\'' || exit \$?; + fi' + postuninstall_cmds='dldll=`$SHELL 2>&1 -c '\''. 
$file; echo \$dlname'\''`~ + dlpath=$dir/\$dldll~ + $RM \$dlpath' + shlibpath_overrides_runpath=yes + + case $host_os in + cygwin*) + # Cygwin DLLs use 'cyg' prefix rather than 'lib' + soname_spec='`echo $libname | sed -e 's/^lib/cyg/'``echo $release | $SED -e 's/[[.]]/-/g'`$versuffix$shared_ext' +m4_if([$1], [],[ + sys_lib_search_path_spec="$sys_lib_search_path_spec /usr/lib/w32api"]) + ;; + mingw* | cegcc*) + # MinGW DLLs use traditional 'lib' prefix + soname_spec='$libname`echo $release | $SED -e 's/[[.]]/-/g'`$versuffix$shared_ext' + ;; + pw32*) + # pw32 DLLs use 'pw' prefix rather than 'lib' + library_names_spec='`echo $libname | sed -e 's/^lib/pw/'``echo $release | $SED -e 's/[[.]]/-/g'`$versuffix$shared_ext' + ;; + esac + dynamic_linker='Win32 ld.exe' + ;; + + *,cl*) + # Native MSVC + libname_spec='$name' + soname_spec='$libname`echo $release | $SED -e 's/[[.]]/-/g'`$versuffix$shared_ext' + library_names_spec='$libname.dll.lib' + + case $build_os in + mingw*) + sys_lib_search_path_spec= + lt_save_ifs=$IFS + IFS=';' + for lt_path in $LIB + do + IFS=$lt_save_ifs + # Let DOS variable expansion print the short 8.3 style file name. + lt_path=`cd "$lt_path" 2>/dev/null && cmd //C "for %i in (".") do @echo %~si"` + sys_lib_search_path_spec="$sys_lib_search_path_spec $lt_path" + done + IFS=$lt_save_ifs + # Convert to MSYS style. + sys_lib_search_path_spec=`$ECHO "$sys_lib_search_path_spec" | sed -e 's|\\\\|/|g' -e 's| \\([[a-zA-Z]]\\):| /\\1|g' -e 's|^ ||'` + ;; + cygwin*) + # Convert to unix form, then to dos form, then back to unix form + # but this time dos style (no spaces!) so that the unix form looks + # like /cygdrive/c/PROGRA~1:/cygdr... + sys_lib_search_path_spec=`cygpath --path --unix "$LIB"` + sys_lib_search_path_spec=`cygpath --path --dos "$sys_lib_search_path_spec" 2>/dev/null` + sys_lib_search_path_spec=`cygpath --path --unix "$sys_lib_search_path_spec" | $SED -e "s/$PATH_SEPARATOR/ /g"` + ;; + *) + sys_lib_search_path_spec=$LIB + if $ECHO "$sys_lib_search_path_spec" | [$GREP ';[c-zC-Z]:/' >/dev/null]; then + # It is most probably a Windows format PATH. + sys_lib_search_path_spec=`$ECHO "$sys_lib_search_path_spec" | $SED -e 's/;/ /g'` + else + sys_lib_search_path_spec=`$ECHO "$sys_lib_search_path_spec" | $SED -e "s/$PATH_SEPARATOR/ /g"` + fi + # FIXME: find the short name or the path components, as spaces are + # common. (e.g. "Program Files" -> "PROGRA~1") + ;; + esac + + # DLL is installed to $(libdir)/../bin by postinstall_cmds + postinstall_cmds='base_file=`basename \$file`~ + dlpath=`$SHELL 2>&1 -c '\''. $dir/'\''\$base_file'\''i; echo \$dlname'\''`~ + dldir=$destdir/`dirname \$dlpath`~ + test -d \$dldir || mkdir -p \$dldir~ + $install_prog $dir/$dlname \$dldir/$dlname' + postuninstall_cmds='dldll=`$SHELL 2>&1 -c '\''. $file; echo \$dlname'\''`~ + dlpath=$dir/\$dldll~ + $RM \$dlpath' + shlibpath_overrides_runpath=yes + dynamic_linker='Win32 link.exe' + ;; + + *) + # Assume MSVC wrapper + library_names_spec='$libname`echo $release | $SED -e 's/[[.]]/-/g'`$versuffix$shared_ext $libname.lib' + dynamic_linker='Win32 ld.exe' + ;; + esac + # FIXME: first we should search . 
and the directory the executable is in + shlibpath_var=PATH + ;; + +darwin* | rhapsody*) + dynamic_linker="$host_os dyld" + version_type=darwin + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$major$shared_ext $libname$shared_ext' + soname_spec='$libname$release$major$shared_ext' + shlibpath_overrides_runpath=yes + shlibpath_var=DYLD_LIBRARY_PATH + shrext_cmds='`test .$module = .yes && echo .so || echo .dylib`' +m4_if([$1], [],[ + sys_lib_search_path_spec="$sys_lib_search_path_spec /usr/local/lib"]) + sys_lib_dlsearch_path_spec='/usr/local/lib /lib /usr/lib' + ;; + +dgux*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + ;; + +freebsd* | dragonfly*) + # DragonFly does not have aout. When/if they implement a new + # versioning mechanism, adjust this. + if test -x /usr/bin/objformat; then + objformat=`/usr/bin/objformat` + else + case $host_os in + freebsd[[23]].*) objformat=aout ;; + *) objformat=elf ;; + esac + fi + version_type=freebsd-$objformat + case $version_type in + freebsd-elf*) + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + need_version=no + need_lib_prefix=no + ;; + freebsd-*) + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + need_version=yes + ;; + esac + shlibpath_var=LD_LIBRARY_PATH + case $host_os in + freebsd2.*) + shlibpath_overrides_runpath=yes + ;; + freebsd3.[[01]]* | freebsdelf3.[[01]]*) + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + ;; + freebsd3.[[2-9]]* | freebsdelf3.[[2-9]]* | \ + freebsd4.[[0-5]] | freebsdelf4.[[0-5]] | freebsd4.1.1 | freebsdelf4.1.1) + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + ;; + *) # from 4.6 on, and DragonFly + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + ;; + esac + ;; + +haiku*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + dynamic_linker="$host_os runtime_loader" + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LIBRARY_PATH + shlibpath_overrides_runpath=no + sys_lib_dlsearch_path_spec='/boot/home/config/lib /boot/common/lib /boot/system/lib' + hardcode_into_libs=yes + ;; + +hpux9* | hpux10* | hpux11*) + # Give a soname corresponding to the major version so that dld.sl refuses to + # link against other versions. + version_type=sunos + need_lib_prefix=no + need_version=no + case $host_cpu in + ia64*) + shrext_cmds='.so' + hardcode_into_libs=yes + dynamic_linker="$host_os dld.so" + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes # Unless +noenvvar is specified. 
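Throughout this table, shlibpath_overrides_runpath records whether the search path taken from the environment variable named in shlibpath_var takes precedence over a run path hard-coded into the executable. A rough sketch of the run-time effect only, with hypothetical prog/libfoo names (not a command these macros run):

    # shlibpath_overrides_runpath=yes: the environment wins at run time
    LD_LIBRARY_PATH=/opt/newlibs ./prog   # resolves libfoo from /opt/newlibs first,
                                          # even though prog carries an embedded run path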
+ library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + if test 32 = "$HPUX_IA64_MODE"; then + sys_lib_search_path_spec="/usr/lib/hpux32 /usr/local/lib/hpux32 /usr/local/lib" + sys_lib_dlsearch_path_spec=/usr/lib/hpux32 + else + sys_lib_search_path_spec="/usr/lib/hpux64 /usr/local/lib/hpux64" + sys_lib_dlsearch_path_spec=/usr/lib/hpux64 + fi + ;; + hppa*64*) + shrext_cmds='.sl' + hardcode_into_libs=yes + dynamic_linker="$host_os dld.sl" + shlibpath_var=LD_LIBRARY_PATH # How should we handle SHLIB_PATH + shlibpath_overrides_runpath=yes # Unless +noenvvar is specified. + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + sys_lib_search_path_spec="/usr/lib/pa20_64 /usr/ccs/lib/pa20_64" + sys_lib_dlsearch_path_spec=$sys_lib_search_path_spec + ;; + *) + shrext_cmds='.sl' + dynamic_linker="$host_os dld.sl" + shlibpath_var=SHLIB_PATH + shlibpath_overrides_runpath=no # +s is required to enable SHLIB_PATH + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + ;; + esac + # HP-UX runs *really* slowly unless shared libraries are mode 555, ... + postinstall_cmds='chmod 555 $lib' + # or fails outright, so override atomically: + install_override_mode=555 + ;; + +interix[[3-9]]*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + dynamic_linker='Interix 3.x ld.so.1 (PE, like ELF)' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + ;; + +irix5* | irix6* | nonstopux*) + case $host_os in + nonstopux*) version_type=nonstopux ;; + *) + if test yes = "$lt_cv_prog_gnu_ld"; then + version_type=linux # correct to gnu/linux during the next big refactor + else + version_type=irix + fi ;; + esac + need_lib_prefix=no + need_version=no + soname_spec='$libname$release$shared_ext$major' + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$release$shared_ext $libname$shared_ext' + case $host_os in + irix5* | nonstopux*) + libsuff= shlibsuff= + ;; + *) + case $LD in # libtool.m4 will add one of these switches to LD + *-32|*"-32 "|*-melf32bsmip|*"-melf32bsmip ") + libsuff= shlibsuff= libmagic=32-bit;; + *-n32|*"-n32 "|*-melf32bmipn32|*"-melf32bmipn32 ") + libsuff=32 shlibsuff=N32 libmagic=N32;; + *-64|*"-64 "|*-melf64bmip|*"-melf64bmip ") + libsuff=64 shlibsuff=64 libmagic=64-bit;; + *) libsuff= shlibsuff= libmagic=never-match;; + esac + ;; + esac + shlibpath_var=LD_LIBRARY${shlibsuff}_PATH + shlibpath_overrides_runpath=no + sys_lib_search_path_spec="/usr/lib$libsuff /lib$libsuff /usr/local/lib$libsuff" + sys_lib_dlsearch_path_spec="/usr/lib$libsuff /lib$libsuff" + hardcode_into_libs=yes + ;; + +# No shared lib support for Linux oldld, aout, or coff. +linux*oldld* | linux*aout* | linux*coff*) + dynamic_linker=no + ;; + +linux*android*) + version_type=none # Android doesn't support versioned libraries. 
+ need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext' + soname_spec='$libname$release$shared_ext' + finish_cmds= + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + + # This implies no fast_install, which is unacceptable. + # Some rework will be needed to allow for fast_install + # before this can be enabled. + hardcode_into_libs=yes + + dynamic_linker='Android linker' + # Don't embed -rpath directories since the linker doesn't support them. + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + ;; + +# This must be glibc/ELF. +linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + finish_cmds='PATH="\$PATH:/sbin" ldconfig -n $libdir' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + + # Some binutils ld are patched to set DT_RUNPATH + AC_CACHE_VAL([lt_cv_shlibpath_overrides_runpath], + [lt_cv_shlibpath_overrides_runpath=no + save_LDFLAGS=$LDFLAGS + save_libdir=$libdir + eval "libdir=/foo; wl=\"$_LT_TAGVAR(lt_prog_compiler_wl, $1)\"; \ + LDFLAGS=\"\$LDFLAGS $_LT_TAGVAR(hardcode_libdir_flag_spec, $1)\"" + AC_LINK_IFELSE([AC_LANG_PROGRAM([],[])], + [AS_IF([ ($OBJDUMP -p conftest$ac_exeext) 2>/dev/null | grep "RUNPATH.*$libdir" >/dev/null], + [lt_cv_shlibpath_overrides_runpath=yes])]) + LDFLAGS=$save_LDFLAGS + libdir=$save_libdir + ]) + shlibpath_overrides_runpath=$lt_cv_shlibpath_overrides_runpath + + # This implies no fast_install, which is unacceptable. + # Some rework will be needed to allow for fast_install + # before this can be enabled. + hardcode_into_libs=yes + + # Ideally, we could use ldconfig to report *all* directores which are + # searched for libraries, however this is still not possible. Aside from not + # being certain /sbin/ldconfig is available, command + # 'ldconfig -N -X -v | grep ^/' on 64bit Fedora does not report /usr/lib64, + # even though it is searched at run-time. Try to do the best guess by + # appending ld.so.conf contents (and includes) to the search path. + if test -f /etc/ld.so.conf; then + lt_ld_extra=`awk '/^include / { system(sprintf("cd /etc; cat %s 2>/dev/null", \[$]2)); skip = 1; } { if (!skip) print \[$]0; skip = 0; }' < /etc/ld.so.conf | $SED -e 's/#.*//;/^[ ]*hwcap[ ]/d;s/[:, ]/ /g;s/=[^=]*$//;s/=[^= ]* / /g;s/"//g;/^$/d' | tr '\n' ' '` + sys_lib_dlsearch_path_spec="/lib /usr/lib $lt_ld_extra" + fi + + # We used to test for /lib/ld.so.1 and disable shared libraries on + # powerpc, because MkLinux only supported shared libraries with the + # GNU dynamic linker. Since this was broken with cross compilers, + # most powerpc-linux boxes support dynamic linking these days and + # people can always --disable-shared, the test was removed, and we + # assume the GNU/Linux dynamic linker is in use. 
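In library_names_spec the first name is the file that actually gets built and the remaining names are symlinks, while soname_spec is the name recorded as the soname. A minimal sketch of how the GNU/Linux specs above expand, using hypothetical values such as libtool derives for libfoo with -version-info 3:2:1:

    libname=libfoo release= shared_ext=.so major=.2 versuffix=.2.1.2
    library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext'
    soname_spec='$libname$release$shared_ext$major'
    eval echo "$library_names_spec"   # -> libfoo.so.2.1.2 libfoo.so.2 libfoo.so
    eval echo "$soname_spec"          # -> libfoo.so.2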
+ dynamic_linker='GNU/Linux ld.so' + ;; + +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + +netbsd*) + version_type=sunos + need_lib_prefix=no + need_version=no + if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + finish_cmds='PATH="\$PATH:/sbin" ldconfig -m $libdir' + dynamic_linker='NetBSD (a.out) ld.so' + else + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + dynamic_linker='NetBSD ld.elf_so' + fi + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + ;; + +newsos6) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + ;; + +*nto* | *qnx*) + version_type=qnx + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='ldqnx.so' + ;; + +openbsd* | bitrig*) + version_type=sunos + sys_lib_dlsearch_path_spec=/usr/lib + need_lib_prefix=no + if test -z "`echo __ELF__ | $CC -E - | $GREP __ELF__`"; then + need_version=no + else + need_version=yes + fi + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + finish_cmds='PATH="\$PATH:/sbin" ldconfig -m $libdir' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + ;; + +os2*) + libname_spec='$name' + version_type=windows + shrext_cmds=.dll + need_version=no + need_lib_prefix=no + # OS/2 can only load a DLL with a base name of 8 characters or less. + soname_spec='`test -n "$os2dllname" && libname="$os2dllname"; + v=$($ECHO $release$versuffix | tr -d .-); + n=$($ECHO $libname | cut -b -$((8 - ${#v})) | tr . _); + $ECHO $n$v`$shared_ext' + library_names_spec='${libname}_dll.$libext' + dynamic_linker='OS/2 ld.exe' + shlibpath_var=BEGINLIBPATH + sys_lib_search_path_spec="/lib /usr/lib /usr/local/lib" + sys_lib_dlsearch_path_spec=$sys_lib_search_path_spec + postinstall_cmds='base_file=`basename \$file`~ + dlpath=`$SHELL 2>&1 -c '\''. $dir/'\''\$base_file'\''i; $ECHO \$dlname'\''`~ + dldir=$destdir/`dirname \$dlpath`~ + test -d \$dldir || mkdir -p \$dldir~ + $install_prog $dir/$dlname \$dldir/$dlname~ + chmod a+x \$dldir/$dlname~ + if test -n '\''$stripme'\'' && test -n '\''$striplib'\''; then + eval '\''$striplib \$dldir/$dlname'\'' || exit \$?; + fi' + postuninstall_cmds='dldll=`$SHELL 2>&1 -c '\''. 
$file; $ECHO \$dlname'\''`~ + dlpath=$dir/\$dldll~ + $RM \$dlpath' + ;; + +osf3* | osf4* | osf5*) + version_type=osf + need_lib_prefix=no + need_version=no + soname_spec='$libname$release$shared_ext$major' + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + sys_lib_search_path_spec="/usr/shlib /usr/ccs/lib /usr/lib/cmplrs/cc /usr/lib /usr/local/lib /var/shlib" + sys_lib_dlsearch_path_spec=$sys_lib_search_path_spec + ;; + +rdos*) + dynamic_linker=no + ;; + +solaris*) + version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + # ldd complains unless libraries are executable + postinstall_cmds='chmod +x $lib' + ;; + +sunos4*) + version_type=sunos + library_names_spec='$libname$release$shared_ext$versuffix $libname$shared_ext$versuffix' + finish_cmds='PATH="\$PATH:/usr/etc" ldconfig $libdir' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + if test yes = "$with_gnu_ld"; then + need_lib_prefix=no + fi + need_version=yes + ;; + +sysv4 | sysv4.3*) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + case $host_vendor in + sni) + shlibpath_overrides_runpath=no + need_lib_prefix=no + runpath_var=LD_RUN_PATH + ;; + siemens) + need_lib_prefix=no + ;; + motorola) + need_lib_prefix=no + need_version=no + shlibpath_overrides_runpath=no + sys_lib_search_path_spec='/lib /usr/lib /usr/ccs/lib' + ;; + esac + ;; + +sysv4*MP*) + if test -d /usr/nec; then + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$shared_ext.$versuffix $libname$shared_ext.$major $libname$shared_ext' + soname_spec='$libname$shared_ext.$major' + shlibpath_var=LD_LIBRARY_PATH + fi + ;; + +sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX* | sysv4*uw2*) + version_type=sco + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=yes + hardcode_into_libs=yes + if test yes = "$with_gnu_ld"; then + sys_lib_search_path_spec='/usr/local/lib /usr/gnu/lib /usr/ccs/lib /usr/lib /lib' + else + sys_lib_search_path_spec='/usr/ccs/lib /usr/lib' + case $host_os in + sco3.2v5*) + sys_lib_search_path_spec="$sys_lib_search_path_spec /lib" + ;; + esac + fi + sys_lib_dlsearch_path_spec='/usr/lib' + ;; + +tpf*) + # TPF is a cross-target only. Preferred cross-host = GNU/Linux. 
+ version_type=linux # correct to gnu/linux during the next big refactor + need_lib_prefix=no + need_version=no + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + ;; + +uts4*) + version_type=linux # correct to gnu/linux during the next big refactor + library_names_spec='$libname$release$shared_ext$versuffix $libname$release$shared_ext$major $libname$shared_ext' + soname_spec='$libname$release$shared_ext$major' + shlibpath_var=LD_LIBRARY_PATH + ;; + +*) + dynamic_linker=no + ;; +esac +AC_MSG_RESULT([$dynamic_linker]) +test no = "$dynamic_linker" && can_build_shared=no + +variables_saved_for_relink="PATH $shlibpath_var $runpath_var" +if test yes = "$GCC"; then + variables_saved_for_relink="$variables_saved_for_relink GCC_EXEC_PREFIX COMPILER_PATH LIBRARY_PATH" +fi + +if test set = "${lt_cv_sys_lib_search_path_spec+set}"; then + sys_lib_search_path_spec=$lt_cv_sys_lib_search_path_spec +fi + +if test set = "${lt_cv_sys_lib_dlsearch_path_spec+set}"; then + sys_lib_dlsearch_path_spec=$lt_cv_sys_lib_dlsearch_path_spec +fi + +# remember unaugmented sys_lib_dlsearch_path content for libtool script decls... +configure_time_dlsearch_path=$sys_lib_dlsearch_path_spec + +# ... but it needs LT_SYS_LIBRARY_PATH munging for other configure-time code +func_munge_path_list sys_lib_dlsearch_path_spec "$LT_SYS_LIBRARY_PATH" + +# to be used as default LT_SYS_LIBRARY_PATH value in generated libtool +configure_time_lt_sys_library_path=$LT_SYS_LIBRARY_PATH + +_LT_DECL([], [variables_saved_for_relink], [1], + [Variables whose values should be saved in libtool wrapper scripts and + restored at link time]) +_LT_DECL([], [need_lib_prefix], [0], + [Do we need the "lib" prefix for modules?]) +_LT_DECL([], [need_version], [0], [Do we need a version for libraries?]) +_LT_DECL([], [version_type], [0], [Library versioning type]) +_LT_DECL([], [runpath_var], [0], [Shared library runtime path variable]) +_LT_DECL([], [shlibpath_var], [0],[Shared library path variable]) +_LT_DECL([], [shlibpath_overrides_runpath], [0], + [Is shlibpath searched before the hard-coded library search path?]) +_LT_DECL([], [libname_spec], [1], [Format of library name prefix]) +_LT_DECL([], [library_names_spec], [1], + [[List of archive names. First name is the real one, the rest are links. 
+ The last name is the one that the linker finds with -lNAME]]) +_LT_DECL([], [soname_spec], [1], + [[The coded name of the library, if different from the real name]]) +_LT_DECL([], [install_override_mode], [1], + [Permission mode override for installation of shared libraries]) +_LT_DECL([], [postinstall_cmds], [2], + [Command to use after installation of a shared archive]) +_LT_DECL([], [postuninstall_cmds], [2], + [Command to use after uninstallation of a shared archive]) +_LT_DECL([], [finish_cmds], [2], + [Commands used to finish a libtool library installation in a directory]) +_LT_DECL([], [finish_eval], [1], + [[As "finish_cmds", except a single script fragment to be evaled but + not shown]]) +_LT_DECL([], [hardcode_into_libs], [0], + [Whether we should hardcode library paths into libraries]) +_LT_DECL([], [sys_lib_search_path_spec], [2], + [Compile-time system search path for libraries]) +_LT_DECL([sys_lib_dlsearch_path_spec], [configure_time_dlsearch_path], [2], + [Detected run-time system search path for libraries]) +_LT_DECL([], [configure_time_lt_sys_library_path], [2], + [Explicit LT_SYS_LIBRARY_PATH set during ./configure time]) +])# _LT_SYS_DYNAMIC_LINKER + + +# _LT_PATH_TOOL_PREFIX(TOOL) +# -------------------------- +# find a file program that can recognize shared library +AC_DEFUN([_LT_PATH_TOOL_PREFIX], +[m4_require([_LT_DECL_EGREP])dnl +AC_MSG_CHECKING([for $1]) +AC_CACHE_VAL(lt_cv_path_MAGIC_CMD, +[case $MAGIC_CMD in +[[\\/*] | ?:[\\/]*]) + lt_cv_path_MAGIC_CMD=$MAGIC_CMD # Let the user override the test with a path. + ;; +*) + lt_save_MAGIC_CMD=$MAGIC_CMD + lt_save_ifs=$IFS; IFS=$PATH_SEPARATOR +dnl $ac_dummy forces splitting on constant user-supplied paths. +dnl POSIX.2 word splitting is done only on the output of word expansions, +dnl not every word. This closes a longstanding sh security hole. + ac_dummy="m4_if([$2], , $PATH, [$2])" + for ac_dir in $ac_dummy; do + IFS=$lt_save_ifs + test -z "$ac_dir" && ac_dir=. + if test -f "$ac_dir/$1"; then + lt_cv_path_MAGIC_CMD=$ac_dir/"$1" + if test -n "$file_magic_test_file"; then + case $deplibs_check_method in + "file_magic "*) + file_magic_regex=`expr "$deplibs_check_method" : "file_magic \(.*\)"` + MAGIC_CMD=$lt_cv_path_MAGIC_CMD + if eval $file_magic_cmd \$file_magic_test_file 2> /dev/null | + $EGREP "$file_magic_regex" > /dev/null; then + : + else + cat <<_LT_EOF 1>&2 + +*** Warning: the command libtool uses to detect shared libraries, +*** $file_magic_cmd, produces output that libtool cannot recognize. +*** The result is that libtool may fail to recognize shared libraries +*** as such. This will affect the creation of libtool libraries that +*** depend on shared libraries, but programs linked with such libtool +*** libraries will work regardless of this problem. 
Nevertheless, you +*** may want to report the problem to your system manager and/or to +*** bug-libtool@gnu.org + +_LT_EOF + fi ;; + esac + fi + break + fi + done + IFS=$lt_save_ifs + MAGIC_CMD=$lt_save_MAGIC_CMD + ;; +esac]) +MAGIC_CMD=$lt_cv_path_MAGIC_CMD +if test -n "$MAGIC_CMD"; then + AC_MSG_RESULT($MAGIC_CMD) +else + AC_MSG_RESULT(no) +fi +_LT_DECL([], [MAGIC_CMD], [0], + [Used to examine libraries when file_magic_cmd begins with "file"])dnl +])# _LT_PATH_TOOL_PREFIX + +# Old name: +AU_ALIAS([AC_PATH_TOOL_PREFIX], [_LT_PATH_TOOL_PREFIX]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_PATH_TOOL_PREFIX], []) + + +# _LT_PATH_MAGIC +# -------------- +# find a file program that can recognize a shared library +m4_defun([_LT_PATH_MAGIC], +[_LT_PATH_TOOL_PREFIX(${ac_tool_prefix}file, /usr/bin$PATH_SEPARATOR$PATH) +if test -z "$lt_cv_path_MAGIC_CMD"; then + if test -n "$ac_tool_prefix"; then + _LT_PATH_TOOL_PREFIX(file, /usr/bin$PATH_SEPARATOR$PATH) + else + MAGIC_CMD=: + fi +fi +])# _LT_PATH_MAGIC + + +# LT_PATH_LD +# ---------- +# find the pathname to the GNU or non-GNU linker +AC_DEFUN([LT_PATH_LD], +[AC_REQUIRE([AC_PROG_CC])dnl +AC_REQUIRE([AC_CANONICAL_HOST])dnl +AC_REQUIRE([AC_CANONICAL_BUILD])dnl +m4_require([_LT_DECL_SED])dnl +m4_require([_LT_DECL_EGREP])dnl +m4_require([_LT_PROG_ECHO_BACKSLASH])dnl + +AC_ARG_WITH([gnu-ld], + [AS_HELP_STRING([--with-gnu-ld], + [assume the C compiler uses GNU ld @<:@default=no@:>@])], + [test no = "$withval" || with_gnu_ld=yes], + [with_gnu_ld=no])dnl + +ac_prog=ld +if test yes = "$GCC"; then + # Check if gcc -print-prog-name=ld gives a path. + AC_MSG_CHECKING([for ld used by $CC]) + case $host in + *-*-mingw*) + # gcc leaves a trailing carriage return, which upsets mingw + ac_prog=`($CC -print-prog-name=ld) 2>&5 | tr -d '\015'` ;; + *) + ac_prog=`($CC -print-prog-name=ld) 2>&5` ;; + esac + case $ac_prog in + # Accept absolute paths. + [[\\/]]* | ?:[[\\/]]*) + re_direlt='/[[^/]][[^/]]*/\.\./' + # Canonicalize the pathname of ld + ac_prog=`$ECHO "$ac_prog"| $SED 's%\\\\%/%g'` + while $ECHO "$ac_prog" | $GREP "$re_direlt" > /dev/null 2>&1; do + ac_prog=`$ECHO $ac_prog| $SED "s%$re_direlt%/%"` + done + test -z "$LD" && LD=$ac_prog + ;; + "") + # If it fails, then pretend we aren't using GCC. + ac_prog=ld + ;; + *) + # If it is relative, then search for the first ld in PATH. + with_gnu_ld=unknown + ;; + esac +elif test yes = "$with_gnu_ld"; then + AC_MSG_CHECKING([for GNU ld]) +else + AC_MSG_CHECKING([for non-GNU ld]) +fi +AC_CACHE_VAL(lt_cv_path_LD, +[if test -z "$LD"; then + lt_save_ifs=$IFS; IFS=$PATH_SEPARATOR + for ac_dir in $PATH; do + IFS=$lt_save_ifs + test -z "$ac_dir" && ac_dir=. + if test -f "$ac_dir/$ac_prog" || test -f "$ac_dir/$ac_prog$ac_exeext"; then + lt_cv_path_LD=$ac_dir/$ac_prog + # Check to see if the program is GNU ld. I'd rather use --version, + # but apparently some variants of GNU ld only accept -v. + # Break only if it was the GNU/non-GNU ld that we prefer. 
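The re_direlt loop above collapses any '/dir/../' component in the linker path reported by the compiler. Standalone, with a hypothetical path standing in for the $CC -print-prog-name=ld output:

    ac_prog=/usr/x86_64-linux-gnu/bin/../../bin/ld
    re_direlt='/[^/][^/]*/\.\./'
    while echo "$ac_prog" | grep "$re_direlt" >/dev/null 2>&1; do
      ac_prog=`echo "$ac_prog" | sed "s%$re_direlt%/%"`
    done
    echo "$ac_prog"   # -> /usr/bin/ld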
+ case `"$lt_cv_path_LD" -v 2>&1 &1 conftest.i +cat conftest.i conftest.i >conftest2.i +: ${lt_DD:=$DD} +AC_PATH_PROGS_FEATURE_CHECK([lt_DD], [dd], +[if "$ac_path_lt_DD" bs=32 count=1 conftest.out 2>/dev/null; then + cmp -s conftest.i conftest.out \ + && ac_cv_path_lt_DD="$ac_path_lt_DD" ac_path_lt_DD_found=: +fi]) +rm -f conftest.i conftest2.i conftest.out]) +])# _LT_PATH_DD + + +# _LT_CMD_TRUNCATE +# ---------------- +# find command to truncate a binary pipe +m4_defun([_LT_CMD_TRUNCATE], +[m4_require([_LT_PATH_DD]) +AC_CACHE_CHECK([how to truncate binary pipes], [lt_cv_truncate_bin], +[printf 0123456789abcdef0123456789abcdef >conftest.i +cat conftest.i conftest.i >conftest2.i +lt_cv_truncate_bin= +if "$ac_cv_path_lt_DD" bs=32 count=1 conftest.out 2>/dev/null; then + cmp -s conftest.i conftest.out \ + && lt_cv_truncate_bin="$ac_cv_path_lt_DD bs=4096 count=1" +fi +rm -f conftest.i conftest2.i conftest.out +test -z "$lt_cv_truncate_bin" && lt_cv_truncate_bin="$SED -e 4q"]) +_LT_DECL([lt_truncate_bin], [lt_cv_truncate_bin], [1], + [Command to truncate a binary pipe]) +])# _LT_CMD_TRUNCATE + + +# _LT_CHECK_MAGIC_METHOD +# ---------------------- +# how to check for library dependencies +# -- PORTME fill in with the dynamic library characteristics +m4_defun([_LT_CHECK_MAGIC_METHOD], +[m4_require([_LT_DECL_EGREP]) +m4_require([_LT_DECL_OBJDUMP]) +AC_CACHE_CHECK([how to recognize dependent libraries], +lt_cv_deplibs_check_method, +[lt_cv_file_magic_cmd='$MAGIC_CMD' +lt_cv_file_magic_test_file= +lt_cv_deplibs_check_method='unknown' +# Need to set the preceding variable on all platforms that support +# interlibrary dependencies. +# 'none' -- dependencies not supported. +# 'unknown' -- same as none, but documents that we really don't know. +# 'pass_all' -- all dependencies passed with no checks. +# 'test_compile' -- check by making test program. +# 'file_magic [[regex]]' -- check by looking for files in library path +# that responds to the $file_magic_cmd with a given extended regex. +# If you have 'file' or equivalent on your system and you're not sure +# whether 'pass_all' will *always* work, you probably want this one. + +case $host_os in +aix[[4-9]]*) + lt_cv_deplibs_check_method=pass_all + ;; + +beos*) + lt_cv_deplibs_check_method=pass_all + ;; + +bsdi[[45]]*) + lt_cv_deplibs_check_method='file_magic ELF [[0-9]][[0-9]]*-bit [[ML]]SB (shared object|dynamic lib)' + lt_cv_file_magic_cmd='/usr/bin/file -L' + lt_cv_file_magic_test_file=/shlib/libc.so + ;; + +cygwin*) + # func_win32_libid is a shell function defined in ltmain.sh + lt_cv_deplibs_check_method='file_magic ^x86 archive import|^x86 DLL' + lt_cv_file_magic_cmd='func_win32_libid' + ;; + +mingw* | pw32*) + # Base MSYS/MinGW do not provide the 'file' command needed by + # func_win32_libid shell function, so use a weaker test based on 'objdump', + # unless we find 'file', for example because we are cross-compiling. + if ( file / ) >/dev/null 2>&1; then + lt_cv_deplibs_check_method='file_magic ^x86 archive import|^x86 DLL' + lt_cv_file_magic_cmd='func_win32_libid' + else + # Keep this pattern in sync with the one in func_win32_libid. + lt_cv_deplibs_check_method='file_magic file format (pei*-i386(.*architecture: i386)?|pe-arm-wince|pe-x86-64)' + lt_cv_file_magic_cmd='$OBJDUMP -f' + fi + ;; + +cegcc*) + # use the weaker test based on 'objdump'. See mingw*. + lt_cv_deplibs_check_method='file_magic file format pe-arm-.*little(.*architecture: arm)?' 
+ lt_cv_file_magic_cmd='$OBJDUMP -f' + ;; + +darwin* | rhapsody*) + lt_cv_deplibs_check_method=pass_all + ;; + +freebsd* | dragonfly*) + if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then + case $host_cpu in + i*86 ) + # Not sure whether the presence of OpenBSD here was a mistake. + # Let's accept both of them until this is cleared up. + lt_cv_deplibs_check_method='file_magic (FreeBSD|OpenBSD|DragonFly)/i[[3-9]]86 (compact )?demand paged shared library' + lt_cv_file_magic_cmd=/usr/bin/file + lt_cv_file_magic_test_file=`echo /usr/lib/libc.so.*` + ;; + esac + else + lt_cv_deplibs_check_method=pass_all + fi + ;; + +haiku*) + lt_cv_deplibs_check_method=pass_all + ;; + +hpux10.20* | hpux11*) + lt_cv_file_magic_cmd=/usr/bin/file + case $host_cpu in + ia64*) + lt_cv_deplibs_check_method='file_magic (s[[0-9]][[0-9]][[0-9]]|ELF-[[0-9]][[0-9]]) shared object file - IA64' + lt_cv_file_magic_test_file=/usr/lib/hpux32/libc.so + ;; + hppa*64*) + [lt_cv_deplibs_check_method='file_magic (s[0-9][0-9][0-9]|ELF[ -][0-9][0-9])(-bit)?( [LM]SB)? shared object( file)?[, -]* PA-RISC [0-9]\.[0-9]'] + lt_cv_file_magic_test_file=/usr/lib/pa20_64/libc.sl + ;; + *) + lt_cv_deplibs_check_method='file_magic (s[[0-9]][[0-9]][[0-9]]|PA-RISC[[0-9]]\.[[0-9]]) shared library' + lt_cv_file_magic_test_file=/usr/lib/libc.sl + ;; + esac + ;; + +interix[[3-9]]*) + # PIC code is broken on Interix 3.x, that's why |\.a not |_pic\.a here + lt_cv_deplibs_check_method='match_pattern /lib[[^/]]+(\.so|\.a)$' + ;; + +irix5* | irix6* | nonstopux*) + case $LD in + *-32|*"-32 ") libmagic=32-bit;; + *-n32|*"-n32 ") libmagic=N32;; + *-64|*"-64 ") libmagic=64-bit;; + *) libmagic=never-match;; + esac + lt_cv_deplibs_check_method=pass_all + ;; + +# This must be glibc/ELF. +linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + lt_cv_deplibs_check_method=pass_all + ;; + +netbsd* | netbsdelf*-gnu) + if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then + lt_cv_deplibs_check_method='match_pattern /lib[[^/]]+(\.so\.[[0-9]]+\.[[0-9]]+|_pic\.a)$' + else + lt_cv_deplibs_check_method='match_pattern /lib[[^/]]+(\.so|_pic\.a)$' + fi + ;; + +newos6*) + lt_cv_deplibs_check_method='file_magic ELF [[0-9]][[0-9]]*-bit [[ML]]SB (executable|dynamic lib)' + lt_cv_file_magic_cmd=/usr/bin/file + lt_cv_file_magic_test_file=/usr/lib/libnls.so + ;; + +*nto* | *qnx*) + lt_cv_deplibs_check_method=pass_all + ;; + +openbsd* | bitrig*) + if test -z "`echo __ELF__ | $CC -E - | $GREP __ELF__`"; then + lt_cv_deplibs_check_method='match_pattern /lib[[^/]]+(\.so\.[[0-9]]+\.[[0-9]]+|\.so|_pic\.a)$' + else + lt_cv_deplibs_check_method='match_pattern /lib[[^/]]+(\.so\.[[0-9]]+\.[[0-9]]+|_pic\.a)$' + fi + ;; + +osf3* | osf4* | osf5*) + lt_cv_deplibs_check_method=pass_all + ;; + +rdos*) + lt_cv_deplibs_check_method=pass_all + ;; + +solaris*) + lt_cv_deplibs_check_method=pass_all + ;; + +sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX* | sysv4*uw2*) + lt_cv_deplibs_check_method=pass_all + ;; + +sysv4 | sysv4.3*) + case $host_vendor in + motorola) + lt_cv_deplibs_check_method='file_magic ELF [[0-9]][[0-9]]*-bit [[ML]]SB (shared object|dynamic lib) M[[0-9]][[0-9]]* Version [[0-9]]' + lt_cv_file_magic_test_file=`echo /usr/lib/libc.so*` + ;; + ncr) + lt_cv_deplibs_check_method=pass_all + ;; + sequent) + lt_cv_file_magic_cmd='/bin/file' + lt_cv_deplibs_check_method='file_magic ELF [[0-9]][[0-9]]*-bit [[LM]]SB (shared object|dynamic lib )' + ;; + sni) + lt_cv_file_magic_cmd='/bin/file' + lt_cv_deplibs_check_method="file_magic ELF [[0-9]][[0-9]]*-bit [[LM]]SB dynamic lib" + 
lt_cv_file_magic_test_file=/lib/libc.so + ;; + siemens) + lt_cv_deplibs_check_method=pass_all + ;; + pc) + lt_cv_deplibs_check_method=pass_all + ;; + esac + ;; + +tpf*) + lt_cv_deplibs_check_method=pass_all + ;; +os2*) + lt_cv_deplibs_check_method=pass_all + ;; +esac +]) + +file_magic_glob= +want_nocaseglob=no +if test "$build" = "$host"; then + case $host_os in + mingw* | pw32*) + if ( shopt | grep nocaseglob ) >/dev/null 2>&1; then + want_nocaseglob=yes + else + file_magic_glob=`echo aAbBcCdDeEfFgGhHiIjJkKlLmMnNoOpPqQrRsStTuUvVwWxXyYzZ | $SED -e "s/\(..\)/s\/[[\1]]\/[[\1]]\/g;/g"` + fi + ;; + esac +fi + +file_magic_cmd=$lt_cv_file_magic_cmd +deplibs_check_method=$lt_cv_deplibs_check_method +test -z "$deplibs_check_method" && deplibs_check_method=unknown + +_LT_DECL([], [deplibs_check_method], [1], + [Method to check whether dependent libraries are shared objects]) +_LT_DECL([], [file_magic_cmd], [1], + [Command to use when deplibs_check_method = "file_magic"]) +_LT_DECL([], [file_magic_glob], [1], + [How to find potential files when deplibs_check_method = "file_magic"]) +_LT_DECL([], [want_nocaseglob], [1], + [Find potential files using nocaseglob when deplibs_check_method = "file_magic"]) +])# _LT_CHECK_MAGIC_METHOD + + +# LT_PATH_NM +# ---------- +# find the pathname to a BSD- or MS-compatible name lister +AC_DEFUN([LT_PATH_NM], +[AC_REQUIRE([AC_PROG_CC])dnl +AC_CACHE_CHECK([for BSD- or MS-compatible name lister (nm)], lt_cv_path_NM, +[if test -n "$NM"; then + # Let the user override the test. + lt_cv_path_NM=$NM +else + lt_nm_to_check=${ac_tool_prefix}nm + if test -n "$ac_tool_prefix" && test "$build" = "$host"; then + lt_nm_to_check="$lt_nm_to_check nm" + fi + for lt_tmp_nm in $lt_nm_to_check; do + lt_save_ifs=$IFS; IFS=$PATH_SEPARATOR + for ac_dir in $PATH /usr/ccs/bin/elf /usr/ccs/bin /usr/ucb /bin; do + IFS=$lt_save_ifs + test -z "$ac_dir" && ac_dir=. + tmp_nm=$ac_dir/$lt_tmp_nm + if test -f "$tmp_nm" || test -f "$tmp_nm$ac_exeext"; then + # Check to see if the nm accepts a BSD-compat flag. + # Adding the 'sed 1q' prevents false positives on HP-UX, which says: + # nm: unknown option "B" ignored + # Tru64's nm complains that /dev/null is an invalid object file + # MSYS converts /dev/null to NUL, MinGW nm treats NUL as empty + case $build_os in + mingw*) lt_bad_file=conftest.nm/nofile ;; + *) lt_bad_file=/dev/null ;; + esac + case `"$tmp_nm" -B $lt_bad_file 2>&1 | sed '1q'` in + *$lt_bad_file* | *'Invalid file or object type'*) + lt_cv_path_NM="$tmp_nm -B" + break 2 + ;; + *) + case `"$tmp_nm" -p /dev/null 2>&1 | sed '1q'` in + */dev/null*) + lt_cv_path_NM="$tmp_nm -p" + break 2 + ;; + *) + lt_cv_path_NM=${lt_cv_path_NM="$tmp_nm"} # keep the first match, but + continue # so that we can try to find one that supports BSD flags + ;; + esac + ;; + esac + fi + done + IFS=$lt_save_ifs + done + : ${lt_cv_path_NM=no} +fi]) +if test no != "$lt_cv_path_NM"; then + NM=$lt_cv_path_NM +else + # Didn't find any BSD compatible name lister, look for dumpbin. + if test -n "$DUMPBIN"; then : + # Let the user override the test. 
+ else + AC_CHECK_TOOLS(DUMPBIN, [dumpbin "link -dump"], :) + case `$DUMPBIN -symbols -headers /dev/null 2>&1 | sed '1q'` in + *COFF*) + DUMPBIN="$DUMPBIN -symbols -headers" + ;; + *) + DUMPBIN=: + ;; + esac + fi + AC_SUBST([DUMPBIN]) + if test : != "$DUMPBIN"; then + NM=$DUMPBIN + fi +fi +test -z "$NM" && NM=nm +AC_SUBST([NM]) +_LT_DECL([], [NM], [1], [A BSD- or MS-compatible name lister])dnl + +AC_CACHE_CHECK([the name lister ($NM) interface], [lt_cv_nm_interface], + [lt_cv_nm_interface="BSD nm" + echo "int some_variable = 0;" > conftest.$ac_ext + (eval echo "\"\$as_me:$LINENO: $ac_compile\"" >&AS_MESSAGE_LOG_FD) + (eval "$ac_compile" 2>conftest.err) + cat conftest.err >&AS_MESSAGE_LOG_FD + (eval echo "\"\$as_me:$LINENO: $NM \\\"conftest.$ac_objext\\\"\"" >&AS_MESSAGE_LOG_FD) + (eval "$NM \"conftest.$ac_objext\"" 2>conftest.err > conftest.out) + cat conftest.err >&AS_MESSAGE_LOG_FD + (eval echo "\"\$as_me:$LINENO: output\"" >&AS_MESSAGE_LOG_FD) + cat conftest.out >&AS_MESSAGE_LOG_FD + if $GREP 'External.*some_variable' conftest.out > /dev/null; then + lt_cv_nm_interface="MS dumpbin" + fi + rm -f conftest*]) +])# LT_PATH_NM + +# Old names: +AU_ALIAS([AM_PROG_NM], [LT_PATH_NM]) +AU_ALIAS([AC_PROG_NM], [LT_PATH_NM]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AM_PROG_NM], []) +dnl AC_DEFUN([AC_PROG_NM], []) + +# _LT_CHECK_SHAREDLIB_FROM_LINKLIB +# -------------------------------- +# how to determine the name of the shared library +# associated with a specific link library. +# -- PORTME fill in with the dynamic library characteristics +m4_defun([_LT_CHECK_SHAREDLIB_FROM_LINKLIB], +[m4_require([_LT_DECL_EGREP]) +m4_require([_LT_DECL_OBJDUMP]) +m4_require([_LT_DECL_DLLTOOL]) +AC_CACHE_CHECK([how to associate runtime and link libraries], +lt_cv_sharedlib_from_linklib_cmd, +[lt_cv_sharedlib_from_linklib_cmd='unknown' + +case $host_os in +cygwin* | mingw* | pw32* | cegcc*) + # two different shell functions defined in ltmain.sh; + # decide which one to use based on capabilities of $DLLTOOL + case `$DLLTOOL --help 2>&1` in + *--identify-strict*) + lt_cv_sharedlib_from_linklib_cmd=func_cygming_dll_for_implib + ;; + *) + lt_cv_sharedlib_from_linklib_cmd=func_cygming_dll_for_implib_fallback + ;; + esac + ;; +*) + # fallback: assume linklib IS sharedlib + lt_cv_sharedlib_from_linklib_cmd=$ECHO + ;; +esac +]) +sharedlib_from_linklib_cmd=$lt_cv_sharedlib_from_linklib_cmd +test -z "$sharedlib_from_linklib_cmd" && sharedlib_from_linklib_cmd=$ECHO + +_LT_DECL([], [sharedlib_from_linklib_cmd], [1], + [Command to associate shared and link libraries]) +])# _LT_CHECK_SHAREDLIB_FROM_LINKLIB + + +# _LT_PATH_MANIFEST_TOOL +# ---------------------- +# locate the manifest tool +m4_defun([_LT_PATH_MANIFEST_TOOL], +[AC_CHECK_TOOL(MANIFEST_TOOL, mt, :) +test -z "$MANIFEST_TOOL" && MANIFEST_TOOL=mt +AC_CACHE_CHECK([if $MANIFEST_TOOL is a manifest tool], [lt_cv_path_mainfest_tool], + [lt_cv_path_mainfest_tool=no + echo "$as_me:$LINENO: $MANIFEST_TOOL '-?'" >&AS_MESSAGE_LOG_FD + $MANIFEST_TOOL '-?' 2>conftest.err > conftest.out + cat conftest.err >&AS_MESSAGE_LOG_FD + if $GREP 'Manifest Tool' conftest.out > /dev/null; then + lt_cv_path_mainfest_tool=yes + fi + rm -f conftest*]) +if test yes != "$lt_cv_path_mainfest_tool"; then + MANIFEST_TOOL=: +fi +_LT_DECL([], [MANIFEST_TOOL], [1], [Manifest tool])dnl +])# _LT_PATH_MANIFEST_TOOL + + +# _LT_DLL_DEF_P([FILE]) +# --------------------- +# True iff FILE is a Windows DLL '.def' file. 
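The check defined just below treats a file as a module-definition file when its first non-blank, non-comment line starts with EXPORTS or LIBRARY. For a hypothetical foo.def, the equivalent sed invocation (m4 quoting removed, GNU sed assumed for the \| alternation) prints DEF:

    printf '%s\n' \
      '; comment and blank lines are skipped' \
      'LIBRARY "foo"' \
      'EXPORTS' \
      '  foo_init' > foo.def
    sed -n -e 's/^[ ]*//' -e '/^\(;.*\)*$/d' \
        -e 's/^\(EXPORTS\|LIBRARY\)\([ ].*\)*$/DEF/p' -e q foo.def
    # -> DEF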
+# Keep in sync with func_dll_def_p in the libtool script +AC_DEFUN([_LT_DLL_DEF_P], +[dnl + test DEF = "`$SED -n dnl + -e '\''s/^[[ ]]*//'\'' dnl Strip leading whitespace + -e '\''/^\(;.*\)*$/d'\'' dnl Delete empty lines and comments + -e '\''s/^\(EXPORTS\|LIBRARY\)\([[ ]].*\)*$/DEF/p'\'' dnl + -e q dnl Only consider the first "real" line + $1`" dnl +])# _LT_DLL_DEF_P + + +# LT_LIB_M +# -------- +# check for math library +AC_DEFUN([LT_LIB_M], +[AC_REQUIRE([AC_CANONICAL_HOST])dnl +LIBM= +case $host in +*-*-beos* | *-*-cegcc* | *-*-cygwin* | *-*-haiku* | *-*-pw32* | *-*-darwin*) + # These system don't have libm, or don't need it + ;; +*-ncr-sysv4.3*) + AC_CHECK_LIB(mw, _mwvalidcheckl, LIBM=-lmw) + AC_CHECK_LIB(m, cos, LIBM="$LIBM -lm") + ;; +*) + AC_CHECK_LIB(m, cos, LIBM=-lm) + ;; +esac +AC_SUBST([LIBM]) +])# LT_LIB_M + +# Old name: +AU_ALIAS([AC_CHECK_LIBM], [LT_LIB_M]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_CHECK_LIBM], []) + + +# _LT_COMPILER_NO_RTTI([TAGNAME]) +# ------------------------------- +m4_defun([_LT_COMPILER_NO_RTTI], +[m4_require([_LT_TAG_COMPILER])dnl + +_LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1)= + +if test yes = "$GCC"; then + case $cc_basename in + nvcc*) + _LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1)=' -Xcompiler -fno-builtin' ;; + *) + _LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1)=' -fno-builtin' ;; + esac + + _LT_COMPILER_OPTION([if $compiler supports -fno-rtti -fno-exceptions], + lt_cv_prog_compiler_rtti_exceptions, + [-fno-rtti -fno-exceptions], [], + [_LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1)="$_LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1) -fno-rtti -fno-exceptions"]) +fi +_LT_TAGDECL([no_builtin_flag], [lt_prog_compiler_no_builtin_flag], [1], + [Compiler flag to turn off builtin functions]) +])# _LT_COMPILER_NO_RTTI + + +# _LT_CMD_GLOBAL_SYMBOLS +# ---------------------- +m4_defun([_LT_CMD_GLOBAL_SYMBOLS], +[AC_REQUIRE([AC_CANONICAL_HOST])dnl +AC_REQUIRE([AC_PROG_CC])dnl +AC_REQUIRE([AC_PROG_AWK])dnl +AC_REQUIRE([LT_PATH_NM])dnl +AC_REQUIRE([LT_PATH_LD])dnl +m4_require([_LT_DECL_SED])dnl +m4_require([_LT_DECL_EGREP])dnl +m4_require([_LT_TAG_COMPILER])dnl + +# Check for command to grab the raw symbol name followed by C symbol from nm. +AC_MSG_CHECKING([command to parse $NM output from $compiler object]) +AC_CACHE_VAL([lt_cv_sys_global_symbol_pipe], +[ +# These are sane defaults that work on at least a few old systems. +# [They come from Ultrix. What could be older than Ultrix?!! ;)] + +# Character class describing NM global symbol codes. +symcode='[[BCDEGRST]]' + +# Regexp to match symbols that can be accessed directly from C. +sympat='\([[_A-Za-z]][[_A-Za-z0-9]]*\)' + +# Define system-specific variables. +case $host_os in +aix*) + symcode='[[BCDT]]' + ;; +cygwin* | mingw* | pw32* | cegcc*) + symcode='[[ABCDGISTW]]' + ;; +hpux*) + if test ia64 = "$host_cpu"; then + symcode='[[ABCDEGRST]]' + fi + ;; +irix* | nonstopux*) + symcode='[[BCDEGRST]]' + ;; +osf*) + symcode='[[BCDEGQRST]]' + ;; +solaris*) + symcode='[[BDRT]]' + ;; +sco3.2v5*) + symcode='[[DT]]' + ;; +sysv4.2uw2*) + symcode='[[DT]]' + ;; +sysv5* | sco5v6* | unixware* | OpenUNIX*) + symcode='[[ABDT]]' + ;; +sysv4) + symcode='[[DFNSTU]]' + ;; +esac + +# If we're using GNU nm, then use its standard symbol codes. +case `$NM -V 2>&1` in +*GNU* | *'with BFD'*) + symcode='[[ABCDGIRSTW]]' ;; +esac + +if test "$lt_cv_nm_interface" = "MS dumpbin"; then + # Gets list of data symbols to import. 
+ lt_cv_sys_global_symbol_to_import="sed -n -e 's/^I .* \(.*\)$/\1/p'" + # Adjust the below global symbol transforms to fixup imported variables. + lt_cdecl_hook=" -e 's/^I .* \(.*\)$/extern __declspec(dllimport) char \1;/p'" + lt_c_name_hook=" -e 's/^I .* \(.*\)$/ {\"\1\", (void *) 0},/p'" + lt_c_name_lib_hook="\ + -e 's/^I .* \(lib.*\)$/ {\"\1\", (void *) 0},/p'\ + -e 's/^I .* \(.*\)$/ {\"lib\1\", (void *) 0},/p'" +else + # Disable hooks by default. + lt_cv_sys_global_symbol_to_import= + lt_cdecl_hook= + lt_c_name_hook= + lt_c_name_lib_hook= +fi + +# Transform an extracted symbol line into a proper C declaration. +# Some systems (esp. on ia64) link data and code symbols differently, +# so use this general approach. +lt_cv_sys_global_symbol_to_cdecl="sed -n"\ +$lt_cdecl_hook\ +" -e 's/^T .* \(.*\)$/extern int \1();/p'"\ +" -e 's/^$symcode$symcode* .* \(.*\)$/extern char \1;/p'" + +# Transform an extracted symbol line into symbol name and symbol address +lt_cv_sys_global_symbol_to_c_name_address="sed -n"\ +$lt_c_name_hook\ +" -e 's/^: \(.*\) .*$/ {\"\1\", (void *) 0},/p'"\ +" -e 's/^$symcode$symcode* .* \(.*\)$/ {\"\1\", (void *) \&\1},/p'" + +# Transform an extracted symbol line into symbol name with lib prefix and +# symbol address. +lt_cv_sys_global_symbol_to_c_name_address_lib_prefix="sed -n"\ +$lt_c_name_lib_hook\ +" -e 's/^: \(.*\) .*$/ {\"\1\", (void *) 0},/p'"\ +" -e 's/^$symcode$symcode* .* \(lib.*\)$/ {\"\1\", (void *) \&\1},/p'"\ +" -e 's/^$symcode$symcode* .* \(.*\)$/ {\"lib\1\", (void *) \&\1},/p'" + +# Handle CRLF in mingw tool chain +opt_cr= +case $build_os in +mingw*) + opt_cr=`$ECHO 'x\{0,1\}' | tr x '\015'` # option cr in regexp + ;; +esac + +# Try without a prefix underscore, then with it. +for ac_symprfx in "" "_"; do + + # Transform symcode, sympat, and symprfx into a raw symbol and a C symbol. + symxfrm="\\1 $ac_symprfx\\2 \\2" + + # Write the raw and C identifiers. + if test "$lt_cv_nm_interface" = "MS dumpbin"; then + # Fake it for dumpbin and say T for any non-static function, + # D for any global variable and I for any imported variable. + # Also find C++ and __fastcall symbols from MSVC++, + # which start with @ or ?. + lt_cv_sys_global_symbol_pipe="$AWK ['"\ +" {last_section=section; section=\$ 3};"\ +" /^COFF SYMBOL TABLE/{for(i in hide) delete hide[i]};"\ +" /Section length .*#relocs.*(pick any)/{hide[last_section]=1};"\ +" /^ *Symbol name *: /{split(\$ 0,sn,\":\"); si=substr(sn[2],2)};"\ +" /^ *Type *: code/{print \"T\",si,substr(si,length(prfx))};"\ +" /^ *Type *: data/{print \"I\",si,substr(si,length(prfx))};"\ +" \$ 0!~/External *\|/{next};"\ +" / 0+ UNDEF /{next}; / UNDEF \([^|]\)*()/{next};"\ +" {if(hide[section]) next};"\ +" {f=\"D\"}; \$ 0~/\(\).*\|/{f=\"T\"};"\ +" {split(\$ 0,a,/\||\r/); split(a[2],s)};"\ +" s[1]~/^[@?]/{print f,s[1],s[1]; next};"\ +" s[1]~prfx {split(s[1],t,\"@\"); print f,t[1],substr(t[1],length(prfx))}"\ +" ' prfx=^$ac_symprfx]" + else + lt_cv_sys_global_symbol_pipe="sed -n -e 's/^.*[[ ]]\($symcode$symcode*\)[[ ]][[ ]]*$ac_symprfx$sympat$opt_cr$/$symxfrm/p'" + fi + lt_cv_sys_global_symbol_pipe="$lt_cv_sys_global_symbol_pipe | sed '/ __gnu_lto/d'" + + # Check to see that the pipe works correctly. 
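For a BSD-style nm with no underscore prefix, the pipe built above reduces each nm line to "code name name", and global_symbol_to_cdecl then rewrites those lines as C declarations. Illustrative input and output for the two symbols the conftest program below defines (addresses and the B code are hypothetical nm output):

    printf '%s\n' \
      '0000000000000000 T nm_test_func' \
      '0000000000000004 B nm_test_var' |
    sed -n -e 's/^.*[ ]\([ABCDGIRSTW][ABCDGIRSTW]*\)[ ][ ]*\([_A-Za-z][_A-Za-z0-9]*\)$/\1 \2 \2/p'
    # -> T nm_test_func nm_test_func
    #    B nm_test_var nm_test_var
    # ...which global_symbol_to_cdecl then turns into:
    #    extern int nm_test_func();
    #    extern char nm_test_var;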
+ pipe_works=no + + rm -f conftest* + cat > conftest.$ac_ext <<_LT_EOF +#ifdef __cplusplus +extern "C" { +#endif +char nm_test_var; +void nm_test_func(void); +void nm_test_func(void){} +#ifdef __cplusplus +} +#endif +int main(){nm_test_var='a';nm_test_func();return(0);} +_LT_EOF + + if AC_TRY_EVAL(ac_compile); then + # Now try to grab the symbols. + nlist=conftest.nm + if AC_TRY_EVAL(NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist) && test -s "$nlist"; then + # Try sorting and uniquifying the output. + if sort "$nlist" | uniq > "$nlist"T; then + mv -f "$nlist"T "$nlist" + else + rm -f "$nlist"T + fi + + # Make sure that we snagged all the symbols we need. + if $GREP ' nm_test_var$' "$nlist" >/dev/null; then + if $GREP ' nm_test_func$' "$nlist" >/dev/null; then + cat <<_LT_EOF > conftest.$ac_ext +/* Keep this code in sync between libtool.m4, ltmain, lt_system.h, and tests. */ +#if defined _WIN32 || defined __CYGWIN__ || defined _WIN32_WCE +/* DATA imports from DLLs on WIN32 can't be const, because runtime + relocations are performed -- see ld's documentation on pseudo-relocs. */ +# define LT@&t@_DLSYM_CONST +#elif defined __osf__ +/* This system does not cope well with relocations in const data. */ +# define LT@&t@_DLSYM_CONST +#else +# define LT@&t@_DLSYM_CONST const +#endif + +#ifdef __cplusplus +extern "C" { +#endif + +_LT_EOF + # Now generate the symbol file. + eval "$lt_cv_sys_global_symbol_to_cdecl"' < "$nlist" | $GREP -v main >> conftest.$ac_ext' + + cat <<_LT_EOF >> conftest.$ac_ext + +/* The mapping between symbol names and symbols. */ +LT@&t@_DLSYM_CONST struct { + const char *name; + void *address; +} +lt__PROGRAM__LTX_preloaded_symbols[[]] = +{ + { "@PROGRAM@", (void *) 0 }, +_LT_EOF + $SED "s/^$symcode$symcode* .* \(.*\)$/ {\"\1\", (void *) \&\1},/" < "$nlist" | $GREP -v main >> conftest.$ac_ext + cat <<\_LT_EOF >> conftest.$ac_ext + {0, (void *) 0} +}; + +/* This works around a problem in FreeBSD linker */ +#ifdef FREEBSD_WORKAROUND +static const void *lt_preloaded_setup() { + return lt__PROGRAM__LTX_preloaded_symbols; +} +#endif + +#ifdef __cplusplus +} +#endif +_LT_EOF + # Now try linking the two files. + mv conftest.$ac_objext conftstm.$ac_objext + lt_globsym_save_LIBS=$LIBS + lt_globsym_save_CFLAGS=$CFLAGS + LIBS=conftstm.$ac_objext + CFLAGS="$CFLAGS$_LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1)" + if AC_TRY_EVAL(ac_link) && test -s conftest$ac_exeext; then + pipe_works=yes + fi + LIBS=$lt_globsym_save_LIBS + CFLAGS=$lt_globsym_save_CFLAGS + else + echo "cannot find nm_test_func in $nlist" >&AS_MESSAGE_LOG_FD + fi + else + echo "cannot find nm_test_var in $nlist" >&AS_MESSAGE_LOG_FD + fi + else + echo "cannot run $lt_cv_sys_global_symbol_pipe" >&AS_MESSAGE_LOG_FD + fi + else + echo "$progname: failed program was:" >&AS_MESSAGE_LOG_FD + cat conftest.$ac_ext >&5 + fi + rm -rf conftest* conftst* + + # Do not use the global_symbol_pipe unless it works. + if test yes = "$pipe_works"; then + break + else + lt_cv_sys_global_symbol_pipe= + fi +done +]) +if test -z "$lt_cv_sys_global_symbol_pipe"; then + lt_cv_sys_global_symbol_to_cdecl= +fi +if test -z "$lt_cv_sys_global_symbol_pipe$lt_cv_sys_global_symbol_to_cdecl"; then + AC_MSG_RESULT(failed) +else + AC_MSG_RESULT(ok) +fi + +# Response file support. 
+if test "$lt_cv_nm_interface" = "MS dumpbin"; then + nm_file_list_spec='@' +elif $NM --help 2>/dev/null | grep '[[@]]FILE' >/dev/null; then + nm_file_list_spec='@' +fi + +_LT_DECL([global_symbol_pipe], [lt_cv_sys_global_symbol_pipe], [1], + [Take the output of nm and produce a listing of raw symbols and C names]) +_LT_DECL([global_symbol_to_cdecl], [lt_cv_sys_global_symbol_to_cdecl], [1], + [Transform the output of nm in a proper C declaration]) +_LT_DECL([global_symbol_to_import], [lt_cv_sys_global_symbol_to_import], [1], + [Transform the output of nm into a list of symbols to manually relocate]) +_LT_DECL([global_symbol_to_c_name_address], + [lt_cv_sys_global_symbol_to_c_name_address], [1], + [Transform the output of nm in a C name address pair]) +_LT_DECL([global_symbol_to_c_name_address_lib_prefix], + [lt_cv_sys_global_symbol_to_c_name_address_lib_prefix], [1], + [Transform the output of nm in a C name address pair when lib prefix is needed]) +_LT_DECL([nm_interface], [lt_cv_nm_interface], [1], + [The name lister interface]) +_LT_DECL([], [nm_file_list_spec], [1], + [Specify filename containing input files for $NM]) +]) # _LT_CMD_GLOBAL_SYMBOLS + + +# _LT_COMPILER_PIC([TAGNAME]) +# --------------------------- +m4_defun([_LT_COMPILER_PIC], +[m4_require([_LT_TAG_COMPILER])dnl +_LT_TAGVAR(lt_prog_compiler_wl, $1)= +_LT_TAGVAR(lt_prog_compiler_pic, $1)= +_LT_TAGVAR(lt_prog_compiler_static, $1)= + +m4_if([$1], [CXX], [ + # C++ specific cases for pic, static, wl, etc. + if test yes = "$GXX"; then + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + + case $host_os in + aix*) + # All AIX code is PIC. + if test ia64 = "$host_cpu"; then + # AIX 5 now supports IA64 processor + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + fi + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + + amigaos*) + case $host_cpu in + powerpc) + # see comment about AmigaOS4 .so support + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + m68k) + # FIXME: we need at least 68020 code to build shared libraries, but + # adding the '-m68020' flag to GCC prevents building anything better, + # like '-m68040'. + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-m68020 -resident32 -malways-restore-a4' + ;; + esac + ;; + + beos* | irix5* | irix6* | nonstopux* | osf3* | osf4* | osf5*) + # PIC is the default for these OSes. + ;; + mingw* | cygwin* | os2* | pw32* | cegcc*) + # This hack is so that the source file can tell whether it is being + # built for inclusion in a dll (and should export symbols for example). + # Although the cygwin gcc ignores -fPIC, still need this for old-style + # (--disable-auto-import) libraries + m4_if([$1], [GCJ], [], + [_LT_TAGVAR(lt_prog_compiler_pic, $1)='-DDLL_EXPORT']) + case $host_os in + os2*) + _LT_TAGVAR(lt_prog_compiler_static, $1)='$wl-static' + ;; + esac + ;; + darwin* | rhapsody*) + # PIC is the default on this platform + # Common symbols not allowed in MH_DYLIB files + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fno-common' + ;; + *djgpp*) + # DJGPP does not support shared libraries at all + _LT_TAGVAR(lt_prog_compiler_pic, $1)= + ;; + haiku*) + # PIC is the default for Haiku. + # The "-static" flag exists, but is broken. + _LT_TAGVAR(lt_prog_compiler_static, $1)= + ;; + interix[[3-9]]*) + # Interix 3.x gcc -fpic/-fPIC options generate broken code. + # Instead, we relocate shared libraries at runtime. 
+ ;; + sysv4*MP*) + if test -d /usr/nec; then + _LT_TAGVAR(lt_prog_compiler_pic, $1)=-Kconform_pic + fi + ;; + hpux*) + # PIC is the default for 64-bit PA HP-UX, but not for 32-bit + # PA HP-UX. On IA64 HP-UX, PIC is the default but the pic flag + # sets the default TLS model and affects inlining. + case $host_cpu in + hppa*64*) + ;; + *) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + esac + ;; + *qnx* | *nto*) + # QNX uses GNU C++, but need to define -shared option too, otherwise + # it will coredump. + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC -shared' + ;; + *) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + esac + else + case $host_os in + aix[[4-9]]*) + # All AIX code is PIC. + if test ia64 = "$host_cpu"; then + # AIX 5 now supports IA64 processor + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + else + _LT_TAGVAR(lt_prog_compiler_static, $1)='-bnso -bI:/lib/syscalls.exp' + fi + ;; + chorus*) + case $cc_basename in + cxch68*) + # Green Hills C++ Compiler + # _LT_TAGVAR(lt_prog_compiler_static, $1)="--no_auto_instantiation -u __main -u __premain -u _abort -r $COOL_DIR/lib/libOrb.a $MVME_DIR/lib/CC/libC.a $MVME_DIR/lib/classix/libcx.s.a" + ;; + esac + ;; + mingw* | cygwin* | os2* | pw32* | cegcc*) + # This hack is so that the source file can tell whether it is being + # built for inclusion in a dll (and should export symbols for example). + m4_if([$1], [GCJ], [], + [_LT_TAGVAR(lt_prog_compiler_pic, $1)='-DDLL_EXPORT']) + ;; + dgux*) + case $cc_basename in + ec++*) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + ;; + ghcx*) + # Green Hills C++ Compiler + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-pic' + ;; + *) + ;; + esac + ;; + freebsd* | dragonfly*) + # FreeBSD uses GNU C++ + ;; + hpux9* | hpux10* | hpux11*) + case $cc_basename in + CC*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_static, $1)='$wl-a ${wl}archive' + if test ia64 != "$host_cpu"; then + _LT_TAGVAR(lt_prog_compiler_pic, $1)='+Z' + fi + ;; + aCC*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_static, $1)='$wl-a ${wl}archive' + case $host_cpu in + hppa*64*|ia64*) + # +Z the default + ;; + *) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='+Z' + ;; + esac + ;; + *) + ;; + esac + ;; + interix*) + # This is c89, which is MS Visual C++ (no shared libs) + # Anyone wants to do a port? + ;; + irix5* | irix6* | nonstopux*) + case $cc_basename in + CC*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-non_shared' + # CC pic flag -KPIC is the default. + ;; + *) + ;; + esac + ;; + linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + case $cc_basename in + KCC*) + # KAI C++ Compiler + _LT_TAGVAR(lt_prog_compiler_wl, $1)='--backend -Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + ecpc* ) + # old Intel C++ for x86_64, which still supported -KPIC. + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; + icpc* ) + # Intel C++, used to be incompatible with GCC. + # ICC 10 doesn't accept -KPIC any more. + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; + pgCC* | pgcpp*) + # Portland Group C++ compiler + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fpic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + cxx*) + # Compaq C++ + # Make sure the PIC flag is empty. 
It appears that all Alpha + # Linux and Compaq Tru64 Unix objects are PIC. + _LT_TAGVAR(lt_prog_compiler_pic, $1)= + _LT_TAGVAR(lt_prog_compiler_static, $1)='-non_shared' + ;; + xlc* | xlC* | bgxl[[cC]]* | mpixl[[cC]]*) + # IBM XL 8.0, 9.0 on PPC and BlueGene + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-qpic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-qstaticlink' + ;; + *) + case `$CC -V 2>&1 | sed 5q` in + *Sun\ C*) + # Sun C++ 5.9 + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Qoption ld ' + ;; + esac + ;; + esac + ;; + lynxos*) + ;; + m88k*) + ;; + mvs*) + case $cc_basename in + cxx*) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-W c,exportall' + ;; + *) + ;; + esac + ;; + netbsd* | netbsdelf*-gnu) + ;; + *qnx* | *nto*) + # QNX uses GNU C++, but need to define -shared option too, otherwise + # it will coredump. + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC -shared' + ;; + osf3* | osf4* | osf5*) + case $cc_basename in + KCC*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='--backend -Wl,' + ;; + RCC*) + # Rational C++ 2.4.1 + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-pic' + ;; + cxx*) + # Digital/Compaq C++ + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + # Make sure the PIC flag is empty. It appears that all Alpha + # Linux and Compaq Tru64 Unix objects are PIC. + _LT_TAGVAR(lt_prog_compiler_pic, $1)= + _LT_TAGVAR(lt_prog_compiler_static, $1)='-non_shared' + ;; + *) + ;; + esac + ;; + psos*) + ;; + solaris*) + case $cc_basename in + CC* | sunCC*) + # Sun C++ 4.2, 5.x and Centerline C++ + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Qoption ld ' + ;; + gcx*) + # Green Hills C++ Compiler + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-PIC' + ;; + *) + ;; + esac + ;; + sunos4*) + case $cc_basename in + CC*) + # Sun C++ 4.x + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-pic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + lcc*) + # Lucid + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-pic' + ;; + *) + ;; + esac + ;; + sysv5* | unixware* | sco3.2v5* | sco5v6* | OpenUNIX*) + case $cc_basename in + CC*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + esac + ;; + tandem*) + case $cc_basename in + NCC*) + # NonStop-UX NCC 3.20 + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + ;; + *) + ;; + esac + ;; + vxworks*) + ;; + *) + _LT_TAGVAR(lt_prog_compiler_can_build_shared, $1)=no + ;; + esac + fi +], +[ + if test yes = "$GCC"; then + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + + case $host_os in + aix*) + # All AIX code is PIC. + if test ia64 = "$host_cpu"; then + # AIX 5 now supports IA64 processor + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + fi + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + + amigaos*) + case $host_cpu in + powerpc) + # see comment about AmigaOS4 .so support + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + m68k) + # FIXME: we need at least 68020 code to build shared libraries, but + # adding the '-m68020' flag to GCC prevents building anything better, + # like '-m68040'. + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-m68020 -resident32 -malways-restore-a4' + ;; + esac + ;; + + beos* | irix5* | irix6* | nonstopux* | osf3* | osf4* | osf5*) + # PIC is the default for these OSes. 
+ ;; + + mingw* | cygwin* | pw32* | os2* | cegcc*) + # This hack is so that the source file can tell whether it is being + # built for inclusion in a dll (and should export symbols for example). + # Although the cygwin gcc ignores -fPIC, still need this for old-style + # (--disable-auto-import) libraries + m4_if([$1], [GCJ], [], + [_LT_TAGVAR(lt_prog_compiler_pic, $1)='-DDLL_EXPORT']) + case $host_os in + os2*) + _LT_TAGVAR(lt_prog_compiler_static, $1)='$wl-static' + ;; + esac + ;; + + darwin* | rhapsody*) + # PIC is the default on this platform + # Common symbols not allowed in MH_DYLIB files + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fno-common' + ;; + + haiku*) + # PIC is the default for Haiku. + # The "-static" flag exists, but is broken. + _LT_TAGVAR(lt_prog_compiler_static, $1)= + ;; + + hpux*) + # PIC is the default for 64-bit PA HP-UX, but not for 32-bit + # PA HP-UX. On IA64 HP-UX, PIC is the default but the pic flag + # sets the default TLS model and affects inlining. + case $host_cpu in + hppa*64*) + # +Z the default + ;; + *) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + esac + ;; + + interix[[3-9]]*) + # Interix 3.x gcc -fpic/-fPIC options generate broken code. + # Instead, we relocate shared libraries at runtime. + ;; + + msdosdjgpp*) + # Just because we use GCC doesn't mean we suddenly get shared libraries + # on systems that don't support them. + _LT_TAGVAR(lt_prog_compiler_can_build_shared, $1)=no + enable_shared=no + ;; + + *nto* | *qnx*) + # QNX uses GNU C++, but need to define -shared option too, otherwise + # it will coredump. + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC -shared' + ;; + + sysv4*MP*) + if test -d /usr/nec; then + _LT_TAGVAR(lt_prog_compiler_pic, $1)=-Kconform_pic + fi + ;; + + *) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + ;; + esac + + case $cc_basename in + nvcc*) # Cuda Compiler Driver 2.2 + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Xlinker ' + if test -n "$_LT_TAGVAR(lt_prog_compiler_pic, $1)"; then + _LT_TAGVAR(lt_prog_compiler_pic, $1)="-Xcompiler $_LT_TAGVAR(lt_prog_compiler_pic, $1)" + fi + ;; + esac + else + # PORTME Check for flag to pass linker flags through the system compiler. + case $host_os in + aix*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + if test ia64 = "$host_cpu"; then + # AIX 5 now supports IA64 processor + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + else + _LT_TAGVAR(lt_prog_compiler_static, $1)='-bnso -bI:/lib/syscalls.exp' + fi + ;; + + darwin* | rhapsody*) + # PIC is the default on this platform + # Common symbols not allowed in MH_DYLIB files + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fno-common' + case $cc_basename in + nagfor*) + # NAG Fortran compiler + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,-Wl,,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-PIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + esac + ;; + + mingw* | cygwin* | pw32* | os2* | cegcc*) + # This hack is so that the source file can tell whether it is being + # built for inclusion in a dll (and should export symbols for example). + m4_if([$1], [GCJ], [], + [_LT_TAGVAR(lt_prog_compiler_pic, $1)='-DDLL_EXPORT']) + case $host_os in + os2*) + _LT_TAGVAR(lt_prog_compiler_static, $1)='$wl-static' + ;; + esac + ;; + + hpux9* | hpux10* | hpux11*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + # PIC is the default for IA64 HP-UX and 64-bit HP-UX, but + # not for PA HP-UX. 
+ case $host_cpu in + hppa*64*|ia64*) + # +Z the default + ;; + *) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='+Z' + ;; + esac + # Is there a better lt_prog_compiler_static that works with the bundled CC? + _LT_TAGVAR(lt_prog_compiler_static, $1)='$wl-a ${wl}archive' + ;; + + irix5* | irix6* | nonstopux*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + # PIC (with -KPIC) is the default. + _LT_TAGVAR(lt_prog_compiler_static, $1)='-non_shared' + ;; + + linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + case $cc_basename in + # old Intel for x86_64, which still supported -KPIC. + ecc*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; + # icc used to be incompatible with GCC. + # ICC 10 doesn't accept -KPIC any more. + icc* | ifort*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; + # Lahey Fortran 8.1. + lf95*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='--shared' + _LT_TAGVAR(lt_prog_compiler_static, $1)='--static' + ;; + nagfor*) + # NAG Fortran compiler + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,-Wl,,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-PIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + tcc*) + # Fabrice Bellard et al's Tiny C Compiler + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; + pgcc* | pgf77* | pgf90* | pgf95* | pgfortran*) + # Portland Group compilers (*not* the Pentium gcc compiler, + # which looks to be a dead project) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fpic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + ccc*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + # All Alpha code is PIC. + _LT_TAGVAR(lt_prog_compiler_static, $1)='-non_shared' + ;; + xl* | bgxl* | bgf* | mpixl*) + # IBM XL C 8.0/Fortran 10.1, 11.1 on PPC and BlueGene + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-qpic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-qstaticlink' + ;; + *) + case `$CC -V 2>&1 | sed 5q` in + *Sun\ Ceres\ Fortran* | *Sun*Fortran*\ [[1-7]].* | *Sun*Fortran*\ 8.[[0-3]]*) + # Sun Fortran 8.3 passes all unrecognized flags to the linker + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + _LT_TAGVAR(lt_prog_compiler_wl, $1)='' + ;; + *Sun\ F* | *Sun*Fortran*) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Qoption ld ' + ;; + *Sun\ C*) + # Sun C 5.9 + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + ;; + *Intel*\ [[CF]]*Compiler*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; + *Portland\ Group*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fpic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + esac + ;; + esac + ;; + + newsos6) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + + *nto* | *qnx*) + # QNX uses GNU C++, but need to define -shared option too, otherwise + # it will coredump. 
+ _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC -shared' + ;; + + osf3* | osf4* | osf5*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + # All OSF/1 code is PIC. + _LT_TAGVAR(lt_prog_compiler_static, $1)='-non_shared' + ;; + + rdos*) + _LT_TAGVAR(lt_prog_compiler_static, $1)='-non_shared' + ;; + + solaris*) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + case $cc_basename in + f77* | f90* | f95* | sunf77* | sunf90* | sunf95*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Qoption ld ';; + *) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,';; + esac + ;; + + sunos4*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Qoption ld ' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-PIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + + sysv4 | sysv4.2uw2* | sysv4.3*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + + sysv4*MP*) + if test -d /usr/nec; then + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-Kconform_pic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + fi + ;; + + sysv5* | unixware* | sco3.2v5* | sco5v6* | OpenUNIX*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + + unicos*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_can_build_shared, $1)=no + ;; + + uts4*) + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-pic' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-Bstatic' + ;; + + *) + _LT_TAGVAR(lt_prog_compiler_can_build_shared, $1)=no + ;; + esac + fi +]) +case $host_os in + # For platforms that do not support PIC, -DPIC is meaningless: + *djgpp*) + _LT_TAGVAR(lt_prog_compiler_pic, $1)= + ;; + *) + _LT_TAGVAR(lt_prog_compiler_pic, $1)="$_LT_TAGVAR(lt_prog_compiler_pic, $1)@&t@m4_if([$1],[],[ -DPIC],[m4_if([$1],[CXX],[ -DPIC],[])])" + ;; +esac + +AC_CACHE_CHECK([for $compiler option to produce PIC], + [_LT_TAGVAR(lt_cv_prog_compiler_pic, $1)], + [_LT_TAGVAR(lt_cv_prog_compiler_pic, $1)=$_LT_TAGVAR(lt_prog_compiler_pic, $1)]) +_LT_TAGVAR(lt_prog_compiler_pic, $1)=$_LT_TAGVAR(lt_cv_prog_compiler_pic, $1) + +# +# Check to make sure the PIC flag actually works. +# +if test -n "$_LT_TAGVAR(lt_prog_compiler_pic, $1)"; then + _LT_COMPILER_OPTION([if $compiler PIC flag $_LT_TAGVAR(lt_prog_compiler_pic, $1) works], + [_LT_TAGVAR(lt_cv_prog_compiler_pic_works, $1)], + [$_LT_TAGVAR(lt_prog_compiler_pic, $1)@&t@m4_if([$1],[],[ -DPIC],[m4_if([$1],[CXX],[ -DPIC],[])])], [], + [case $_LT_TAGVAR(lt_prog_compiler_pic, $1) in + "" | " "*) ;; + *) _LT_TAGVAR(lt_prog_compiler_pic, $1)=" $_LT_TAGVAR(lt_prog_compiler_pic, $1)" ;; + esac], + [_LT_TAGVAR(lt_prog_compiler_pic, $1)= + _LT_TAGVAR(lt_prog_compiler_can_build_shared, $1)=no]) +fi +_LT_TAGDECL([pic_flag], [lt_prog_compiler_pic], [1], + [Additional compiler flags for building library objects]) + +_LT_TAGDECL([wl], [lt_prog_compiler_wl], [1], + [How to pass a linker flag through the compiler]) +# +# Check to make sure the static flag actually works. 
+# +wl=$_LT_TAGVAR(lt_prog_compiler_wl, $1) eval lt_tmp_static_flag=\"$_LT_TAGVAR(lt_prog_compiler_static, $1)\" +_LT_LINKER_OPTION([if $compiler static flag $lt_tmp_static_flag works], + _LT_TAGVAR(lt_cv_prog_compiler_static_works, $1), + $lt_tmp_static_flag, + [], + [_LT_TAGVAR(lt_prog_compiler_static, $1)=]) +_LT_TAGDECL([link_static_flag], [lt_prog_compiler_static], [1], + [Compiler flag to prevent dynamic linking]) +])# _LT_COMPILER_PIC + + +# _LT_LINKER_SHLIBS([TAGNAME]) +# ---------------------------- +# See if the linker supports building shared libraries. +m4_defun([_LT_LINKER_SHLIBS], +[AC_REQUIRE([LT_PATH_LD])dnl +AC_REQUIRE([LT_PATH_NM])dnl +m4_require([_LT_PATH_MANIFEST_TOOL])dnl +m4_require([_LT_FILEUTILS_DEFAULTS])dnl +m4_require([_LT_DECL_EGREP])dnl +m4_require([_LT_DECL_SED])dnl +m4_require([_LT_CMD_GLOBAL_SYMBOLS])dnl +m4_require([_LT_TAG_COMPILER])dnl +AC_MSG_CHECKING([whether the $compiler linker ($LD) supports shared libraries]) +m4_if([$1], [CXX], [ + _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED '\''s/.* //'\'' | sort | uniq > $export_symbols' + _LT_TAGVAR(exclude_expsyms, $1)=['_GLOBAL_OFFSET_TABLE_|_GLOBAL__F[ID]_.*'] + case $host_os in + aix[[4-9]]*) + # If we're using GNU nm, then we don't want the "-C" option. + # -C means demangle to GNU nm, but means don't demangle to AIX nm. + # Without the "-l" option, or with the "-B" option, AIX nm treats + # weak defined symbols like other global defined symbols, whereas + # GNU nm marks them as "W". + # While the 'weak' keyword is ignored in the Export File, we need + # it in the Import File for the 'aix-soname' feature, so we have + # to replace the "-B" option with "-P" for AIX nm. + if $NM -V 2>&1 | $GREP 'GNU' > /dev/null; then + _LT_TAGVAR(export_symbols_cmds, $1)='$NM -Bpg $libobjs $convenience | awk '\''{ if (((\$ 2 == "T") || (\$ 2 == "D") || (\$ 2 == "B") || (\$ 2 == "W")) && ([substr](\$ 3,1,1) != ".")) { if (\$ 2 == "W") { print \$ 3 " weak" } else { print \$ 3 } } }'\'' | sort -u > $export_symbols' + else + _LT_TAGVAR(export_symbols_cmds, $1)='`func_echo_all $NM | $SED -e '\''s/B\([[^B]]*\)$/P\1/'\''` -PCpgl $libobjs $convenience | awk '\''{ if (((\$ 2 == "T") || (\$ 2 == "D") || (\$ 2 == "B") || (\$ 2 == "W") || (\$ 2 == "V") || (\$ 2 == "Z")) && ([substr](\$ 1,1,1) != ".")) { if ((\$ 2 == "W") || (\$ 2 == "V") || (\$ 2 == "Z")) { print \$ 1 " weak" } else { print \$ 1 } } }'\'' | sort -u > $export_symbols' + fi + ;; + pw32*) + _LT_TAGVAR(export_symbols_cmds, $1)=$ltdll_cmds + ;; + cygwin* | mingw* | cegcc*) + case $cc_basename in + cl*) + _LT_TAGVAR(exclude_expsyms, $1)='_NULL_IMPORT_DESCRIPTOR|_IMPORT_DESCRIPTOR_.*' + ;; + *) + _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED -e '\''/^[[BCDGRS]][[ ]]/s/.*[[ ]]\([[^ ]]*\)/\1 DATA/;s/^.*[[ ]]__nm__\([[^ ]]*\)[[ ]][[^ ]]*/\1 DATA/;/^I[[ ]]/d;/^[[AITW]][[ ]]/s/.* //'\'' | sort | uniq > $export_symbols' + _LT_TAGVAR(exclude_expsyms, $1)=['[_]+GLOBAL_OFFSET_TABLE_|[_]+GLOBAL__[FID]_.*|[_]+head_[A-Za-z0-9_]+_dll|[A-Za-z0-9_]+_dll_iname'] + ;; + esac + ;; + linux* | k*bsd*-gnu | gnu*) + _LT_TAGVAR(link_all_deplibs, $1)=no + ;; + *) + _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED '\''s/.* //'\'' | sort | uniq > $export_symbols' + ;; + esac +], [ + runpath_var= + _LT_TAGVAR(allow_undefined_flag, $1)= + _LT_TAGVAR(always_export_symbols, $1)=no + _LT_TAGVAR(archive_cmds, $1)= + _LT_TAGVAR(archive_expsym_cmds, $1)= + 
_LT_TAGVAR(compiler_needs_object, $1)=no + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=no + _LT_TAGVAR(export_dynamic_flag_spec, $1)= + _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED '\''s/.* //'\'' | sort | uniq > $export_symbols' + _LT_TAGVAR(hardcode_automatic, $1)=no + _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_direct_absolute, $1)=no + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)= + _LT_TAGVAR(hardcode_libdir_separator, $1)= + _LT_TAGVAR(hardcode_minus_L, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=unsupported + _LT_TAGVAR(inherit_rpath, $1)=no + _LT_TAGVAR(link_all_deplibs, $1)=unknown + _LT_TAGVAR(module_cmds, $1)= + _LT_TAGVAR(module_expsym_cmds, $1)= + _LT_TAGVAR(old_archive_from_new_cmds, $1)= + _LT_TAGVAR(old_archive_from_expsyms_cmds, $1)= + _LT_TAGVAR(thread_safe_flag_spec, $1)= + _LT_TAGVAR(whole_archive_flag_spec, $1)= + # include_expsyms should be a list of space-separated symbols to be *always* + # included in the symbol list + _LT_TAGVAR(include_expsyms, $1)= + # exclude_expsyms can be an extended regexp of symbols to exclude + # it will be wrapped by ' (' and ')$', so one must not match beginning or + # end of line. Example: 'a|bc|.*d.*' will exclude the symbols 'a' and 'bc', + # as well as any symbol that contains 'd'. + _LT_TAGVAR(exclude_expsyms, $1)=['_GLOBAL_OFFSET_TABLE_|_GLOBAL__F[ID]_.*'] + # Although _GLOBAL_OFFSET_TABLE_ is a valid symbol C name, most a.out + # platforms (ab)use it in PIC code, but their linkers get confused if + # the symbol is explicitly referenced. Since portable code cannot + # rely on this symbol name, it's probably fine to never include it in + # preloaded symbol tables. + # Exclude shared library initialization/finalization symbols. +dnl Note also adjust exclude_expsyms for C++ above. + extract_expsyms_cmds= + + case $host_os in + cygwin* | mingw* | pw32* | cegcc*) + # FIXME: the MSVC++ port hasn't been tested in a loooong time + # When not using gcc, we currently assume that we are using + # Microsoft Visual C++. + if test yes != "$GCC"; then + with_gnu_ld=no + fi + ;; + interix*) + # we just hope/assume this is gcc and not c89 (= MSVC++) + with_gnu_ld=yes + ;; + openbsd* | bitrig*) + with_gnu_ld=no + ;; + linux* | k*bsd*-gnu | gnu*) + _LT_TAGVAR(link_all_deplibs, $1)=no + ;; + esac + + _LT_TAGVAR(ld_shlibs, $1)=yes + + # On some targets, GNU ld is compatible enough with the native linker + # that we're better off using the native interface for both. + lt_use_gnu_ld_interface=no + if test yes = "$with_gnu_ld"; then + case $host_os in + aix*) + # The AIX port of GNU ld has always aspired to compatibility + # with the native linker. However, as the warning in the GNU ld + # block says, versions before 2.19.5* couldn't really create working + # shared libraries, regardless of the interface used. + case `$LD -v 2>&1` in + *\ \(GNU\ Binutils\)\ 2.19.5*) ;; + *\ \(GNU\ Binutils\)\ 2.[[2-9]]*) ;; + *\ \(GNU\ Binutils\)\ [[3-9]]*) ;; + *) + lt_use_gnu_ld_interface=yes + ;; + esac + ;; + *) + lt_use_gnu_ld_interface=yes + ;; + esac + fi + + if test yes = "$lt_use_gnu_ld_interface"; then + # If archive_cmds runs LD, not CC, wlarc should be empty + wlarc='$wl' + + # Set some defaults for GNU ld with shared library support. These + # are reset later if shared libraries are not supported. Putting them + # here allows them to be overridden if necessary. 
+ runpath_var=LD_RUN_PATH + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-dynamic' + # ancient GNU ld didn't support --whole-archive et. al. + if $LD --help 2>&1 | $GREP 'no-whole-archive' > /dev/null; then + _LT_TAGVAR(whole_archive_flag_spec, $1)=$wlarc'--whole-archive$convenience '$wlarc'--no-whole-archive' + else + _LT_TAGVAR(whole_archive_flag_spec, $1)= + fi + supports_anon_versioning=no + case `$LD -v | $SED -e 's/([^)]\+)\s\+//' 2>&1` in + *GNU\ gold*) supports_anon_versioning=yes ;; + *\ [[01]].* | *\ 2.[[0-9]].* | *\ 2.10.*) ;; # catch versions < 2.11 + *\ 2.11.93.0.2\ *) supports_anon_versioning=yes ;; # RH7.3 ... + *\ 2.11.92.0.12\ *) supports_anon_versioning=yes ;; # Mandrake 8.2 ... + *\ 2.11.*) ;; # other 2.11 versions + *) supports_anon_versioning=yes ;; + esac + + # See if GNU ld supports shared libraries. + case $host_os in + aix[[3-9]]*) + # On AIX/PPC, the GNU linker is very broken + if test ia64 != "$host_cpu"; then + _LT_TAGVAR(ld_shlibs, $1)=no + cat <<_LT_EOF 1>&2 + +*** Warning: the GNU linker, at least up to release 2.19, is reported +*** to be unable to reliably create shared libraries on AIX. +*** Therefore, libtool is disabling shared libraries support. If you +*** really care for shared libraries, you may want to install binutils +*** 2.20 or above, or modify your PATH so that a non-GNU linker is found. +*** You will then need to restart the configuration process. + +_LT_EOF + fi + ;; + + amigaos*) + case $host_cpu in + powerpc) + # see comment about AmigaOS4 .so support + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='' + ;; + m68k) + _LT_TAGVAR(archive_cmds, $1)='$RM $output_objdir/a2ixlibrary.data~$ECHO "#define NAME $libname" > $output_objdir/a2ixlibrary.data~$ECHO "#define LIBRARY_ID 1" >> $output_objdir/a2ixlibrary.data~$ECHO "#define VERSION $major" >> $output_objdir/a2ixlibrary.data~$ECHO "#define REVISION $revision" >> $output_objdir/a2ixlibrary.data~$AR $AR_FLAGS $lib $libobjs~$RANLIB $lib~(cd $output_objdir && a2ixlibrary -32)' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_minus_L, $1)=yes + ;; + esac + ;; + + beos*) + if $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + # Joseph Beckenbach says some releases of gcc + # support --undefined. This deserves some investigation. FIXME + _LT_TAGVAR(archive_cmds, $1)='$CC -nostart $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + + cygwin* | mingw* | pw32* | cegcc*) + # _LT_TAGVAR(hardcode_libdir_flag_spec, $1) is actually meaningless, + # as there is no search path for DLLs. 
+ _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-all-symbols' + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + _LT_TAGVAR(always_export_symbols, $1)=no + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED -e '\''/^[[BCDGRS]][[ ]]/s/.*[[ ]]\([[^ ]]*\)/\1 DATA/;s/^.*[[ ]]__nm__\([[^ ]]*\)[[ ]][[^ ]]*/\1 DATA/;/^I[[ ]]/d;/^[[AITW]][[ ]]/s/.* //'\'' | sort | uniq > $export_symbols' + _LT_TAGVAR(exclude_expsyms, $1)=['[_]+GLOBAL_OFFSET_TABLE_|[_]+GLOBAL__[FID]_.*|[_]+head_[A-Za-z0-9_]+_dll|[A-Za-z0-9_]+_dll_iname'] + + if $LD --help 2>&1 | $GREP 'auto-import' > /dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags -o $output_objdir/$soname $wl--enable-auto-image-base -Xlinker --out-implib -Xlinker $lib' + # If the export-symbols file already is a .def file, use it as + # is; otherwise, prepend EXPORTS... + _LT_TAGVAR(archive_expsym_cmds, $1)='if _LT_DLL_DEF_P([$export_symbols]); then + cp $export_symbols $output_objdir/$soname.def; + else + echo EXPORTS > $output_objdir/$soname.def; + cat $export_symbols >> $output_objdir/$soname.def; + fi~ + $CC -shared $output_objdir/$soname.def $libobjs $deplibs $compiler_flags -o $output_objdir/$soname $wl--enable-auto-image-base -Xlinker --out-implib -Xlinker $lib' + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + + haiku*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(link_all_deplibs, $1)=yes + ;; + + os2*) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + shrext_cmds=.dll + _LT_TAGVAR(archive_cmds, $1)='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + emxexp $libobjs | $SED /"_DLL_InitTerm"/d >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + _LT_TAGVAR(archive_expsym_cmds, $1)='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + prefix_cmds="$SED"~ + if test EXPORTS = "`$SED 1q $export_symbols`"; then + prefix_cmds="$prefix_cmds -e 1d"; + fi~ + prefix_cmds="$prefix_cmds -e \"s/^\(.*\)$/_\1/g\""~ + cat $export_symbols | $prefix_cmds >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + _LT_TAGVAR(old_archive_From_new_cmds, $1)='emximp -o $output_objdir/${libname}_dll.a $output_objdir/$libname.def' + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + ;; + + interix[[3-9]]*) + _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + # Hack: On Interix 3.x, we cannot compile PIC because of a broken gcc. 
+ # Instead, shared libraries are loaded at an image base (0x10000000 by + # default) and relocated if they conflict, which is a slow very memory + # consuming and fragmenting process. To avoid this, we pick a random, + # 256 KiB-aligned image base between 0x50000000 and 0x6FFC0000 at link + # time. Moving up from 0x10000000 also allows more sbrk(2) space. + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-h,$soname $wl--image-base,`expr ${RANDOM-$$} % 4096 / 2 \* 262144 + 1342177280` -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='sed "s|^|_|" $export_symbols >$output_objdir/$soname.expsym~$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-h,$soname $wl--retain-symbols-file,$output_objdir/$soname.expsym $wl--image-base,`expr ${RANDOM-$$} % 4096 / 2 \* 262144 + 1342177280` -o $lib' + ;; + + gnu* | linux* | tpf* | k*bsd*-gnu | kopensolaris*-gnu) + tmp_diet=no + if test linux-dietlibc = "$host_os"; then + case $cc_basename in + diet\ *) tmp_diet=yes;; # linux-dietlibc with static linking (!diet-dyn) + esac + fi + if $LD --help 2>&1 | $EGREP ': supported targets:.* elf' > /dev/null \ + && test no = "$tmp_diet" + then + tmp_addflag=' $pic_flag' + tmp_sharedflag='-shared' + case $cc_basename,$host_cpu in + pgcc*) # Portland Group C compiler + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + tmp_addflag=' $pic_flag' + ;; + pgf77* | pgf90* | pgf95* | pgfortran*) + # Portland Group f77 and f90 compilers + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + tmp_addflag=' $pic_flag -Mnomain' ;; + ecc*,ia64* | icc*,ia64*) # Intel C compiler on ia64 + tmp_addflag=' -i_dynamic' ;; + efc*,ia64* | ifort*,ia64*) # Intel Fortran compiler on ia64 + tmp_addflag=' -i_dynamic -nofor_main' ;; + ifc* | ifort*) # Intel Fortran compiler + tmp_addflag=' -nofor_main' ;; + lf95*) # Lahey Fortran 8.1 + _LT_TAGVAR(whole_archive_flag_spec, $1)= + tmp_sharedflag='--shared' ;; + nagfor*) # NAGFOR 5.3 + tmp_sharedflag='-Wl,-shared' ;; + xl[[cC]]* | bgxl[[cC]]* | mpixl[[cC]]*) # IBM XL C 8.0 on PPC (deal with xlf below) + tmp_sharedflag='-qmkshrobj' + tmp_addflag= ;; + nvcc*) # Cuda Compiler Driver 2.2 + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + _LT_TAGVAR(compiler_needs_object, $1)=yes + ;; + esac + case `$CC -V 2>&1 | sed 5q` in + *Sun\ C*) # Sun C 5.9 + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive`new_convenience=; for conv in $convenience\"\"; do test -z \"$conv\" || new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + _LT_TAGVAR(compiler_needs_object, $1)=yes + tmp_sharedflag='-G' ;; + *Sun\ F*) # Sun Fortran 8.3 + tmp_sharedflag='-G' ;; + esac + _LT_TAGVAR(archive_cmds, $1)='$CC '"$tmp_sharedflag""$tmp_addflag"' $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + + if test yes = "$supports_anon_versioning"; then + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $output_objdir/$libname.ver~ + cat $export_symbols | sed -e "s/\(.*\)/\1;/" >> 
$output_objdir/$libname.ver~ + echo "local: *; };" >> $output_objdir/$libname.ver~ + $CC '"$tmp_sharedflag""$tmp_addflag"' $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-version-script $wl$output_objdir/$libname.ver -o $lib' + fi + + case $cc_basename in + tcc*) + _LT_TAGVAR(export_dynamic_flag_spec, $1)='-rdynamic' + ;; + xlf* | bgf* | bgxlf* | mpixlf*) + # IBM XL Fortran 10.1 on PPC cannot create shared libs itself + _LT_TAGVAR(whole_archive_flag_spec, $1)='--whole-archive$convenience --no-whole-archive' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(archive_cmds, $1)='$LD -shared $libobjs $deplibs $linker_flags -soname $soname -o $lib' + if test yes = "$supports_anon_versioning"; then + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $output_objdir/$libname.ver~ + cat $export_symbols | sed -e "s/\(.*\)/\1;/" >> $output_objdir/$libname.ver~ + echo "local: *; };" >> $output_objdir/$libname.ver~ + $LD -shared $libobjs $deplibs $linker_flags -soname $soname -version-script $output_objdir/$libname.ver -o $lib' + fi + ;; + esac + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + + netbsd* | netbsdelf*-gnu) + if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' + wlarc= + else + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + fi + ;; + + solaris*) + if $LD -v 2>&1 | $GREP 'BFD 2\.8' > /dev/null; then + _LT_TAGVAR(ld_shlibs, $1)=no + cat <<_LT_EOF 1>&2 + +*** Warning: The releases 2.8.* of the GNU linker cannot reliably +*** create shared libraries on Solaris systems. Therefore, libtool +*** is disabling shared libraries support. We urge you to upgrade GNU +*** binutils to release 2.9.1 or newer. Another option is to modify +*** your PATH or compiler configuration so that the native linker is +*** used, and then restart. + +_LT_EOF + elif $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + + sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX*) + case `$LD -v 2>&1` in + *\ [[01]].* | *\ 2.[[0-9]].* | *\ 2.1[[0-5]].*) + _LT_TAGVAR(ld_shlibs, $1)=no + cat <<_LT_EOF 1>&2 + +*** Warning: Releases of the GNU linker prior to 2.16.91.0.3 cannot +*** reliably create shared libraries on SCO systems. Therefore, libtool +*** is disabling shared libraries support. We urge you to upgrade GNU +*** binutils to release 2.16.91.0.3 or newer. Another option is to modify +*** your PATH or compiler configuration so that the native linker is +*** used, and then restart. + +_LT_EOF + ;; + *) + # For security reasons, it is highly recommended that you always + # use absolute paths for naming shared libraries, and exclude the + # DT_RUNPATH tag from executables and libraries. But doing so + # requires that you compile everything twice, which is a pain. 
+ if $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + esac + ;; + + sunos4*) + _LT_TAGVAR(archive_cmds, $1)='$LD -assert pure-text -Bshareable -o $lib $libobjs $deplibs $linker_flags' + wlarc= + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + *) + if $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + esac + + if test no = "$_LT_TAGVAR(ld_shlibs, $1)"; then + runpath_var= + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)= + _LT_TAGVAR(export_dynamic_flag_spec, $1)= + _LT_TAGVAR(whole_archive_flag_spec, $1)= + fi + else + # PORTME fill in a description of your system's linker (not GNU ld) + case $host_os in + aix3*) + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + _LT_TAGVAR(always_export_symbols, $1)=yes + _LT_TAGVAR(archive_expsym_cmds, $1)='$LD -o $output_objdir/$soname $libobjs $deplibs $linker_flags -bE:$export_symbols -T512 -H512 -bM:SRE~$AR $AR_FLAGS $lib $output_objdir/$soname' + # Note: this linker hardcodes the directories in LIBPATH if there + # are no directories specified by -L. + _LT_TAGVAR(hardcode_minus_L, $1)=yes + if test yes = "$GCC" && test -z "$lt_prog_compiler_static"; then + # Neither direct hardcoding nor static linking is supported with a + # broken collect2. + _LT_TAGVAR(hardcode_direct, $1)=unsupported + fi + ;; + + aix[[4-9]]*) + if test ia64 = "$host_cpu"; then + # On IA64, the linker does run time linking by default, so we don't + # have to do anything special. + aix_use_runtimelinking=no + exp_sym_flag='-Bexport' + no_entry_flag= + else + # If we're using GNU nm, then we don't want the "-C" option. + # -C means demangle to GNU nm, but means don't demangle to AIX nm. + # Without the "-l" option, or with the "-B" option, AIX nm treats + # weak defined symbols like other global defined symbols, whereas + # GNU nm marks them as "W". + # While the 'weak' keyword is ignored in the Export File, we need + # it in the Import File for the 'aix-soname' feature, so we have + # to replace the "-B" option with "-P" for AIX nm. 
+ if $NM -V 2>&1 | $GREP 'GNU' > /dev/null; then + _LT_TAGVAR(export_symbols_cmds, $1)='$NM -Bpg $libobjs $convenience | awk '\''{ if (((\$ 2 == "T") || (\$ 2 == "D") || (\$ 2 == "B") || (\$ 2 == "W")) && ([substr](\$ 3,1,1) != ".")) { if (\$ 2 == "W") { print \$ 3 " weak" } else { print \$ 3 } } }'\'' | sort -u > $export_symbols' + else + _LT_TAGVAR(export_symbols_cmds, $1)='`func_echo_all $NM | $SED -e '\''s/B\([[^B]]*\)$/P\1/'\''` -PCpgl $libobjs $convenience | awk '\''{ if (((\$ 2 == "T") || (\$ 2 == "D") || (\$ 2 == "B") || (\$ 2 == "W") || (\$ 2 == "V") || (\$ 2 == "Z")) && ([substr](\$ 1,1,1) != ".")) { if ((\$ 2 == "W") || (\$ 2 == "V") || (\$ 2 == "Z")) { print \$ 1 " weak" } else { print \$ 1 } } }'\'' | sort -u > $export_symbols' + fi + aix_use_runtimelinking=no + + # Test if we are trying to use run time linking or normal + # AIX style linking. If -brtl is somewhere in LDFLAGS, we + # have runtime linking enabled, and use it for executables. + # For shared libraries, we enable/disable runtime linking + # depending on the kind of the shared library created - + # when "with_aix_soname,aix_use_runtimelinking" is: + # "aix,no" lib.a(lib.so.V) shared, rtl:no, for executables + # "aix,yes" lib.so shared, rtl:yes, for executables + # lib.a static archive + # "both,no" lib.so.V(shr.o) shared, rtl:yes + # lib.a(lib.so.V) shared, rtl:no, for executables + # "both,yes" lib.so.V(shr.o) shared, rtl:yes, for executables + # lib.a(lib.so.V) shared, rtl:no + # "svr4,*" lib.so.V(shr.o) shared, rtl:yes, for executables + # lib.a static archive + case $host_os in aix4.[[23]]|aix4.[[23]].*|aix[[5-9]]*) + for ld_flag in $LDFLAGS; do + if (test x-brtl = "x$ld_flag" || test x-Wl,-brtl = "x$ld_flag"); then + aix_use_runtimelinking=yes + break + fi + done + if test svr4,no = "$with_aix_soname,$aix_use_runtimelinking"; then + # With aix-soname=svr4, we create the lib.so.V shared archives only, + # so we don't have lib.a shared libs to link our executables. + # We have to force runtime linking in this case. + aix_use_runtimelinking=yes + LDFLAGS="$LDFLAGS -Wl,-brtl" + fi + ;; + esac + + exp_sym_flag='-bexport' + no_entry_flag='-bnoentry' + fi + + # When large executables or shared objects are built, AIX ld can + # have problems creating the table of contents. If linking a library + # or program results in "error TOC overflow" add -mminimal-toc to + # CXXFLAGS/CFLAGS for g++/gcc. In the cases where that is not + # enough to fix the problem, add -Wl,-bbigtoc to LDFLAGS. + + _LT_TAGVAR(archive_cmds, $1)='' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_direct_absolute, $1)=yes + _LT_TAGVAR(hardcode_libdir_separator, $1)=':' + _LT_TAGVAR(link_all_deplibs, $1)=yes + _LT_TAGVAR(file_list_spec, $1)='$wl-f,' + case $with_aix_soname,$aix_use_runtimelinking in + aix,*) ;; # traditional, no import file + svr4,* | *,yes) # use import file + # The Import File defines what to hardcode. 
+ _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_direct_absolute, $1)=no + ;; + esac + + if test yes = "$GCC"; then + case $host_os in aix4.[[012]]|aix4.[[012]].*) + # We only want to do this on AIX 4.2 and lower, the check + # below for broken collect2 doesn't work under 4.3+ + collect2name=`$CC -print-prog-name=collect2` + if test -f "$collect2name" && + strings "$collect2name" | $GREP resolve_lib_name >/dev/null + then + # We have reworked collect2 + : + else + # We have old collect2 + _LT_TAGVAR(hardcode_direct, $1)=unsupported + # It fails to find uninstalled libraries when the uninstalled + # path is not listed in the libpath. Setting hardcode_minus_L + # to unsupported forces relinking + _LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)= + fi + ;; + esac + shared_flag='-shared' + if test yes = "$aix_use_runtimelinking"; then + shared_flag="$shared_flag "'$wl-G' + fi + # Need to ensure runtime linking is disabled for the traditional + # shared library, or the linker may eventually find shared libraries + # /with/ Import File - we do not want to mix them. + shared_flag_aix='-shared' + shared_flag_svr4='-shared $wl-G' + else + # not using gcc + if test ia64 = "$host_cpu"; then + # VisualAge C++, Version 5.5 for AIX 5L for IA-64, Beta 3 Release + # chokes on -Wl,-G. The following line is correct: + shared_flag='-G' + else + if test yes = "$aix_use_runtimelinking"; then + shared_flag='$wl-G' + else + shared_flag='$wl-bM:SRE' + fi + shared_flag_aix='$wl-bM:SRE' + shared_flag_svr4='$wl-G' + fi + fi + + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-bexpall' + # It seems that -bexpall does not export symbols beginning with + # underscore (_), so it is better to generate a list of symbols to export. + _LT_TAGVAR(always_export_symbols, $1)=yes + if test aix,yes = "$with_aix_soname,$aix_use_runtimelinking"; then + # Warning - without using the other runtime loading flags (-brtl), + # -berok will link without error, but may produce a broken library. + _LT_TAGVAR(allow_undefined_flag, $1)='-berok' + # Determine the default libpath from the value encoded in an + # empty executable. + _LT_SYS_MODULE_PATH_AIX([$1]) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-blibpath:$libdir:'"$aix_libpath" + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -o $output_objdir/$soname $libobjs $deplibs $wl'$no_entry_flag' $compiler_flags `if test -n "$allow_undefined_flag"; then func_echo_all "$wl$allow_undefined_flag"; else :; fi` $wl'$exp_sym_flag:\$export_symbols' '$shared_flag + else + if test ia64 = "$host_cpu"; then + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-R $libdir:/usr/lib:/lib' + _LT_TAGVAR(allow_undefined_flag, $1)="-z nodefs" + _LT_TAGVAR(archive_expsym_cmds, $1)="\$CC $shared_flag"' -o $output_objdir/$soname $libobjs $deplibs '"\$wl$no_entry_flag"' $compiler_flags $wl$allow_undefined_flag '"\$wl$exp_sym_flag:\$export_symbols" + else + # Determine the default libpath from the value encoded in an + # empty executable. + _LT_SYS_MODULE_PATH_AIX([$1]) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-blibpath:$libdir:'"$aix_libpath" + # Warning - without using the other run time loading flags, + # -berok will link without error, but may produce a broken library. + _LT_TAGVAR(no_undefined_flag, $1)=' $wl-bernotok' + _LT_TAGVAR(allow_undefined_flag, $1)=' $wl-berok' + if test yes = "$with_gnu_ld"; then + # We only use this code for GNU lds that support --whole-archive. 
+ _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive$convenience $wl--no-whole-archive' + else + # Exported symbols can be pulled into shared objects from archives + _LT_TAGVAR(whole_archive_flag_spec, $1)='$convenience' + fi + _LT_TAGVAR(archive_cmds_need_lc, $1)=yes + _LT_TAGVAR(archive_expsym_cmds, $1)='$RM -r $output_objdir/$realname.d~$MKDIR $output_objdir/$realname.d' + # -brtl affects multiple linker settings, -berok does not and is overridden later + compiler_flags_filtered='`func_echo_all "$compiler_flags " | $SED -e "s%-brtl\\([[, ]]\\)%-berok\\1%g"`' + if test svr4 != "$with_aix_soname"; then + # This is similar to how AIX traditionally builds its shared libraries. + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$CC '$shared_flag_aix' -o $output_objdir/$realname.d/$soname $libobjs $deplibs $wl-bnoentry '$compiler_flags_filtered'$wl-bE:$export_symbols$allow_undefined_flag~$AR $AR_FLAGS $output_objdir/$libname$release.a $output_objdir/$realname.d/$soname' + fi + if test aix != "$with_aix_soname"; then + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$CC '$shared_flag_svr4' -o $output_objdir/$realname.d/$shared_archive_member_spec.o $libobjs $deplibs $wl-bnoentry '$compiler_flags_filtered'$wl-bE:$export_symbols$allow_undefined_flag~$STRIP -e $output_objdir/$realname.d/$shared_archive_member_spec.o~( func_echo_all "#! $soname($shared_archive_member_spec.o)"; if test shr_64 = "$shared_archive_member_spec"; then func_echo_all "# 64"; else func_echo_all "# 32"; fi; cat $export_symbols ) > $output_objdir/$realname.d/$shared_archive_member_spec.imp~$AR $AR_FLAGS $output_objdir/$soname $output_objdir/$realname.d/$shared_archive_member_spec.o $output_objdir/$realname.d/$shared_archive_member_spec.imp' + else + # used by -dlpreopen to get the symbols + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$MV $output_objdir/$realname.d/$soname $output_objdir' + fi + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$RM -r $output_objdir/$realname.d' + fi + fi + ;; + + amigaos*) + case $host_cpu in + powerpc) + # see comment about AmigaOS4 .so support + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='' + ;; + m68k) + _LT_TAGVAR(archive_cmds, $1)='$RM $output_objdir/a2ixlibrary.data~$ECHO "#define NAME $libname" > $output_objdir/a2ixlibrary.data~$ECHO "#define LIBRARY_ID 1" >> $output_objdir/a2ixlibrary.data~$ECHO "#define VERSION $major" >> $output_objdir/a2ixlibrary.data~$ECHO "#define REVISION $revision" >> $output_objdir/a2ixlibrary.data~$AR $AR_FLAGS $lib $libobjs~$RANLIB $lib~(cd $output_objdir && a2ixlibrary -32)' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_minus_L, $1)=yes + ;; + esac + ;; + + bsdi[[45]]*) + _LT_TAGVAR(export_dynamic_flag_spec, $1)=-rdynamic + ;; + + cygwin* | mingw* | pw32* | cegcc*) + # When not using gcc, we currently assume that we are using + # Microsoft Visual C++. + # hardcode_libdir_flag_spec is actually meaningless, as there is + # no search path for DLLs. + case $cc_basename in + cl*) + # Native MSVC + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)=' ' + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + _LT_TAGVAR(always_export_symbols, $1)=yes + _LT_TAGVAR(file_list_spec, $1)='@' + # Tell ltmain to make .lib files, not .a files. + libext=lib + # Tell ltmain to make .dll files, not .so files. 
+ shrext_cmds=.dll + # FIXME: Setting linknames here is a bad hack. + _LT_TAGVAR(archive_cmds, $1)='$CC -o $output_objdir/$soname $libobjs $compiler_flags $deplibs -Wl,-DLL,-IMPLIB:"$tool_output_objdir$libname.dll.lib"~linknames=' + _LT_TAGVAR(archive_expsym_cmds, $1)='if _LT_DLL_DEF_P([$export_symbols]); then + cp "$export_symbols" "$output_objdir/$soname.def"; + echo "$tool_output_objdir$soname.def" > "$output_objdir/$soname.exp"; + else + $SED -e '\''s/^/-link -EXPORT:/'\'' < $export_symbols > $output_objdir/$soname.exp; + fi~ + $CC -o $tool_output_objdir$soname $libobjs $compiler_flags $deplibs "@$tool_output_objdir$soname.exp" -Wl,-DLL,-IMPLIB:"$tool_output_objdir$libname.dll.lib"~ + linknames=' + # The linker will not automatically build a static lib if we build a DLL. + # _LT_TAGVAR(old_archive_from_new_cmds, $1)='true' + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + _LT_TAGVAR(exclude_expsyms, $1)='_NULL_IMPORT_DESCRIPTOR|_IMPORT_DESCRIPTOR_.*' + _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED -e '\''/^[[BCDGRS]][[ ]]/s/.*[[ ]]\([[^ ]]*\)/\1,DATA/'\'' | $SED -e '\''/^[[AITW]][[ ]]/s/.*[[ ]]//'\'' | sort | uniq > $export_symbols' + # Don't use ranlib + _LT_TAGVAR(old_postinstall_cmds, $1)='chmod 644 $oldlib' + _LT_TAGVAR(postlink_cmds, $1)='lt_outputfile="@OUTPUT@"~ + lt_tool_outputfile="@TOOL_OUTPUT@"~ + case $lt_outputfile in + *.exe|*.EXE) ;; + *) + lt_outputfile=$lt_outputfile.exe + lt_tool_outputfile=$lt_tool_outputfile.exe + ;; + esac~ + if test : != "$MANIFEST_TOOL" && test -f "$lt_outputfile.manifest"; then + $MANIFEST_TOOL -manifest "$lt_tool_outputfile.manifest" -outputresource:"$lt_tool_outputfile" || exit 1; + $RM "$lt_outputfile.manifest"; + fi' + ;; + *) + # Assume MSVC wrapper + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)=' ' + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + # Tell ltmain to make .lib files, not .a files. + libext=lib + # Tell ltmain to make .dll files, not .so files. + shrext_cmds=.dll + # FIXME: Setting linknames here is a bad hack. + _LT_TAGVAR(archive_cmds, $1)='$CC -o $lib $libobjs $compiler_flags `func_echo_all "$deplibs" | $SED '\''s/ -lc$//'\''` -link -dll~linknames=' + # The linker will automatically build a .lib file if we build a DLL. + _LT_TAGVAR(old_archive_from_new_cmds, $1)='true' + # FIXME: Should let the user specify the lib program. + _LT_TAGVAR(old_archive_cmds, $1)='lib -OUT:$oldlib$oldobjs$old_deplibs' + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + ;; + esac + ;; + + darwin* | rhapsody*) + _LT_DARWIN_LINKER_FEATURES($1) + ;; + + dgux*) + _LT_TAGVAR(archive_cmds, $1)='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + # FreeBSD 2.2.[012] allows us to include c++rt0.o to get C++ constructor + # support. Future versions do this automatically, but an explicit c++rt0.o + # does not break anything, and helps significantly (at the cost of a little + # extra space). + freebsd2.2*) + _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags /usr/lib/c++rt0.o' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-R$libdir' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + # Unfortunately, older versions of FreeBSD 2 do not have this feature. 
+ freebsd2.*) + _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + # FreeBSD 3 and greater uses gcc -shared to do shared libraries. + freebsd* | dragonfly*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-R$libdir' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + hpux9*) + if test yes = "$GCC"; then + _LT_TAGVAR(archive_cmds, $1)='$RM $output_objdir/$soname~$CC -shared $pic_flag $wl+b $wl$install_libdir -o $output_objdir/$soname $libobjs $deplibs $compiler_flags~test "x$output_objdir/$soname" = "x$lib" || mv $output_objdir/$soname $lib' + else + _LT_TAGVAR(archive_cmds, $1)='$RM $output_objdir/$soname~$LD -b +b $install_libdir -o $output_objdir/$soname $libobjs $deplibs $linker_flags~test "x$output_objdir/$soname" = "x$lib" || mv $output_objdir/$soname $lib' + fi + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl+b $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + _LT_TAGVAR(hardcode_direct, $1)=yes + + # hardcode_minus_L: Not really in the search PATH, + # but as the default location of the library. + _LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + ;; + + hpux10*) + if test yes,no = "$GCC,$with_gnu_ld"; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $libobjs $deplibs $compiler_flags' + else + _LT_TAGVAR(archive_cmds, $1)='$LD -b +h $soname +b $install_libdir -o $lib $libobjs $deplibs $linker_flags' + fi + if test no = "$with_gnu_ld"; then + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl+b $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_direct_absolute, $1)=yes + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + # hardcode_minus_L: Not really in the search PATH, + # but as the default location of the library. 
+ _LT_TAGVAR(hardcode_minus_L, $1)=yes + fi + ;; + + hpux11*) + if test yes,no = "$GCC,$with_gnu_ld"; then + case $host_cpu in + hppa*64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $wl+h $wl$soname -o $lib $libobjs $deplibs $compiler_flags' + ;; + ia64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $wl+h $wl$soname $wl+nodefaultrpath -o $lib $libobjs $deplibs $compiler_flags' + ;; + *) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $libobjs $deplibs $compiler_flags' + ;; + esac + else + case $host_cpu in + hppa*64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -b $wl+h $wl$soname -o $lib $libobjs $deplibs $compiler_flags' + ;; + ia64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -b $wl+h $wl$soname $wl+nodefaultrpath -o $lib $libobjs $deplibs $compiler_flags' + ;; + *) + m4_if($1, [], [ + # Older versions of the 11.00 compiler do not understand -b yet + # (HP92453-01 A.11.01.20 doesn't, HP92453-01 B.11.X.35175-35176.GP does) + _LT_LINKER_OPTION([if $CC understands -b], + _LT_TAGVAR(lt_cv_prog_compiler__b, $1), [-b], + [_LT_TAGVAR(archive_cmds, $1)='$CC -b $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $libobjs $deplibs $compiler_flags'], + [_LT_TAGVAR(archive_cmds, $1)='$LD -b +h $soname +b $install_libdir -o $lib $libobjs $deplibs $linker_flags'])], + [_LT_TAGVAR(archive_cmds, $1)='$CC -b $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $libobjs $deplibs $compiler_flags']) + ;; + esac + fi + if test no = "$with_gnu_ld"; then + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl+b $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + + case $host_cpu in + hppa*64*|ia64*) + _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + *) + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_direct_absolute, $1)=yes + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + + # hardcode_minus_L: Not really in the search PATH, + # but as the default location of the library. + _LT_TAGVAR(hardcode_minus_L, $1)=yes + ;; + esac + fi + ;; + + irix5* | irix6* | nonstopux*) + if test yes = "$GCC"; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + # Try to use the -exported_symbol ld option, if it does not + # work, assume that -exports_file does not work either and + # implicitly export all symbols. + # This should be the same for all languages, so no per-tag cache variable. 
+ AC_CACHE_CHECK([whether the $host_os linker accepts -exported_symbol], + [lt_cv_irix_exported_symbol], + [save_LDFLAGS=$LDFLAGS + LDFLAGS="$LDFLAGS -shared $wl-exported_symbol ${wl}foo $wl-update_registry $wl/dev/null" + AC_LINK_IFELSE( + [AC_LANG_SOURCE( + [AC_LANG_CASE([C], [[int foo (void) { return 0; }]], + [C++], [[int foo (void) { return 0; }]], + [Fortran 77], [[ + subroutine foo + end]], + [Fortran], [[ + subroutine foo + end]])])], + [lt_cv_irix_exported_symbol=yes], + [lt_cv_irix_exported_symbol=no]) + LDFLAGS=$save_LDFLAGS]) + if test yes = "$lt_cv_irix_exported_symbol"; then + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' + fi + _LT_TAGVAR(link_all_deplibs, $1)=no + else + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' + fi + _LT_TAGVAR(archive_cmds_need_lc, $1)='no' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + _LT_TAGVAR(inherit_rpath, $1)=yes + _LT_TAGVAR(link_all_deplibs, $1)=yes + ;; + + linux*) + case $cc_basename in + tcc*) + # Fabrice Bellard et al's Tiny C Compiler + _LT_TAGVAR(ld_shlibs, $1)=yes + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + ;; + esac + ;; + + netbsd* | netbsdelf*-gnu) + if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out + else + _LT_TAGVAR(archive_cmds, $1)='$LD -shared -o $lib $libobjs $deplibs $linker_flags' # ELF + fi + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-R$libdir' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + newsos6) + _LT_TAGVAR(archive_cmds, $1)='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + *nto* | *qnx*) + ;; + + openbsd* | bitrig*) + if test -f /usr/libexec/ld.so; then + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + _LT_TAGVAR(hardcode_direct_absolute, $1)=yes + if test -z "`echo __ELF__ | $CC -E - | $GREP __ELF__`"; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags $wl-retain-symbols-file,$export_symbols' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + else + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + fi + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + + os2*) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + 
_LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + shrext_cmds=.dll + _LT_TAGVAR(archive_cmds, $1)='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + emxexp $libobjs | $SED /"_DLL_InitTerm"/d >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + _LT_TAGVAR(archive_expsym_cmds, $1)='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + prefix_cmds="$SED"~ + if test EXPORTS = "`$SED 1q $export_symbols`"; then + prefix_cmds="$prefix_cmds -e 1d"; + fi~ + prefix_cmds="$prefix_cmds -e \"s/^\(.*\)$/_\1/g\""~ + cat $export_symbols | $prefix_cmds >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + _LT_TAGVAR(old_archive_From_new_cmds, $1)='emximp -o $output_objdir/${libname}_dll.a $output_objdir/$libname.def' + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + ;; + + osf3*) + if test yes = "$GCC"; then + _LT_TAGVAR(allow_undefined_flag, $1)=' $wl-expect_unresolved $wl\*' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared$allow_undefined_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + else + _LT_TAGVAR(allow_undefined_flag, $1)=' -expect_unresolved \*' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared$allow_undefined_flag $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + fi + _LT_TAGVAR(archive_cmds_need_lc, $1)='no' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + ;; + + osf4* | osf5*) # as osf3* with the addition of -msym flag + if test yes = "$GCC"; then + _LT_TAGVAR(allow_undefined_flag, $1)=' $wl-expect_unresolved $wl\*' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared$allow_undefined_flag $pic_flag $libobjs $deplibs $compiler_flags $wl-msym $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + else + _LT_TAGVAR(allow_undefined_flag, $1)=' -expect_unresolved \*' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared$allow_undefined_flag $libobjs $deplibs $compiler_flags -msym -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='for i in `cat $export_symbols`; do printf "%s %s\\n" -exported_symbol "\$i" >> $lib.exp; done; printf "%s\\n" "-hidden">> $lib.exp~ + $CC -shared$allow_undefined_flag $wl-input $wl$lib.exp $compiler_flags $libobjs $deplibs -soname $soname `test -n "$verstring" && $ECHO "-set_version $verstring"` -update_registry 
$output_objdir/so_locations -o $lib~$RM $lib.exp' + + # Both c and cxx compiler support -rpath directly + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-rpath $libdir' + fi + _LT_TAGVAR(archive_cmds_need_lc, $1)='no' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + ;; + + solaris*) + _LT_TAGVAR(no_undefined_flag, $1)=' -z defs' + if test yes = "$GCC"; then + wlarc='$wl' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $wl-z ${wl}text $wl-h $wl$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $CC -shared $pic_flag $wl-z ${wl}text $wl-M $wl$lib.exp $wl-h $wl$soname -o $lib $libobjs $deplibs $compiler_flags~$RM $lib.exp' + else + case `$CC -V 2>&1` in + *"Compilers 5.0"*) + wlarc='' + _LT_TAGVAR(archive_cmds, $1)='$LD -G$allow_undefined_flag -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $LD -G$allow_undefined_flag -M $lib.exp -h $soname -o $lib $libobjs $deplibs $linker_flags~$RM $lib.exp' + ;; + *) + wlarc='$wl' + _LT_TAGVAR(archive_cmds, $1)='$CC -G$allow_undefined_flag -h $soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $CC -G$allow_undefined_flag -M $lib.exp -h $soname -o $lib $libobjs $deplibs $compiler_flags~$RM $lib.exp' + ;; + esac + fi + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-R$libdir' + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + case $host_os in + solaris2.[[0-5]] | solaris2.[[0-5]].*) ;; + *) + # The compiler driver will combine and reorder linker options, + # but understands '-z linker_flag'. GCC discards it without '$wl', + # but is careful enough not to reorder. + # Supported since Solaris 2.6 (maybe 2.5.1?) + if test yes = "$GCC"; then + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl-z ${wl}allextract$convenience $wl-z ${wl}defaultextract' + else + _LT_TAGVAR(whole_archive_flag_spec, $1)='-z allextract$convenience -z defaultextract' + fi + ;; + esac + _LT_TAGVAR(link_all_deplibs, $1)=yes + ;; + + sunos4*) + if test sequent = "$host_vendor"; then + # Use $CC to link under sequent, because it throws in some extra .o + # files that make .init and .fini sections work. + _LT_TAGVAR(archive_cmds, $1)='$CC -G $wl-h $soname -o $lib $libobjs $deplibs $compiler_flags' + else + _LT_TAGVAR(archive_cmds, $1)='$LD -assert pure-text -Bstatic -o $lib $libobjs $deplibs $linker_flags' + fi + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + sysv4) + case $host_vendor in + sni) + _LT_TAGVAR(archive_cmds, $1)='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_direct, $1)=yes # is this really true??? + ;; + siemens) + ## LD is ld it makes a PLAMLIB + ## CC just makes a GrossModule. 
+ _LT_TAGVAR(archive_cmds, $1)='$LD -G -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(reload_cmds, $1)='$CC -r -o $output$reload_objs' + _LT_TAGVAR(hardcode_direct, $1)=no + ;; + motorola) + _LT_TAGVAR(archive_cmds, $1)='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_direct, $1)=no #Motorola manual says yes, but my tests say they lie + ;; + esac + runpath_var='LD_RUN_PATH' + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + sysv4.3*) + _LT_TAGVAR(archive_cmds, $1)='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + _LT_TAGVAR(export_dynamic_flag_spec, $1)='-Bexport' + ;; + + sysv4*MP*) + if test -d /usr/nec; then + _LT_TAGVAR(archive_cmds, $1)='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + runpath_var=LD_RUN_PATH + hardcode_runpath_var=yes + _LT_TAGVAR(ld_shlibs, $1)=yes + fi + ;; + + sysv4*uw2* | sysv5OpenUNIX* | sysv5UnixWare7.[[01]].[[10]]* | unixware7* | sco3.2v5.0.[[024]]*) + _LT_TAGVAR(no_undefined_flag, $1)='$wl-z,text' + _LT_TAGVAR(archive_cmds_need_lc, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + runpath_var='LD_RUN_PATH' + + if test yes = "$GCC"; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + else + _LT_TAGVAR(archive_cmds, $1)='$CC -G $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -G $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + fi + ;; + + sysv5* | sco3.2v5* | sco5v6*) + # Note: We CANNOT use -z defs as we might desire, because we do not + # link with -lc, and that would cause any symbols used from libc to + # always be unresolved, which means just about no library would + # ever link correctly. If we're not using GNU ld we use -z text + # though, which does catch some bad symbols but isn't as heavy-handed + # as -z defs. 
+ _LT_TAGVAR(no_undefined_flag, $1)='$wl-z,text' + _LT_TAGVAR(allow_undefined_flag, $1)='$wl-z,nodefs' + _LT_TAGVAR(archive_cmds_need_lc, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-R,$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=':' + _LT_TAGVAR(link_all_deplibs, $1)=yes + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-Bexport' + runpath_var='LD_RUN_PATH' + + if test yes = "$GCC"; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + else + _LT_TAGVAR(archive_cmds, $1)='$CC -G $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -G $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + fi + ;; + + uts4*) + _LT_TAGVAR(archive_cmds, $1)='$LD -G -h $soname -o $lib $libobjs $deplibs $linker_flags' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + + *) + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + esac + + if test sni = "$host_vendor"; then + case $host in + sysv4 | sysv4.2uw2* | sysv4.3* | sysv5*) + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-Blargedynsym' + ;; + esac + fi + fi +]) +AC_MSG_RESULT([$_LT_TAGVAR(ld_shlibs, $1)]) +test no = "$_LT_TAGVAR(ld_shlibs, $1)" && can_build_shared=no + +_LT_TAGVAR(with_gnu_ld, $1)=$with_gnu_ld + +_LT_DECL([], [libext], [0], [Old archive suffix (normally "a")])dnl +_LT_DECL([], [shrext_cmds], [1], [Shared library suffix (normally ".so")])dnl +_LT_DECL([], [extract_expsyms_cmds], [2], + [The commands to extract the exported symbol list from a shared archive]) + +# +# Do we need to explicitly link libc? +# +case "x$_LT_TAGVAR(archive_cmds_need_lc, $1)" in +x|xyes) + # Assume -lc should be added + _LT_TAGVAR(archive_cmds_need_lc, $1)=yes + + if test yes,yes = "$GCC,$enable_shared"; then + case $_LT_TAGVAR(archive_cmds, $1) in + *'~'*) + # FIXME: we may have to deal with multi-command sequences. + ;; + '$CC '*) + # Test whether the compiler implicitly links with -lc since on some + # systems, -lgcc has to come before -lc. If gcc already passes -lc + # to ld, don't add -lc before -lgcc. + AC_CACHE_CHECK([whether -lc should be explicitly linked in], + [lt_cv_]_LT_TAGVAR(archive_cmds_need_lc, $1), + [$RM conftest* + echo "$lt_simple_compile_test_code" > conftest.$ac_ext + + if AC_TRY_EVAL(ac_compile) 2>conftest.err; then + soname=conftest + lib=conftest + libobjs=conftest.$ac_objext + deplibs= + wl=$_LT_TAGVAR(lt_prog_compiler_wl, $1) + pic_flag=$_LT_TAGVAR(lt_prog_compiler_pic, $1) + compiler_flags=-v + linker_flags=-v + verstring= + output_objdir=. 
+ libname=conftest + lt_save_allow_undefined_flag=$_LT_TAGVAR(allow_undefined_flag, $1) + _LT_TAGVAR(allow_undefined_flag, $1)= + if AC_TRY_EVAL(_LT_TAGVAR(archive_cmds, $1) 2\>\&1 \| $GREP \" -lc \" \>/dev/null 2\>\&1) + then + lt_cv_[]_LT_TAGVAR(archive_cmds_need_lc, $1)=no + else + lt_cv_[]_LT_TAGVAR(archive_cmds_need_lc, $1)=yes + fi + _LT_TAGVAR(allow_undefined_flag, $1)=$lt_save_allow_undefined_flag + else + cat conftest.err 1>&5 + fi + $RM conftest* + ]) + _LT_TAGVAR(archive_cmds_need_lc, $1)=$lt_cv_[]_LT_TAGVAR(archive_cmds_need_lc, $1) + ;; + esac + fi + ;; +esac + +_LT_TAGDECL([build_libtool_need_lc], [archive_cmds_need_lc], [0], + [Whether or not to add -lc for building shared libraries]) +_LT_TAGDECL([allow_libtool_libs_with_static_runtimes], + [enable_shared_with_static_runtimes], [0], + [Whether or not to disallow shared libs when runtime libs are static]) +_LT_TAGDECL([], [export_dynamic_flag_spec], [1], + [Compiler flag to allow reflexive dlopens]) +_LT_TAGDECL([], [whole_archive_flag_spec], [1], + [Compiler flag to generate shared objects directly from archives]) +_LT_TAGDECL([], [compiler_needs_object], [1], + [Whether the compiler copes with passing no objects directly]) +_LT_TAGDECL([], [old_archive_from_new_cmds], [2], + [Create an old-style archive from a shared archive]) +_LT_TAGDECL([], [old_archive_from_expsyms_cmds], [2], + [Create a temporary old-style archive to link instead of a shared archive]) +_LT_TAGDECL([], [archive_cmds], [2], [Commands used to build a shared archive]) +_LT_TAGDECL([], [archive_expsym_cmds], [2]) +_LT_TAGDECL([], [module_cmds], [2], + [Commands used to build a loadable module if different from building + a shared archive.]) +_LT_TAGDECL([], [module_expsym_cmds], [2]) +_LT_TAGDECL([], [with_gnu_ld], [1], + [Whether we are building with GNU ld or not]) +_LT_TAGDECL([], [allow_undefined_flag], [1], + [Flag that allows shared libraries with undefined symbols to be built]) +_LT_TAGDECL([], [no_undefined_flag], [1], + [Flag that enforces no undefined symbols]) +_LT_TAGDECL([], [hardcode_libdir_flag_spec], [1], + [Flag to hardcode $libdir into a binary during linking. 
+ This must work even if $libdir does not exist]) +_LT_TAGDECL([], [hardcode_libdir_separator], [1], + [Whether we need a single "-rpath" flag with a separated argument]) +_LT_TAGDECL([], [hardcode_direct], [0], + [Set to "yes" if using DIR/libNAME$shared_ext during linking hardcodes + DIR into the resulting binary]) +_LT_TAGDECL([], [hardcode_direct_absolute], [0], + [Set to "yes" if using DIR/libNAME$shared_ext during linking hardcodes + DIR into the resulting binary and the resulting library dependency is + "absolute", i.e impossible to change by setting $shlibpath_var if the + library is relocated]) +_LT_TAGDECL([], [hardcode_minus_L], [0], + [Set to "yes" if using the -LDIR flag during linking hardcodes DIR + into the resulting binary]) +_LT_TAGDECL([], [hardcode_shlibpath_var], [0], + [Set to "yes" if using SHLIBPATH_VAR=DIR during linking hardcodes DIR + into the resulting binary]) +_LT_TAGDECL([], [hardcode_automatic], [0], + [Set to "yes" if building a shared library automatically hardcodes DIR + into the library and all subsequent libraries and executables linked + against it]) +_LT_TAGDECL([], [inherit_rpath], [0], + [Set to yes if linker adds runtime paths of dependent libraries + to runtime path list]) +_LT_TAGDECL([], [link_all_deplibs], [0], + [Whether libtool must link a program against all its dependency libraries]) +_LT_TAGDECL([], [always_export_symbols], [0], + [Set to "yes" if exported symbols are required]) +_LT_TAGDECL([], [export_symbols_cmds], [2], + [The commands to list exported symbols]) +_LT_TAGDECL([], [exclude_expsyms], [1], + [Symbols that should not be listed in the preloaded symbols]) +_LT_TAGDECL([], [include_expsyms], [1], + [Symbols that must always be exported]) +_LT_TAGDECL([], [prelink_cmds], [2], + [Commands necessary for linking programs (against libraries) with templates]) +_LT_TAGDECL([], [postlink_cmds], [2], + [Commands necessary for finishing linking programs]) +_LT_TAGDECL([], [file_list_spec], [1], + [Specify filename containing input files]) +dnl FIXME: Not yet implemented +dnl _LT_TAGDECL([], [thread_safe_flag_spec], [1], +dnl [Compiler flag to generate thread safe objects]) +])# _LT_LINKER_SHLIBS + + +# _LT_LANG_C_CONFIG([TAG]) +# ------------------------ +# Ensure that the configuration variables for a C compiler are suitably +# defined. These variables are subsequently used by _LT_CONFIG to write +# the compiler configuration to 'libtool'. +m4_defun([_LT_LANG_C_CONFIG], +[m4_require([_LT_DECL_EGREP])dnl +lt_save_CC=$CC +AC_LANG_PUSH(C) + +# Source file extension for C test sources. +ac_ext=c + +# Object file extension for compiled C test sources. +objext=o +_LT_TAGVAR(objext, $1)=$objext + +# Code to be used in simple compile tests +lt_simple_compile_test_code="int some_variable = 0;" + +# Code to be used in simple link tests +lt_simple_link_test_code='int main(){return(0);}' + +_LT_TAG_COMPILER +# Save the default compiler, since it gets overwritten when the other +# tags are being tested, and _LT_TAGVAR(compiler, []) is a NOP. +compiler_DEFAULT=$CC + +# save warnings/boilerplate of simple test code +_LT_COMPILER_BOILERPLATE +_LT_LINKER_BOILERPLATE + +## CAVEAT EMPTOR: +## There is no encapsulation within the following macros, do not change +## the running order or otherwise move them around unless you know exactly +## what you are doing... 
+if test -n "$compiler"; then + _LT_COMPILER_NO_RTTI($1) + _LT_COMPILER_PIC($1) + _LT_COMPILER_C_O($1) + _LT_COMPILER_FILE_LOCKS($1) + _LT_LINKER_SHLIBS($1) + _LT_SYS_DYNAMIC_LINKER($1) + _LT_LINKER_HARDCODE_LIBPATH($1) + LT_SYS_DLOPEN_SELF + _LT_CMD_STRIPLIB + + # Report what library types will actually be built + AC_MSG_CHECKING([if libtool supports shared libraries]) + AC_MSG_RESULT([$can_build_shared]) + + AC_MSG_CHECKING([whether to build shared libraries]) + test no = "$can_build_shared" && enable_shared=no + + # On AIX, shared libraries and static libraries use the same namespace, and + # are all built from PIC. + case $host_os in + aix3*) + test yes = "$enable_shared" && enable_static=no + if test -n "$RANLIB"; then + archive_cmds="$archive_cmds~\$RANLIB \$lib" + postinstall_cmds='$RANLIB $lib' + fi + ;; + + aix[[4-9]]*) + if test ia64 != "$host_cpu"; then + case $enable_shared,$with_aix_soname,$aix_use_runtimelinking in + yes,aix,yes) ;; # shared object as lib.so file only + yes,svr4,*) ;; # shared object as lib.so archive member only + yes,*) enable_static=no ;; # shared object in lib.a archive as well + esac + fi + ;; + esac + AC_MSG_RESULT([$enable_shared]) + + AC_MSG_CHECKING([whether to build static libraries]) + # Make sure either enable_shared or enable_static is yes. + test yes = "$enable_shared" || enable_static=yes + AC_MSG_RESULT([$enable_static]) + + _LT_CONFIG($1) +fi +AC_LANG_POP +CC=$lt_save_CC +])# _LT_LANG_C_CONFIG + + +# _LT_LANG_CXX_CONFIG([TAG]) +# -------------------------- +# Ensure that the configuration variables for a C++ compiler are suitably +# defined. These variables are subsequently used by _LT_CONFIG to write +# the compiler configuration to 'libtool'. +m4_defun([_LT_LANG_CXX_CONFIG], +[m4_require([_LT_FILEUTILS_DEFAULTS])dnl +m4_require([_LT_DECL_EGREP])dnl +m4_require([_LT_PATH_MANIFEST_TOOL])dnl +if test -n "$CXX" && ( test no != "$CXX" && + ( (test g++ = "$CXX" && `g++ -v >/dev/null 2>&1` ) || + (test g++ != "$CXX"))); then + AC_PROG_CXXCPP +else + _lt_caught_CXX_error=yes +fi + +AC_LANG_PUSH(C++) +_LT_TAGVAR(archive_cmds_need_lc, $1)=no +_LT_TAGVAR(allow_undefined_flag, $1)= +_LT_TAGVAR(always_export_symbols, $1)=no +_LT_TAGVAR(archive_expsym_cmds, $1)= +_LT_TAGVAR(compiler_needs_object, $1)=no +_LT_TAGVAR(export_dynamic_flag_spec, $1)= +_LT_TAGVAR(hardcode_direct, $1)=no +_LT_TAGVAR(hardcode_direct_absolute, $1)=no +_LT_TAGVAR(hardcode_libdir_flag_spec, $1)= +_LT_TAGVAR(hardcode_libdir_separator, $1)= +_LT_TAGVAR(hardcode_minus_L, $1)=no +_LT_TAGVAR(hardcode_shlibpath_var, $1)=unsupported +_LT_TAGVAR(hardcode_automatic, $1)=no +_LT_TAGVAR(inherit_rpath, $1)=no +_LT_TAGVAR(module_cmds, $1)= +_LT_TAGVAR(module_expsym_cmds, $1)= +_LT_TAGVAR(link_all_deplibs, $1)=unknown +_LT_TAGVAR(old_archive_cmds, $1)=$old_archive_cmds +_LT_TAGVAR(reload_flag, $1)=$reload_flag +_LT_TAGVAR(reload_cmds, $1)=$reload_cmds +_LT_TAGVAR(no_undefined_flag, $1)= +_LT_TAGVAR(whole_archive_flag_spec, $1)= +_LT_TAGVAR(enable_shared_with_static_runtimes, $1)=no + +# Source file extension for C++ test sources. +ac_ext=cpp + +# Object file extension for compiled C++ test sources. +objext=o +_LT_TAGVAR(objext, $1)=$objext + +# No sense in running all these tests if we already determined that +# the CXX compiler isn't working. Some variables (like enable_shared) +# are currently assumed to apply to all compilers on this platform, +# and will be corrupted by setting them based on a non-working compiler. 
+if test yes != "$_lt_caught_CXX_error"; then + # Code to be used in simple compile tests + lt_simple_compile_test_code="int some_variable = 0;" + + # Code to be used in simple link tests + lt_simple_link_test_code='int main(int, char *[[]]) { return(0); }' + + # ltmain only uses $CC for tagged configurations so make sure $CC is set. + _LT_TAG_COMPILER + + # save warnings/boilerplate of simple test code + _LT_COMPILER_BOILERPLATE + _LT_LINKER_BOILERPLATE + + # Allow CC to be a program name with arguments. + lt_save_CC=$CC + lt_save_CFLAGS=$CFLAGS + lt_save_LD=$LD + lt_save_GCC=$GCC + GCC=$GXX + lt_save_with_gnu_ld=$with_gnu_ld + lt_save_path_LD=$lt_cv_path_LD + if test -n "${lt_cv_prog_gnu_ldcxx+set}"; then + lt_cv_prog_gnu_ld=$lt_cv_prog_gnu_ldcxx + else + $as_unset lt_cv_prog_gnu_ld + fi + if test -n "${lt_cv_path_LDCXX+set}"; then + lt_cv_path_LD=$lt_cv_path_LDCXX + else + $as_unset lt_cv_path_LD + fi + test -z "${LDCXX+set}" || LD=$LDCXX + CC=${CXX-"c++"} + CFLAGS=$CXXFLAGS + compiler=$CC + _LT_TAGVAR(compiler, $1)=$CC + _LT_CC_BASENAME([$compiler]) + + if test -n "$compiler"; then + # We don't want -fno-exception when compiling C++ code, so set the + # no_builtin_flag separately + if test yes = "$GXX"; then + _LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1)=' -fno-builtin' + else + _LT_TAGVAR(lt_prog_compiler_no_builtin_flag, $1)= + fi + + if test yes = "$GXX"; then + # Set up default GNU C++ configuration + + LT_PATH_LD + + # Check if GNU C++ uses GNU ld as the underlying linker, since the + # archiving commands below assume that GNU ld is being used. + if test yes = "$with_gnu_ld"; then + _LT_TAGVAR(archive_cmds, $1)='$CC $pic_flag -shared -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC $pic_flag -shared -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-dynamic' + + # If archive_cmds runs LD, not CC, wlarc should be empty + # XXX I think wlarc can be eliminated in ltcf-cxx, but I need to + # investigate it a little bit more. (MM) + wlarc='$wl' + + # ancient GNU ld didn't support --whole-archive et. al. + if eval "`$CC -print-prog-name=ld` --help 2>&1" | + $GREP 'no-whole-archive' > /dev/null; then + _LT_TAGVAR(whole_archive_flag_spec, $1)=$wlarc'--whole-archive$convenience '$wlarc'--no-whole-archive' + else + _LT_TAGVAR(whole_archive_flag_spec, $1)= + fi + else + with_gnu_ld=no + wlarc= + + # A generic and very simple default shared library creation + # command for GNU C++ for the case where it uses the native + # linker, instead of GNU ld. If possible, this setting should + # overridden to take advantage of the native linker features on + # the platform it is being used on. + _LT_TAGVAR(archive_cmds, $1)='$CC -shared -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags -o $lib' + fi + + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. 
+ output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + + else + GXX=no + with_gnu_ld=no + wlarc= + fi + + # PORTME: fill in a description of your system's C++ link characteristics + AC_MSG_CHECKING([whether the $compiler linker ($LD) supports shared libraries]) + _LT_TAGVAR(ld_shlibs, $1)=yes + case $host_os in + aix3*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + aix[[4-9]]*) + if test ia64 = "$host_cpu"; then + # On IA64, the linker does run time linking by default, so we don't + # have to do anything special. + aix_use_runtimelinking=no + exp_sym_flag='-Bexport' + no_entry_flag= + else + aix_use_runtimelinking=no + + # Test if we are trying to use run time linking or normal + # AIX style linking. If -brtl is somewhere in LDFLAGS, we + # have runtime linking enabled, and use it for executables. + # For shared libraries, we enable/disable runtime linking + # depending on the kind of the shared library created - + # when "with_aix_soname,aix_use_runtimelinking" is: + # "aix,no" lib.a(lib.so.V) shared, rtl:no, for executables + # "aix,yes" lib.so shared, rtl:yes, for executables + # lib.a static archive + # "both,no" lib.so.V(shr.o) shared, rtl:yes + # lib.a(lib.so.V) shared, rtl:no, for executables + # "both,yes" lib.so.V(shr.o) shared, rtl:yes, for executables + # lib.a(lib.so.V) shared, rtl:no + # "svr4,*" lib.so.V(shr.o) shared, rtl:yes, for executables + # lib.a static archive + case $host_os in aix4.[[23]]|aix4.[[23]].*|aix[[5-9]]*) + for ld_flag in $LDFLAGS; do + case $ld_flag in + *-brtl*) + aix_use_runtimelinking=yes + break + ;; + esac + done + if test svr4,no = "$with_aix_soname,$aix_use_runtimelinking"; then + # With aix-soname=svr4, we create the lib.so.V shared archives only, + # so we don't have lib.a shared libs to link our executables. + # We have to force runtime linking in this case. + aix_use_runtimelinking=yes + LDFLAGS="$LDFLAGS -Wl,-brtl" + fi + ;; + esac + + exp_sym_flag='-bexport' + no_entry_flag='-bnoentry' + fi + + # When large executables or shared objects are built, AIX ld can + # have problems creating the table of contents. If linking a library + # or program results in "error TOC overflow" add -mminimal-toc to + # CXXFLAGS/CFLAGS for g++/gcc. In the cases where that is not + # enough to fix the problem, add -Wl,-bbigtoc to LDFLAGS. + + _LT_TAGVAR(archive_cmds, $1)='' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_direct_absolute, $1)=yes + _LT_TAGVAR(hardcode_libdir_separator, $1)=':' + _LT_TAGVAR(link_all_deplibs, $1)=yes + _LT_TAGVAR(file_list_spec, $1)='$wl-f,' + case $with_aix_soname,$aix_use_runtimelinking in + aix,*) ;; # no import file + svr4,* | *,yes) # use import file + # The Import File defines what to hardcode. + _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_direct_absolute, $1)=no + ;; + esac + + if test yes = "$GXX"; then + case $host_os in aix4.[[012]]|aix4.[[012]].*) + # We only want to do this on AIX 4.2 and lower, the check + # below for broken collect2 doesn't work under 4.3+ + collect2name=`$CC -print-prog-name=collect2` + if test -f "$collect2name" && + strings "$collect2name" | $GREP resolve_lib_name >/dev/null + then + # We have reworked collect2 + : + else + # We have old collect2 + _LT_TAGVAR(hardcode_direct, $1)=unsupported + # It fails to find uninstalled libraries when the uninstalled + # path is not listed in the libpath. 
Setting hardcode_minus_L + # to unsupported forces relinking + _LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)= + fi + esac + shared_flag='-shared' + if test yes = "$aix_use_runtimelinking"; then + shared_flag=$shared_flag' $wl-G' + fi + # Need to ensure runtime linking is disabled for the traditional + # shared library, or the linker may eventually find shared libraries + # /with/ Import File - we do not want to mix them. + shared_flag_aix='-shared' + shared_flag_svr4='-shared $wl-G' + else + # not using gcc + if test ia64 = "$host_cpu"; then + # VisualAge C++, Version 5.5 for AIX 5L for IA-64, Beta 3 Release + # chokes on -Wl,-G. The following line is correct: + shared_flag='-G' + else + if test yes = "$aix_use_runtimelinking"; then + shared_flag='$wl-G' + else + shared_flag='$wl-bM:SRE' + fi + shared_flag_aix='$wl-bM:SRE' + shared_flag_svr4='$wl-G' + fi + fi + + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-bexpall' + # It seems that -bexpall does not export symbols beginning with + # underscore (_), so it is better to generate a list of symbols to + # export. + _LT_TAGVAR(always_export_symbols, $1)=yes + if test aix,yes = "$with_aix_soname,$aix_use_runtimelinking"; then + # Warning - without using the other runtime loading flags (-brtl), + # -berok will link without error, but may produce a broken library. + # The "-G" linker flag allows undefined symbols. + _LT_TAGVAR(no_undefined_flag, $1)='-bernotok' + # Determine the default libpath from the value encoded in an empty + # executable. + _LT_SYS_MODULE_PATH_AIX([$1]) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-blibpath:$libdir:'"$aix_libpath" + + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -o $output_objdir/$soname $libobjs $deplibs $wl'$no_entry_flag' $compiler_flags `if test -n "$allow_undefined_flag"; then func_echo_all "$wl$allow_undefined_flag"; else :; fi` $wl'$exp_sym_flag:\$export_symbols' '$shared_flag + else + if test ia64 = "$host_cpu"; then + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-R $libdir:/usr/lib:/lib' + _LT_TAGVAR(allow_undefined_flag, $1)="-z nodefs" + _LT_TAGVAR(archive_expsym_cmds, $1)="\$CC $shared_flag"' -o $output_objdir/$soname $libobjs $deplibs '"\$wl$no_entry_flag"' $compiler_flags $wl$allow_undefined_flag '"\$wl$exp_sym_flag:\$export_symbols" + else + # Determine the default libpath from the value encoded in an + # empty executable. + _LT_SYS_MODULE_PATH_AIX([$1]) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-blibpath:$libdir:'"$aix_libpath" + # Warning - without using the other run time loading flags, + # -berok will link without error, but may produce a broken library. + _LT_TAGVAR(no_undefined_flag, $1)=' $wl-bernotok' + _LT_TAGVAR(allow_undefined_flag, $1)=' $wl-berok' + if test yes = "$with_gnu_ld"; then + # We only use this code for GNU lds that support --whole-archive. 
+ _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive$convenience $wl--no-whole-archive' + else + # Exported symbols can be pulled into shared objects from archives + _LT_TAGVAR(whole_archive_flag_spec, $1)='$convenience' + fi + _LT_TAGVAR(archive_cmds_need_lc, $1)=yes + _LT_TAGVAR(archive_expsym_cmds, $1)='$RM -r $output_objdir/$realname.d~$MKDIR $output_objdir/$realname.d' + # -brtl affects multiple linker settings, -berok does not and is overridden later + compiler_flags_filtered='`func_echo_all "$compiler_flags " | $SED -e "s%-brtl\\([[, ]]\\)%-berok\\1%g"`' + if test svr4 != "$with_aix_soname"; then + # This is similar to how AIX traditionally builds its shared + # libraries. Need -bnortl late, we may have -brtl in LDFLAGS. + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$CC '$shared_flag_aix' -o $output_objdir/$realname.d/$soname $libobjs $deplibs $wl-bnoentry '$compiler_flags_filtered'$wl-bE:$export_symbols$allow_undefined_flag~$AR $AR_FLAGS $output_objdir/$libname$release.a $output_objdir/$realname.d/$soname' + fi + if test aix != "$with_aix_soname"; then + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$CC '$shared_flag_svr4' -o $output_objdir/$realname.d/$shared_archive_member_spec.o $libobjs $deplibs $wl-bnoentry '$compiler_flags_filtered'$wl-bE:$export_symbols$allow_undefined_flag~$STRIP -e $output_objdir/$realname.d/$shared_archive_member_spec.o~( func_echo_all "#! $soname($shared_archive_member_spec.o)"; if test shr_64 = "$shared_archive_member_spec"; then func_echo_all "# 64"; else func_echo_all "# 32"; fi; cat $export_symbols ) > $output_objdir/$realname.d/$shared_archive_member_spec.imp~$AR $AR_FLAGS $output_objdir/$soname $output_objdir/$realname.d/$shared_archive_member_spec.o $output_objdir/$realname.d/$shared_archive_member_spec.imp' + else + # used by -dlpreopen to get the symbols + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$MV $output_objdir/$realname.d/$soname $output_objdir' + fi + _LT_TAGVAR(archive_expsym_cmds, $1)="$_LT_TAGVAR(archive_expsym_cmds, $1)"'~$RM -r $output_objdir/$realname.d' + fi + fi + ;; + + beos*) + if $LD --help 2>&1 | $GREP ': supported targets:.* elf' > /dev/null; then + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + # Joseph Beckenbach says some releases of gcc + # support --undefined. This deserves some investigation. FIXME + _LT_TAGVAR(archive_cmds, $1)='$CC -nostart $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + + chorus*) + case $cc_basename in + *) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + esac + ;; + + cygwin* | mingw* | pw32* | cegcc*) + case $GXX,$cc_basename in + ,cl* | no,cl*) + # Native MSVC + # hardcode_libdir_flag_spec is actually meaningless, as there is + # no search path for DLLs. + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)=' ' + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + _LT_TAGVAR(always_export_symbols, $1)=yes + _LT_TAGVAR(file_list_spec, $1)='@' + # Tell ltmain to make .lib files, not .a files. + libext=lib + # Tell ltmain to make .dll files, not .so files. + shrext_cmds=.dll + # FIXME: Setting linknames here is a bad hack. 
+ _LT_TAGVAR(archive_cmds, $1)='$CC -o $output_objdir/$soname $libobjs $compiler_flags $deplibs -Wl,-DLL,-IMPLIB:"$tool_output_objdir$libname.dll.lib"~linknames=' + _LT_TAGVAR(archive_expsym_cmds, $1)='if _LT_DLL_DEF_P([$export_symbols]); then + cp "$export_symbols" "$output_objdir/$soname.def"; + echo "$tool_output_objdir$soname.def" > "$output_objdir/$soname.exp"; + else + $SED -e '\''s/^/-link -EXPORT:/'\'' < $export_symbols > $output_objdir/$soname.exp; + fi~ + $CC -o $tool_output_objdir$soname $libobjs $compiler_flags $deplibs "@$tool_output_objdir$soname.exp" -Wl,-DLL,-IMPLIB:"$tool_output_objdir$libname.dll.lib"~ + linknames=' + # The linker will not automatically build a static lib if we build a DLL. + # _LT_TAGVAR(old_archive_from_new_cmds, $1)='true' + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + # Don't use ranlib + _LT_TAGVAR(old_postinstall_cmds, $1)='chmod 644 $oldlib' + _LT_TAGVAR(postlink_cmds, $1)='lt_outputfile="@OUTPUT@"~ + lt_tool_outputfile="@TOOL_OUTPUT@"~ + case $lt_outputfile in + *.exe|*.EXE) ;; + *) + lt_outputfile=$lt_outputfile.exe + lt_tool_outputfile=$lt_tool_outputfile.exe + ;; + esac~ + func_to_tool_file "$lt_outputfile"~ + if test : != "$MANIFEST_TOOL" && test -f "$lt_outputfile.manifest"; then + $MANIFEST_TOOL -manifest "$lt_tool_outputfile.manifest" -outputresource:"$lt_tool_outputfile" || exit 1; + $RM "$lt_outputfile.manifest"; + fi' + ;; + *) + # g++ + # _LT_TAGVAR(hardcode_libdir_flag_spec, $1) is actually meaningless, + # as there is no search path for DLLs. + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-all-symbols' + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + _LT_TAGVAR(always_export_symbols, $1)=no + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + + if $LD --help 2>&1 | $GREP 'auto-import' > /dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags -o $output_objdir/$soname $wl--enable-auto-image-base -Xlinker --out-implib -Xlinker $lib' + # If the export-symbols file already is a .def file, use it as + # is; otherwise, prepend EXPORTS... 
+ _LT_TAGVAR(archive_expsym_cmds, $1)='if _LT_DLL_DEF_P([$export_symbols]); then + cp $export_symbols $output_objdir/$soname.def; + else + echo EXPORTS > $output_objdir/$soname.def; + cat $export_symbols >> $output_objdir/$soname.def; + fi~ + $CC -shared -nostdlib $output_objdir/$soname.def $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags -o $output_objdir/$soname $wl--enable-auto-image-base -Xlinker --out-implib -Xlinker $lib' + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + esac + ;; + darwin* | rhapsody*) + _LT_DARWIN_LINKER_FEATURES($1) + ;; + + os2*) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-L$libdir' + _LT_TAGVAR(hardcode_minus_L, $1)=yes + _LT_TAGVAR(allow_undefined_flag, $1)=unsupported + shrext_cmds=.dll + _LT_TAGVAR(archive_cmds, $1)='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + emxexp $libobjs | $SED /"_DLL_InitTerm"/d >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + _LT_TAGVAR(archive_expsym_cmds, $1)='$ECHO "LIBRARY ${soname%$shared_ext} INITINSTANCE TERMINSTANCE" > $output_objdir/$libname.def~ + $ECHO "DESCRIPTION \"$libname\"" >> $output_objdir/$libname.def~ + $ECHO "DATA MULTIPLE NONSHARED" >> $output_objdir/$libname.def~ + $ECHO EXPORTS >> $output_objdir/$libname.def~ + prefix_cmds="$SED"~ + if test EXPORTS = "`$SED 1q $export_symbols`"; then + prefix_cmds="$prefix_cmds -e 1d"; + fi~ + prefix_cmds="$prefix_cmds -e \"s/^\(.*\)$/_\1/g\""~ + cat $export_symbols | $prefix_cmds >> $output_objdir/$libname.def~ + $CC -Zdll -Zcrtdll -o $output_objdir/$soname $libobjs $deplibs $compiler_flags $output_objdir/$libname.def~ + emximp -o $lib $output_objdir/$libname.def' + _LT_TAGVAR(old_archive_From_new_cmds, $1)='emximp -o $output_objdir/${libname}_dll.a $output_objdir/$libname.def' + _LT_TAGVAR(enable_shared_with_static_runtimes, $1)=yes + ;; + + dgux*) + case $cc_basename in + ec++*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + ghcx*) + # Green Hills C++ Compiler + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + *) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + esac + ;; + + freebsd2.*) + # C++ shared libraries reported to be fairly broken before + # switch to ELF + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + + freebsd-elf*) + _LT_TAGVAR(archive_cmds_need_lc, $1)=no + ;; + + freebsd* | dragonfly*) + # FreeBSD 3 and later use GNU C++ and GNU ld with standard ELF + # conventions + _LT_TAGVAR(ld_shlibs, $1)=yes + ;; + + haiku*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(link_all_deplibs, $1)=yes + ;; + + hpux9*) + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl+b $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_minus_L, $1)=yes # Not in the search PATH, + # but as the default + # location of the library. 
+ + case $cc_basename in + CC*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + aCC*) + _LT_TAGVAR(archive_cmds, $1)='$RM $output_objdir/$soname~$CC -b $wl+b $wl$install_libdir -o $output_objdir/$soname $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags~test "x$output_objdir/$soname" = "x$lib" || mv $output_objdir/$soname $lib' + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. + # + # There doesn't appear to be a way to prevent this compiler from + # explicitly linking system object files so we need to strip them + # from the output so that they don't get included in the library + # dependencies. + output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $EGREP "\-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + ;; + *) + if test yes = "$GXX"; then + _LT_TAGVAR(archive_cmds, $1)='$RM $output_objdir/$soname~$CC -shared -nostdlib $pic_flag $wl+b $wl$install_libdir -o $output_objdir/$soname $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags~test "x$output_objdir/$soname" = "x$lib" || mv $output_objdir/$soname $lib' + else + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + esac + ;; + + hpux10*|hpux11*) + if test no = "$with_gnu_ld"; then + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl+b $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + + case $host_cpu in + hppa*64*|ia64*) + ;; + *) + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + ;; + esac + fi + case $host_cpu in + hppa*64*|ia64*) + _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + ;; + *) + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_direct_absolute, $1)=yes + _LT_TAGVAR(hardcode_minus_L, $1)=yes # Not in the search PATH, + # but as the default + # location of the library. + ;; + esac + + case $cc_basename in + CC*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + aCC*) + case $host_cpu in + hppa*64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -b $wl+h $wl$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + ;; + ia64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -b $wl+h $wl$soname $wl+nodefaultrpath -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + ;; + *) + _LT_TAGVAR(archive_cmds, $1)='$CC -b $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + ;; + esac + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. + # + # There doesn't appear to be a way to prevent this compiler from + # explicitly linking system object files so we need to strip them + # from the output so that they don't get included in the library + # dependencies. 
+ output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $GREP "\-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + ;; + *) + if test yes = "$GXX"; then + if test no = "$with_gnu_ld"; then + case $host_cpu in + hppa*64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared -nostdlib -fPIC $wl+h $wl$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + ;; + ia64*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared -nostdlib $pic_flag $wl+h $wl$soname $wl+nodefaultrpath -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + ;; + *) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared -nostdlib $pic_flag $wl+h $wl$soname $wl+b $wl$install_libdir -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + ;; + esac + fi + else + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + esac + ;; + + interix[[3-9]]*) + _LT_TAGVAR(hardcode_direct, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + # Hack: On Interix 3.x, we cannot compile PIC because of a broken gcc. + # Instead, shared libraries are loaded at an image base (0x10000000 by + # default) and relocated if they conflict, which is a slow very memory + # consuming and fragmenting process. To avoid this, we pick a random, + # 256 KiB-aligned image base between 0x50000000 and 0x6FFC0000 at link + # time. Moving up from 0x10000000 also allows more sbrk(2) space. + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-h,$soname $wl--image-base,`expr ${RANDOM-$$} % 4096 / 2 \* 262144 + 1342177280` -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='sed "s|^|_|" $export_symbols >$output_objdir/$soname.expsym~$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-h,$soname $wl--retain-symbols-file,$output_objdir/$soname.expsym $wl--image-base,`expr ${RANDOM-$$} % 4096 / 2 \* 262144 + 1342177280` -o $lib' + ;; + irix5* | irix6*) + case $cc_basename in + CC*) + # SGI C++ + _LT_TAGVAR(archive_cmds, $1)='$CC -shared -all -multigot $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + + # Archives containing C++ object files must be created using + # "CC -ar", where "CC" is the IRIX C++ compiler. This is + # necessary to make sure instantiated templates are included + # in the archive. 
+ _LT_TAGVAR(old_archive_cmds, $1)='$CC -ar -WR,-u -o $oldlib $oldobjs' + ;; + *) + if test yes = "$GXX"; then + if test no = "$with_gnu_ld"; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + else + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` -o $lib' + fi + fi + _LT_TAGVAR(link_all_deplibs, $1)=yes + ;; + esac + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + _LT_TAGVAR(inherit_rpath, $1)=yes + ;; + + linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) + case $cc_basename in + KCC*) + # Kuck and Associates, Inc. (KAI) C++ Compiler + + # KCC will only create a shared library if the output file + # ends with ".so" (or ".sl" for HP-UX), so rename the library + # to its proper name (with version) after linking. + _LT_TAGVAR(archive_cmds, $1)='tempext=`echo $shared_ext | $SED -e '\''s/\([[^()0-9A-Za-z{}]]\)/\\\\\1/g'\''`; templib=`echo $lib | $SED -e "s/\$tempext\..*/.so/"`; $CC $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags --soname $soname -o \$templib; mv \$templib $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='tempext=`echo $shared_ext | $SED -e '\''s/\([[^()0-9A-Za-z{}]]\)/\\\\\1/g'\''`; templib=`echo $lib | $SED -e "s/\$tempext\..*/.so/"`; $CC $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags --soname $soname -o \$templib $wl-retain-symbols-file,$export_symbols; mv \$templib $lib' + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. + # + # There doesn't appear to be a way to prevent this compiler from + # explicitly linking system object files so we need to strip them + # from the output so that they don't get included in the library + # dependencies. + output_verbose_link_cmd='templist=`$CC $CFLAGS -v conftest.$objext -o libconftest$shared_ext 2>&1 | $GREP "ld"`; rm -f libconftest$shared_ext; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-dynamic' + + # Archives containing C++ object files must be created using + # "CC -Bstatic", where "CC" is the KAI C++ compiler. + _LT_TAGVAR(old_archive_cmds, $1)='$CC -Bstatic -o $oldlib $oldobjs' + ;; + icpc* | ecpc* ) + # Intel C++ + with_gnu_ld=yes + # version 8.0 and above of icpc choke on multiply defined symbols + # if we add $predep_objects and $postdep_objects, however 7.1 and + # earlier do not add the objects themselves. 
+ case `$CC -V 2>&1` in + *"Version 7."*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + ;; + *) # Version 8.0 or newer + tmp_idyn= + case $host_cpu in + ia64*) tmp_idyn=' -i_dynamic';; + esac + _LT_TAGVAR(archive_cmds, $1)='$CC -shared'"$tmp_idyn"' $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared'"$tmp_idyn"' $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + ;; + esac + _LT_TAGVAR(archive_cmds_need_lc, $1)=no + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-dynamic' + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive$convenience $wl--no-whole-archive' + ;; + pgCC* | pgcpp*) + # Portland Group C++ compiler + case `$CC -V` in + *pgCC\ [[1-5]].* | *pgcpp\ [[1-5]].*) + _LT_TAGVAR(prelink_cmds, $1)='tpldir=Template.dir~ + rm -rf $tpldir~ + $CC --prelink_objects --instantiation_dir $tpldir $objs $libobjs $compile_deplibs~ + compile_command="$compile_command `find $tpldir -name \*.o | sort | $NL2SP`"' + _LT_TAGVAR(old_archive_cmds, $1)='tpldir=Template.dir~ + rm -rf $tpldir~ + $CC --prelink_objects --instantiation_dir $tpldir $oldobjs$old_deplibs~ + $AR $AR_FLAGS $oldlib$oldobjs$old_deplibs `find $tpldir -name \*.o | sort | $NL2SP`~ + $RANLIB $oldlib' + _LT_TAGVAR(archive_cmds, $1)='tpldir=Template.dir~ + rm -rf $tpldir~ + $CC --prelink_objects --instantiation_dir $tpldir $predep_objects $libobjs $deplibs $convenience $postdep_objects~ + $CC -shared $pic_flag $predep_objects $libobjs $deplibs `find $tpldir -name \*.o | sort | $NL2SP` $postdep_objects $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='tpldir=Template.dir~ + rm -rf $tpldir~ + $CC --prelink_objects --instantiation_dir $tpldir $predep_objects $libobjs $deplibs $convenience $postdep_objects~ + $CC -shared $pic_flag $predep_objects $libobjs $deplibs `find $tpldir -name \*.o | sort | $NL2SP` $postdep_objects $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + ;; + *) # Version 6 and above use weak symbols + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname $wl-retain-symbols-file $wl$export_symbols -o $lib' + ;; + esac + + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl--rpath $wl$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-dynamic' + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive`for conv in $convenience\"\"; do test -n \"$conv\" && new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + ;; + cxx*) + # Compaq C++ + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname -o $lib $wl-retain-symbols-file $wl$export_symbols' + 
+ runpath_var=LD_RUN_PATH + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-rpath $libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. + # + # There doesn't appear to be a way to prevent this compiler from + # explicitly linking system object files so we need to strip them + # from the output so that they don't get included in the library + # dependencies. + output_verbose_link_cmd='templist=`$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP "ld"`; templist=`func_echo_all "$templist" | $SED "s/\(^.*ld.*\)\( .*ld .*$\)/\1/"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "X$list" | $Xsed' + ;; + xl* | mpixl* | bgxl*) + # IBM XL 8.0 on PPC, with GNU ld + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl--export-dynamic' + _LT_TAGVAR(archive_cmds, $1)='$CC -qmkshrobj $libobjs $deplibs $compiler_flags $wl-soname $wl$soname -o $lib' + if test yes = "$supports_anon_versioning"; then + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $output_objdir/$libname.ver~ + cat $export_symbols | sed -e "s/\(.*\)/\1;/" >> $output_objdir/$libname.ver~ + echo "local: *; };" >> $output_objdir/$libname.ver~ + $CC -qmkshrobj $libobjs $deplibs $compiler_flags $wl-soname $wl$soname $wl-version-script $wl$output_objdir/$libname.ver -o $lib' + fi + ;; + *) + case `$CC -V 2>&1 | sed 5q` in + *Sun\ C*) + # Sun C++ 5.9 + _LT_TAGVAR(no_undefined_flag, $1)=' -zdefs' + _LT_TAGVAR(archive_cmds, $1)='$CC -G$allow_undefined_flag -h$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -G$allow_undefined_flag -h$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-retain-symbols-file $wl$export_symbols' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-R$libdir' + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl--whole-archive`new_convenience=; for conv in $convenience\"\"; do test -z \"$conv\" || new_convenience=\"$new_convenience,$conv\"; done; func_echo_all \"$new_convenience\"` $wl--no-whole-archive' + _LT_TAGVAR(compiler_needs_object, $1)=yes + + # Not sure whether something based on + # $CC $CFLAGS -v conftest.$objext -o libconftest$shared_ext 2>&1 + # would be better. + output_verbose_link_cmd='func_echo_all' + + # Archives containing C++ object files must be created using + # "CC -xar", where "CC" is the Sun C++ compiler. This is + # necessary to make sure instantiated templates are included + # in the archive. 
+ _LT_TAGVAR(old_archive_cmds, $1)='$CC -xar -o $oldlib $oldobjs' + ;; + esac + ;; + esac + ;; + + lynxos*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + + m88k*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + + mvs*) + case $cc_basename in + cxx*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + *) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + esac + ;; + + netbsd*) + if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable -o $lib $predep_objects $libobjs $deplibs $postdep_objects $linker_flags' + wlarc= + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-R$libdir' + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + fi + # Workaround some broken pre-1.5 toolchains + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP conftest.$objext | $SED -e "s:-lgcc -lc -lgcc::"' + ;; + + *nto* | *qnx*) + _LT_TAGVAR(ld_shlibs, $1)=yes + ;; + + openbsd* | bitrig*) + if test -f /usr/libexec/ld.so; then + _LT_TAGVAR(hardcode_direct, $1)=yes + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + _LT_TAGVAR(hardcode_direct_absolute, $1)=yes + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags -o $lib' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + if test -z "`echo __ELF__ | $CC -E - | grep __ELF__`"; then + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-retain-symbols-file,$export_symbols -o $lib' + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-E' + _LT_TAGVAR(whole_archive_flag_spec, $1)=$wlarc'--whole-archive$convenience '$wlarc'--no-whole-archive' + fi + output_verbose_link_cmd=func_echo_all + else + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + + osf3* | osf4* | osf5*) + case $cc_basename in + KCC*) + # Kuck and Associates, Inc. (KAI) C++ Compiler + + # KCC will only create a shared library if the output file + # ends with ".so" (or ".sl" for HP-UX), so rename the library + # to its proper name (with version) after linking. + _LT_TAGVAR(archive_cmds, $1)='tempext=`echo $shared_ext | $SED -e '\''s/\([[^()0-9A-Za-z{}]]\)/\\\\\1/g'\''`; templib=`echo "$lib" | $SED -e "s/\$tempext\..*/.so/"`; $CC $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags --soname $soname -o \$templib; mv \$templib $lib' + + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath,$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + + # Archives containing C++ object files must be created using + # the KAI C++ compiler. 
+ case $host in + osf3*) _LT_TAGVAR(old_archive_cmds, $1)='$CC -Bstatic -o $oldlib $oldobjs' ;; + *) _LT_TAGVAR(old_archive_cmds, $1)='$CC -o $oldlib $oldobjs' ;; + esac + ;; + RCC*) + # Rational C++ 2.4.1 + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + cxx*) + case $host in + osf3*) + _LT_TAGVAR(allow_undefined_flag, $1)=' $wl-expect_unresolved $wl\*' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared$allow_undefined_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $soname `test -n "$verstring" && func_echo_all "$wl-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + ;; + *) + _LT_TAGVAR(allow_undefined_flag, $1)=' -expect_unresolved \*' + _LT_TAGVAR(archive_cmds, $1)='$CC -shared$allow_undefined_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags -msym -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='for i in `cat $export_symbols`; do printf "%s %s\\n" -exported_symbol "\$i" >> $lib.exp; done~ + echo "-hidden">> $lib.exp~ + $CC -shared$allow_undefined_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags -msym -soname $soname $wl-input $wl$lib.exp `test -n "$verstring" && $ECHO "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib~ + $RM $lib.exp' + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-rpath $libdir' + ;; + esac + + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. + # + # There doesn't appear to be a way to prevent this compiler from + # explicitly linking system object files so we need to strip them + # from the output so that they don't get included in the library + # dependencies. + output_verbose_link_cmd='templist=`$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP "ld" | $GREP -v "ld:"`; templist=`func_echo_all "$templist" | $SED "s/\(^.*ld.*\)\( .*ld.*$\)/\1/"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + ;; + *) + if test yes,no = "$GXX,$with_gnu_ld"; then + _LT_TAGVAR(allow_undefined_flag, $1)=' $wl-expect_unresolved $wl\*' + case $host in + osf3*) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared -nostdlib $allow_undefined_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + ;; + *) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -nostdlib $allow_undefined_flag $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-msym $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations -o $lib' + ;; + esac + + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-rpath $wl$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=: + + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. 
+ output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + + else + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + fi + ;; + esac + ;; + + psos*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + + sunos4*) + case $cc_basename in + CC*) + # Sun C++ 4.x + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + lcc*) + # Lucid + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + *) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + esac + ;; + + solaris*) + case $cc_basename in + CC* | sunCC*) + # Sun C++ 4.2, 5.x and Centerline C++ + _LT_TAGVAR(archive_cmds_need_lc,$1)=yes + _LT_TAGVAR(no_undefined_flag, $1)=' -zdefs' + _LT_TAGVAR(archive_cmds, $1)='$CC -G$allow_undefined_flag -h$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $CC -G$allow_undefined_flag $wl-M $wl$lib.exp -h$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags~$RM $lib.exp' + + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='-R$libdir' + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + case $host_os in + solaris2.[[0-5]] | solaris2.[[0-5]].*) ;; + *) + # The compiler driver will combine and reorder linker options, + # but understands '-z linker_flag'. + # Supported since Solaris 2.6 (maybe 2.5.1?) + _LT_TAGVAR(whole_archive_flag_spec, $1)='-z allextract$convenience -z defaultextract' + ;; + esac + _LT_TAGVAR(link_all_deplibs, $1)=yes + + output_verbose_link_cmd='func_echo_all' + + # Archives containing C++ object files must be created using + # "CC -xar", where "CC" is the Sun C++ compiler. This is + # necessary to make sure instantiated templates are included + # in the archive. + _LT_TAGVAR(old_archive_cmds, $1)='$CC -xar -o $oldlib $oldobjs' + ;; + gcx*) + # Green Hills C++ Compiler + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-h $wl$soname -o $lib' + + # The C++ compiler must be used to create the archive. + _LT_TAGVAR(old_archive_cmds, $1)='$CC $LDFLAGS -archive -o $oldlib $oldobjs' + ;; + *) + # GNU C++ compiler with Solaris linker + if test yes,no = "$GXX,$with_gnu_ld"; then + _LT_TAGVAR(no_undefined_flag, $1)=' $wl-z ${wl}defs' + if $CC --version | $GREP -v '^2\.7' > /dev/null; then + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $pic_flag -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-h $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $CC -shared $pic_flag -nostdlib $wl-M $wl$lib.exp $wl-h $wl$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags~$RM $lib.exp' + + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + else + # g++ 2.7 appears to require '-G' NOT '-shared' on this + # platform. 
+ _LT_TAGVAR(archive_cmds, $1)='$CC -G -nostdlib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags $wl-h $wl$soname -o $lib' + _LT_TAGVAR(archive_expsym_cmds, $1)='echo "{ global:" > $lib.exp~cat $export_symbols | $SED -e "s/\(.*\)/\1;/" >> $lib.exp~echo "local: *; };" >> $lib.exp~ + $CC -G -nostdlib $wl-M $wl$lib.exp $wl-h $wl$soname -o $lib $predep_objects $libobjs $deplibs $postdep_objects $compiler_flags~$RM $lib.exp' + + # Commands to make compiler produce verbose output that lists + # what "hidden" libraries, object files and flags are used when + # linking a shared library. + output_verbose_link_cmd='$CC -G $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + fi + + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-R $wl$libdir' + case $host_os in + solaris2.[[0-5]] | solaris2.[[0-5]].*) ;; + *) + _LT_TAGVAR(whole_archive_flag_spec, $1)='$wl-z ${wl}allextract$convenience $wl-z ${wl}defaultextract' + ;; + esac + fi + ;; + esac + ;; + + sysv4*uw2* | sysv5OpenUNIX* | sysv5UnixWare7.[[01]].[[10]]* | unixware7* | sco3.2v5.0.[[024]]*) + _LT_TAGVAR(no_undefined_flag, $1)='$wl-z,text' + _LT_TAGVAR(archive_cmds_need_lc, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + runpath_var='LD_RUN_PATH' + + case $cc_basename in + CC*) + _LT_TAGVAR(archive_cmds, $1)='$CC -G $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -G $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + ;; + *) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + ;; + esac + ;; + + sysv5* | sco3.2v5* | sco5v6*) + # Note: We CANNOT use -z defs as we might desire, because we do not + # link with -lc, and that would cause any symbols used from libc to + # always be unresolved, which means just about no library would + # ever link correctly. If we're not using GNU ld we use -z text + # though, which does catch some bad symbols but isn't as heavy-handed + # as -z defs. 
+ _LT_TAGVAR(no_undefined_flag, $1)='$wl-z,text' + _LT_TAGVAR(allow_undefined_flag, $1)='$wl-z,nodefs' + _LT_TAGVAR(archive_cmds_need_lc, $1)=no + _LT_TAGVAR(hardcode_shlibpath_var, $1)=no + _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-R,$libdir' + _LT_TAGVAR(hardcode_libdir_separator, $1)=':' + _LT_TAGVAR(link_all_deplibs, $1)=yes + _LT_TAGVAR(export_dynamic_flag_spec, $1)='$wl-Bexport' + runpath_var='LD_RUN_PATH' + + case $cc_basename in + CC*) + _LT_TAGVAR(archive_cmds, $1)='$CC -G $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -G $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(old_archive_cmds, $1)='$CC -Tprelink_objects $oldobjs~ + '"$_LT_TAGVAR(old_archive_cmds, $1)" + _LT_TAGVAR(reload_cmds, $1)='$CC -Tprelink_objects $reload_objs~ + '"$_LT_TAGVAR(reload_cmds, $1)" + ;; + *) + _LT_TAGVAR(archive_cmds, $1)='$CC -shared $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $wl-Bexport:$export_symbols $wl-h,$soname -o $lib $libobjs $deplibs $compiler_flags' + ;; + esac + ;; + + tandem*) + case $cc_basename in + NCC*) + # NonStop-UX NCC 3.20 + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + *) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + esac + ;; + + vxworks*) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + + *) + # FIXME: insert proper C++ library support + _LT_TAGVAR(ld_shlibs, $1)=no + ;; + esac + + AC_MSG_RESULT([$_LT_TAGVAR(ld_shlibs, $1)]) + test no = "$_LT_TAGVAR(ld_shlibs, $1)" && can_build_shared=no + + _LT_TAGVAR(GCC, $1)=$GXX + _LT_TAGVAR(LD, $1)=$LD + + ## CAVEAT EMPTOR: + ## There is no encapsulation within the following macros, do not change + ## the running order or otherwise move them around unless you know exactly + ## what you are doing... + _LT_SYS_HIDDEN_LIBDEPS($1) + _LT_COMPILER_PIC($1) + _LT_COMPILER_C_O($1) + _LT_COMPILER_FILE_LOCKS($1) + _LT_LINKER_SHLIBS($1) + _LT_SYS_DYNAMIC_LINKER($1) + _LT_LINKER_HARDCODE_LIBPATH($1) + + _LT_CONFIG($1) + fi # test -n "$compiler" + + CC=$lt_save_CC + CFLAGS=$lt_save_CFLAGS + LDCXX=$LD + LD=$lt_save_LD + GCC=$lt_save_GCC + with_gnu_ld=$lt_save_with_gnu_ld + lt_cv_path_LDCXX=$lt_cv_path_LD + lt_cv_path_LD=$lt_save_path_LD + lt_cv_prog_gnu_ldcxx=$lt_cv_prog_gnu_ld + lt_cv_prog_gnu_ld=$lt_save_with_gnu_ld +fi # test yes != "$_lt_caught_CXX_error" + +AC_LANG_POP +])# _LT_LANG_CXX_CONFIG + + +# _LT_FUNC_STRIPNAME_CNF +# ---------------------- +# func_stripname_cnf prefix suffix name +# strip PREFIX and SUFFIX off of NAME. +# PREFIX and SUFFIX must not contain globbing or regex special +# characters, hashes, percent signs, but SUFFIX may contain a leading +# dot (in which case that matches only a dot). +# +# This function is identical to the (non-XSI) version of func_stripname, +# except this one can be used by m4 code that may be executed by configure, +# rather than the libtool script. 
+m4_defun([_LT_FUNC_STRIPNAME_CNF],[dnl +AC_REQUIRE([_LT_DECL_SED]) +AC_REQUIRE([_LT_PROG_ECHO_BACKSLASH]) +func_stripname_cnf () +{ + case @S|@2 in + .*) func_stripname_result=`$ECHO "@S|@3" | $SED "s%^@S|@1%%; s%\\\\@S|@2\$%%"`;; + *) func_stripname_result=`$ECHO "@S|@3" | $SED "s%^@S|@1%%; s%@S|@2\$%%"`;; + esac +} # func_stripname_cnf +])# _LT_FUNC_STRIPNAME_CNF + + +# _LT_SYS_HIDDEN_LIBDEPS([TAGNAME]) +# --------------------------------- +# Figure out "hidden" library dependencies from verbose +# compiler output when linking a shared library. +# Parse the compiler output and extract the necessary +# objects, libraries and library flags. +m4_defun([_LT_SYS_HIDDEN_LIBDEPS], +[m4_require([_LT_FILEUTILS_DEFAULTS])dnl +AC_REQUIRE([_LT_FUNC_STRIPNAME_CNF])dnl +# Dependencies to place before and after the object being linked: +_LT_TAGVAR(predep_objects, $1)= +_LT_TAGVAR(postdep_objects, $1)= +_LT_TAGVAR(predeps, $1)= +_LT_TAGVAR(postdeps, $1)= +_LT_TAGVAR(compiler_lib_search_path, $1)= + +dnl we can't use the lt_simple_compile_test_code here, +dnl because it contains code intended for an executable, +dnl not a library. It's possible we should let each +dnl tag define a new lt_????_link_test_code variable, +dnl but it's only used here... +m4_if([$1], [], [cat > conftest.$ac_ext <<_LT_EOF +int a; +void foo (void) { a = 0; } +_LT_EOF +], [$1], [CXX], [cat > conftest.$ac_ext <<_LT_EOF +class Foo +{ +public: + Foo (void) { a = 0; } +private: + int a; +}; +_LT_EOF +], [$1], [F77], [cat > conftest.$ac_ext <<_LT_EOF + subroutine foo + implicit none + integer*4 a + a=0 + return + end +_LT_EOF +], [$1], [FC], [cat > conftest.$ac_ext <<_LT_EOF + subroutine foo + implicit none + integer a + a=0 + return + end +_LT_EOF +], [$1], [GCJ], [cat > conftest.$ac_ext <<_LT_EOF +public class foo { + private int a; + public void bar (void) { + a = 0; + } +}; +_LT_EOF +], [$1], [GO], [cat > conftest.$ac_ext <<_LT_EOF +package foo +func foo() { +} +_LT_EOF +]) + +_lt_libdeps_save_CFLAGS=$CFLAGS +case "$CC $CFLAGS " in #( +*\ -flto*\ *) CFLAGS="$CFLAGS -fno-lto" ;; +*\ -fwhopr*\ *) CFLAGS="$CFLAGS -fno-whopr" ;; +*\ -fuse-linker-plugin*\ *) CFLAGS="$CFLAGS -fno-use-linker-plugin" ;; +esac + +dnl Parse the compiler output and extract the necessary +dnl objects, libraries and library flags. +if AC_TRY_EVAL(ac_compile); then + # Parse the compiler output and extract the necessary + # objects, libraries and library flags. + + # Sentinel used to keep track of whether or not we are before + # the conftest object file. + pre_test_object_deps_done=no + + for p in `eval "$output_verbose_link_cmd"`; do + case $prev$p in + + -L* | -R* | -l*) + # Some compilers place space between "-{L,R}" and the path. + # Remove the space. + if test x-L = "$p" || + test x-R = "$p"; then + prev=$p + continue + fi + + # Expand the sysroot to ease extracting the directories later. + if test -z "$prev"; then + case $p in + -L*) func_stripname_cnf '-L' '' "$p"; prev=-L; p=$func_stripname_result ;; + -R*) func_stripname_cnf '-R' '' "$p"; prev=-R; p=$func_stripname_result ;; + -l*) func_stripname_cnf '-l' '' "$p"; prev=-l; p=$func_stripname_result ;; + esac + fi + case $p in + =*) func_stripname_cnf '=' '' "$p"; p=$lt_sysroot$func_stripname_result ;; + esac + if test no = "$pre_test_object_deps_done"; then + case $prev in + -L | -R) + # Internal compiler library paths should come after those + # provided the user. The postdeps already come after the + # user supplied libs so there is no need to process them. 
+ if test -z "$_LT_TAGVAR(compiler_lib_search_path, $1)"; then + _LT_TAGVAR(compiler_lib_search_path, $1)=$prev$p + else + _LT_TAGVAR(compiler_lib_search_path, $1)="${_LT_TAGVAR(compiler_lib_search_path, $1)} $prev$p" + fi + ;; + # The "-l" case would never come before the object being + # linked, so don't bother handling this case. + esac + else + if test -z "$_LT_TAGVAR(postdeps, $1)"; then + _LT_TAGVAR(postdeps, $1)=$prev$p + else + _LT_TAGVAR(postdeps, $1)="${_LT_TAGVAR(postdeps, $1)} $prev$p" + fi + fi + prev= + ;; + + *.lto.$objext) ;; # Ignore GCC LTO objects + *.$objext) + # This assumes that the test object file only shows up + # once in the compiler output. + if test "$p" = "conftest.$objext"; then + pre_test_object_deps_done=yes + continue + fi + + if test no = "$pre_test_object_deps_done"; then + if test -z "$_LT_TAGVAR(predep_objects, $1)"; then + _LT_TAGVAR(predep_objects, $1)=$p + else + _LT_TAGVAR(predep_objects, $1)="$_LT_TAGVAR(predep_objects, $1) $p" + fi + else + if test -z "$_LT_TAGVAR(postdep_objects, $1)"; then + _LT_TAGVAR(postdep_objects, $1)=$p + else + _LT_TAGVAR(postdep_objects, $1)="$_LT_TAGVAR(postdep_objects, $1) $p" + fi + fi + ;; + + *) ;; # Ignore the rest. + + esac + done + + # Clean up. + rm -f a.out a.exe +else + echo "libtool.m4: error: problem compiling $1 test program" +fi + +$RM -f confest.$objext +CFLAGS=$_lt_libdeps_save_CFLAGS + +# PORTME: override above test on systems where it is broken +m4_if([$1], [CXX], +[case $host_os in +interix[[3-9]]*) + # Interix 3.5 installs completely hosed .la files for C++, so rather than + # hack all around it, let's just trust "g++" to DTRT. + _LT_TAGVAR(predep_objects,$1)= + _LT_TAGVAR(postdep_objects,$1)= + _LT_TAGVAR(postdeps,$1)= + ;; +esac +]) + +case " $_LT_TAGVAR(postdeps, $1) " in +*" -lc "*) _LT_TAGVAR(archive_cmds_need_lc, $1)=no ;; +esac + _LT_TAGVAR(compiler_lib_search_dirs, $1)= +if test -n "${_LT_TAGVAR(compiler_lib_search_path, $1)}"; then + _LT_TAGVAR(compiler_lib_search_dirs, $1)=`echo " ${_LT_TAGVAR(compiler_lib_search_path, $1)}" | $SED -e 's! -L! !g' -e 's!^ !!'` +fi +_LT_TAGDECL([], [compiler_lib_search_dirs], [1], + [The directories searched by this compiler when creating a shared library]) +_LT_TAGDECL([], [predep_objects], [1], + [Dependencies to place before and after the objects being linked to + create a shared library]) +_LT_TAGDECL([], [postdep_objects], [1]) +_LT_TAGDECL([], [predeps], [1]) +_LT_TAGDECL([], [postdeps], [1]) +_LT_TAGDECL([], [compiler_lib_search_path], [1], + [The library search path used internally by the compiler when linking + a shared library]) +])# _LT_SYS_HIDDEN_LIBDEPS + + +# _LT_LANG_F77_CONFIG([TAG]) +# -------------------------- +# Ensure that the configuration variables for a Fortran 77 compiler are +# suitably defined. These variables are subsequently used by _LT_CONFIG +# to write the compiler configuration to 'libtool'. 
+m4_defun([_LT_LANG_F77_CONFIG], +[AC_LANG_PUSH(Fortran 77) +if test -z "$F77" || test no = "$F77"; then + _lt_disable_F77=yes +fi + +_LT_TAGVAR(archive_cmds_need_lc, $1)=no +_LT_TAGVAR(allow_undefined_flag, $1)= +_LT_TAGVAR(always_export_symbols, $1)=no +_LT_TAGVAR(archive_expsym_cmds, $1)= +_LT_TAGVAR(export_dynamic_flag_spec, $1)= +_LT_TAGVAR(hardcode_direct, $1)=no +_LT_TAGVAR(hardcode_direct_absolute, $1)=no +_LT_TAGVAR(hardcode_libdir_flag_spec, $1)= +_LT_TAGVAR(hardcode_libdir_separator, $1)= +_LT_TAGVAR(hardcode_minus_L, $1)=no +_LT_TAGVAR(hardcode_automatic, $1)=no +_LT_TAGVAR(inherit_rpath, $1)=no +_LT_TAGVAR(module_cmds, $1)= +_LT_TAGVAR(module_expsym_cmds, $1)= +_LT_TAGVAR(link_all_deplibs, $1)=unknown +_LT_TAGVAR(old_archive_cmds, $1)=$old_archive_cmds +_LT_TAGVAR(reload_flag, $1)=$reload_flag +_LT_TAGVAR(reload_cmds, $1)=$reload_cmds +_LT_TAGVAR(no_undefined_flag, $1)= +_LT_TAGVAR(whole_archive_flag_spec, $1)= +_LT_TAGVAR(enable_shared_with_static_runtimes, $1)=no + +# Source file extension for f77 test sources. +ac_ext=f + +# Object file extension for compiled f77 test sources. +objext=o +_LT_TAGVAR(objext, $1)=$objext + +# No sense in running all these tests if we already determined that +# the F77 compiler isn't working. Some variables (like enable_shared) +# are currently assumed to apply to all compilers on this platform, +# and will be corrupted by setting them based on a non-working compiler. +if test yes != "$_lt_disable_F77"; then + # Code to be used in simple compile tests + lt_simple_compile_test_code="\ + subroutine t + return + end +" + + # Code to be used in simple link tests + lt_simple_link_test_code="\ + program t + end +" + + # ltmain only uses $CC for tagged configurations so make sure $CC is set. + _LT_TAG_COMPILER + + # save warnings/boilerplate of simple test code + _LT_COMPILER_BOILERPLATE + _LT_LINKER_BOILERPLATE + + # Allow CC to be a program name with arguments. + lt_save_CC=$CC + lt_save_GCC=$GCC + lt_save_CFLAGS=$CFLAGS + CC=${F77-"f77"} + CFLAGS=$FFLAGS + compiler=$CC + _LT_TAGVAR(compiler, $1)=$CC + _LT_CC_BASENAME([$compiler]) + GCC=$G77 + if test -n "$compiler"; then + AC_MSG_CHECKING([if libtool supports shared libraries]) + AC_MSG_RESULT([$can_build_shared]) + + AC_MSG_CHECKING([whether to build shared libraries]) + test no = "$can_build_shared" && enable_shared=no + + # On AIX, shared libraries and static libraries use the same namespace, and + # are all built from PIC. + case $host_os in + aix3*) + test yes = "$enable_shared" && enable_static=no + if test -n "$RANLIB"; then + archive_cmds="$archive_cmds~\$RANLIB \$lib" + postinstall_cmds='$RANLIB $lib' + fi + ;; + aix[[4-9]]*) + if test ia64 != "$host_cpu"; then + case $enable_shared,$with_aix_soname,$aix_use_runtimelinking in + yes,aix,yes) ;; # shared object as lib.so file only + yes,svr4,*) ;; # shared object as lib.so archive member only + yes,*) enable_static=no ;; # shared object in lib.a archive as well + esac + fi + ;; + esac + AC_MSG_RESULT([$enable_shared]) + + AC_MSG_CHECKING([whether to build static libraries]) + # Make sure either enable_shared or enable_static is yes. + test yes = "$enable_shared" || enable_static=yes + AC_MSG_RESULT([$enable_static]) + + _LT_TAGVAR(GCC, $1)=$G77 + _LT_TAGVAR(LD, $1)=$LD + + ## CAVEAT EMPTOR: + ## There is no encapsulation within the following macros, do not change + ## the running order or otherwise move them around unless you know exactly + ## what you are doing... 
+ _LT_COMPILER_PIC($1) + _LT_COMPILER_C_O($1) + _LT_COMPILER_FILE_LOCKS($1) + _LT_LINKER_SHLIBS($1) + _LT_SYS_DYNAMIC_LINKER($1) + _LT_LINKER_HARDCODE_LIBPATH($1) + + _LT_CONFIG($1) + fi # test -n "$compiler" + + GCC=$lt_save_GCC + CC=$lt_save_CC + CFLAGS=$lt_save_CFLAGS +fi # test yes != "$_lt_disable_F77" + +AC_LANG_POP +])# _LT_LANG_F77_CONFIG + + +# _LT_LANG_FC_CONFIG([TAG]) +# ------------------------- +# Ensure that the configuration variables for a Fortran compiler are +# suitably defined. These variables are subsequently used by _LT_CONFIG +# to write the compiler configuration to 'libtool'. +m4_defun([_LT_LANG_FC_CONFIG], +[AC_LANG_PUSH(Fortran) + +if test -z "$FC" || test no = "$FC"; then + _lt_disable_FC=yes +fi + +_LT_TAGVAR(archive_cmds_need_lc, $1)=no +_LT_TAGVAR(allow_undefined_flag, $1)= +_LT_TAGVAR(always_export_symbols, $1)=no +_LT_TAGVAR(archive_expsym_cmds, $1)= +_LT_TAGVAR(export_dynamic_flag_spec, $1)= +_LT_TAGVAR(hardcode_direct, $1)=no +_LT_TAGVAR(hardcode_direct_absolute, $1)=no +_LT_TAGVAR(hardcode_libdir_flag_spec, $1)= +_LT_TAGVAR(hardcode_libdir_separator, $1)= +_LT_TAGVAR(hardcode_minus_L, $1)=no +_LT_TAGVAR(hardcode_automatic, $1)=no +_LT_TAGVAR(inherit_rpath, $1)=no +_LT_TAGVAR(module_cmds, $1)= +_LT_TAGVAR(module_expsym_cmds, $1)= +_LT_TAGVAR(link_all_deplibs, $1)=unknown +_LT_TAGVAR(old_archive_cmds, $1)=$old_archive_cmds +_LT_TAGVAR(reload_flag, $1)=$reload_flag +_LT_TAGVAR(reload_cmds, $1)=$reload_cmds +_LT_TAGVAR(no_undefined_flag, $1)= +_LT_TAGVAR(whole_archive_flag_spec, $1)= +_LT_TAGVAR(enable_shared_with_static_runtimes, $1)=no + +# Source file extension for fc test sources. +ac_ext=${ac_fc_srcext-f} + +# Object file extension for compiled fc test sources. +objext=o +_LT_TAGVAR(objext, $1)=$objext + +# No sense in running all these tests if we already determined that +# the FC compiler isn't working. Some variables (like enable_shared) +# are currently assumed to apply to all compilers on this platform, +# and will be corrupted by setting them based on a non-working compiler. +if test yes != "$_lt_disable_FC"; then + # Code to be used in simple compile tests + lt_simple_compile_test_code="\ + subroutine t + return + end +" + + # Code to be used in simple link tests + lt_simple_link_test_code="\ + program t + end +" + + # ltmain only uses $CC for tagged configurations so make sure $CC is set. + _LT_TAG_COMPILER + + # save warnings/boilerplate of simple test code + _LT_COMPILER_BOILERPLATE + _LT_LINKER_BOILERPLATE + + # Allow CC to be a program name with arguments. + lt_save_CC=$CC + lt_save_GCC=$GCC + lt_save_CFLAGS=$CFLAGS + CC=${FC-"f95"} + CFLAGS=$FCFLAGS + compiler=$CC + GCC=$ac_cv_fc_compiler_gnu + + _LT_TAGVAR(compiler, $1)=$CC + _LT_CC_BASENAME([$compiler]) + + if test -n "$compiler"; then + AC_MSG_CHECKING([if libtool supports shared libraries]) + AC_MSG_RESULT([$can_build_shared]) + + AC_MSG_CHECKING([whether to build shared libraries]) + test no = "$can_build_shared" && enable_shared=no + + # On AIX, shared libraries and static libraries use the same namespace, and + # are all built from PIC. 
+ case $host_os in + aix3*) + test yes = "$enable_shared" && enable_static=no + if test -n "$RANLIB"; then + archive_cmds="$archive_cmds~\$RANLIB \$lib" + postinstall_cmds='$RANLIB $lib' + fi + ;; + aix[[4-9]]*) + if test ia64 != "$host_cpu"; then + case $enable_shared,$with_aix_soname,$aix_use_runtimelinking in + yes,aix,yes) ;; # shared object as lib.so file only + yes,svr4,*) ;; # shared object as lib.so archive member only + yes,*) enable_static=no ;; # shared object in lib.a archive as well + esac + fi + ;; + esac + AC_MSG_RESULT([$enable_shared]) + + AC_MSG_CHECKING([whether to build static libraries]) + # Make sure either enable_shared or enable_static is yes. + test yes = "$enable_shared" || enable_static=yes + AC_MSG_RESULT([$enable_static]) + + _LT_TAGVAR(GCC, $1)=$ac_cv_fc_compiler_gnu + _LT_TAGVAR(LD, $1)=$LD + + ## CAVEAT EMPTOR: + ## There is no encapsulation within the following macros, do not change + ## the running order or otherwise move them around unless you know exactly + ## what you are doing... + _LT_SYS_HIDDEN_LIBDEPS($1) + _LT_COMPILER_PIC($1) + _LT_COMPILER_C_O($1) + _LT_COMPILER_FILE_LOCKS($1) + _LT_LINKER_SHLIBS($1) + _LT_SYS_DYNAMIC_LINKER($1) + _LT_LINKER_HARDCODE_LIBPATH($1) + + _LT_CONFIG($1) + fi # test -n "$compiler" + + GCC=$lt_save_GCC + CC=$lt_save_CC + CFLAGS=$lt_save_CFLAGS +fi # test yes != "$_lt_disable_FC" + +AC_LANG_POP +])# _LT_LANG_FC_CONFIG + + +# _LT_LANG_GCJ_CONFIG([TAG]) +# -------------------------- +# Ensure that the configuration variables for the GNU Java Compiler compiler +# are suitably defined. These variables are subsequently used by _LT_CONFIG +# to write the compiler configuration to 'libtool'. +m4_defun([_LT_LANG_GCJ_CONFIG], +[AC_REQUIRE([LT_PROG_GCJ])dnl +AC_LANG_SAVE + +# Source file extension for Java test sources. +ac_ext=java + +# Object file extension for compiled Java test sources. +objext=o +_LT_TAGVAR(objext, $1)=$objext + +# Code to be used in simple compile tests +lt_simple_compile_test_code="class foo {}" + +# Code to be used in simple link tests +lt_simple_link_test_code='public class conftest { public static void main(String[[]] argv) {}; }' + +# ltmain only uses $CC for tagged configurations so make sure $CC is set. +_LT_TAG_COMPILER + +# save warnings/boilerplate of simple test code +_LT_COMPILER_BOILERPLATE +_LT_LINKER_BOILERPLATE + +# Allow CC to be a program name with arguments. +lt_save_CC=$CC +lt_save_CFLAGS=$CFLAGS +lt_save_GCC=$GCC +GCC=yes +CC=${GCJ-"gcj"} +CFLAGS=$GCJFLAGS +compiler=$CC +_LT_TAGVAR(compiler, $1)=$CC +_LT_TAGVAR(LD, $1)=$LD +_LT_CC_BASENAME([$compiler]) + +# GCJ did not exist at the time GCC didn't implicitly link libc in. +_LT_TAGVAR(archive_cmds_need_lc, $1)=no + +_LT_TAGVAR(old_archive_cmds, $1)=$old_archive_cmds +_LT_TAGVAR(reload_flag, $1)=$reload_flag +_LT_TAGVAR(reload_cmds, $1)=$reload_cmds + +## CAVEAT EMPTOR: +## There is no encapsulation within the following macros, do not change +## the running order or otherwise move them around unless you know exactly +## what you are doing... +if test -n "$compiler"; then + _LT_COMPILER_NO_RTTI($1) + _LT_COMPILER_PIC($1) + _LT_COMPILER_C_O($1) + _LT_COMPILER_FILE_LOCKS($1) + _LT_LINKER_SHLIBS($1) + _LT_LINKER_HARDCODE_LIBPATH($1) + + _LT_CONFIG($1) +fi + +AC_LANG_RESTORE + +GCC=$lt_save_GCC +CC=$lt_save_CC +CFLAGS=$lt_save_CFLAGS +])# _LT_LANG_GCJ_CONFIG + + +# _LT_LANG_GO_CONFIG([TAG]) +# -------------------------- +# Ensure that the configuration variables for the GNU Go compiler +# are suitably defined. 
These variables are subsequently used by _LT_CONFIG +# to write the compiler configuration to 'libtool'. +m4_defun([_LT_LANG_GO_CONFIG], +[AC_REQUIRE([LT_PROG_GO])dnl +AC_LANG_SAVE + +# Source file extension for Go test sources. +ac_ext=go + +# Object file extension for compiled Go test sources. +objext=o +_LT_TAGVAR(objext, $1)=$objext + +# Code to be used in simple compile tests +lt_simple_compile_test_code="package main; func main() { }" + +# Code to be used in simple link tests +lt_simple_link_test_code='package main; func main() { }' + +# ltmain only uses $CC for tagged configurations so make sure $CC is set. +_LT_TAG_COMPILER + +# save warnings/boilerplate of simple test code +_LT_COMPILER_BOILERPLATE +_LT_LINKER_BOILERPLATE + +# Allow CC to be a program name with arguments. +lt_save_CC=$CC +lt_save_CFLAGS=$CFLAGS +lt_save_GCC=$GCC +GCC=yes +CC=${GOC-"gccgo"} +CFLAGS=$GOFLAGS +compiler=$CC +_LT_TAGVAR(compiler, $1)=$CC +_LT_TAGVAR(LD, $1)=$LD +_LT_CC_BASENAME([$compiler]) + +# Go did not exist at the time GCC didn't implicitly link libc in. +_LT_TAGVAR(archive_cmds_need_lc, $1)=no + +_LT_TAGVAR(old_archive_cmds, $1)=$old_archive_cmds +_LT_TAGVAR(reload_flag, $1)=$reload_flag +_LT_TAGVAR(reload_cmds, $1)=$reload_cmds + +## CAVEAT EMPTOR: +## There is no encapsulation within the following macros, do not change +## the running order or otherwise move them around unless you know exactly +## what you are doing... +if test -n "$compiler"; then + _LT_COMPILER_NO_RTTI($1) + _LT_COMPILER_PIC($1) + _LT_COMPILER_C_O($1) + _LT_COMPILER_FILE_LOCKS($1) + _LT_LINKER_SHLIBS($1) + _LT_LINKER_HARDCODE_LIBPATH($1) + + _LT_CONFIG($1) +fi + +AC_LANG_RESTORE + +GCC=$lt_save_GCC +CC=$lt_save_CC +CFLAGS=$lt_save_CFLAGS +])# _LT_LANG_GO_CONFIG + + +# _LT_LANG_RC_CONFIG([TAG]) +# ------------------------- +# Ensure that the configuration variables for the Windows resource compiler +# are suitably defined. These variables are subsequently used by _LT_CONFIG +# to write the compiler configuration to 'libtool'. +m4_defun([_LT_LANG_RC_CONFIG], +[AC_REQUIRE([LT_PROG_RC])dnl +AC_LANG_SAVE + +# Source file extension for RC test sources. +ac_ext=rc + +# Object file extension for compiled RC test sources. +objext=o +_LT_TAGVAR(objext, $1)=$objext + +# Code to be used in simple compile tests +lt_simple_compile_test_code='sample MENU { MENUITEM "&Soup", 100, CHECKED }' + +# Code to be used in simple link tests +lt_simple_link_test_code=$lt_simple_compile_test_code + +# ltmain only uses $CC for tagged configurations so make sure $CC is set. +_LT_TAG_COMPILER + +# save warnings/boilerplate of simple test code +_LT_COMPILER_BOILERPLATE +_LT_LINKER_BOILERPLATE + +# Allow CC to be a program name with arguments. 
+lt_save_CC=$CC +lt_save_CFLAGS=$CFLAGS +lt_save_GCC=$GCC +GCC= +CC=${RC-"windres"} +CFLAGS= +compiler=$CC +_LT_TAGVAR(compiler, $1)=$CC +_LT_CC_BASENAME([$compiler]) +_LT_TAGVAR(lt_cv_prog_compiler_c_o, $1)=yes + +if test -n "$compiler"; then + : + _LT_CONFIG($1) +fi + +GCC=$lt_save_GCC +AC_LANG_RESTORE +CC=$lt_save_CC +CFLAGS=$lt_save_CFLAGS +])# _LT_LANG_RC_CONFIG + + +# LT_PROG_GCJ +# ----------- +AC_DEFUN([LT_PROG_GCJ], +[m4_ifdef([AC_PROG_GCJ], [AC_PROG_GCJ], + [m4_ifdef([A][M_PROG_GCJ], [A][M_PROG_GCJ], + [AC_CHECK_TOOL(GCJ, gcj,) + test set = "${GCJFLAGS+set}" || GCJFLAGS="-g -O2" + AC_SUBST(GCJFLAGS)])])[]dnl +]) + +# Old name: +AU_ALIAS([LT_AC_PROG_GCJ], [LT_PROG_GCJ]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([LT_AC_PROG_GCJ], []) + + +# LT_PROG_GO +# ---------- +AC_DEFUN([LT_PROG_GO], +[AC_CHECK_TOOL(GOC, gccgo,) +]) + + +# LT_PROG_RC +# ---------- +AC_DEFUN([LT_PROG_RC], +[AC_CHECK_TOOL(RC, windres,) +]) + +# Old name: +AU_ALIAS([LT_AC_PROG_RC], [LT_PROG_RC]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([LT_AC_PROG_RC], []) + + +# _LT_DECL_EGREP +# -------------- +# If we don't have a new enough Autoconf to choose the best grep +# available, choose the one first in the user's PATH. +m4_defun([_LT_DECL_EGREP], +[AC_REQUIRE([AC_PROG_EGREP])dnl +AC_REQUIRE([AC_PROG_FGREP])dnl +test -z "$GREP" && GREP=grep +_LT_DECL([], [GREP], [1], [A grep program that handles long lines]) +_LT_DECL([], [EGREP], [1], [An ERE matcher]) +_LT_DECL([], [FGREP], [1], [A literal string matcher]) +dnl Non-bleeding-edge autoconf doesn't subst GREP, so do it here too +AC_SUBST([GREP]) +]) + + +# _LT_DECL_OBJDUMP +# -------------- +# If we don't have a new enough Autoconf to choose the best objdump +# available, choose the one first in the user's PATH. +m4_defun([_LT_DECL_OBJDUMP], +[AC_CHECK_TOOL(OBJDUMP, objdump, false) +test -z "$OBJDUMP" && OBJDUMP=objdump +_LT_DECL([], [OBJDUMP], [1], [An object symbol dumper]) +AC_SUBST([OBJDUMP]) +]) + +# _LT_DECL_DLLTOOL +# ---------------- +# Ensure DLLTOOL variable is set. +m4_defun([_LT_DECL_DLLTOOL], +[AC_CHECK_TOOL(DLLTOOL, dlltool, false) +test -z "$DLLTOOL" && DLLTOOL=dlltool +_LT_DECL([], [DLLTOOL], [1], [DLL creation program]) +AC_SUBST([DLLTOOL]) +]) + +# _LT_DECL_SED +# ------------ +# Check for a fully-functional sed program, that truncates +# as few characters as possible. Prefer GNU sed if found. +m4_defun([_LT_DECL_SED], +[AC_PROG_SED +test -z "$SED" && SED=sed +Xsed="$SED -e 1s/^X//" +_LT_DECL([], [SED], [1], [A sed program that does not truncate output]) +_LT_DECL([], [Xsed], ["\$SED -e 1s/^X//"], + [Sed that helps us avoid accidentally triggering echo(1) options like -n]) +])# _LT_DECL_SED + +m4_ifndef([AC_PROG_SED], [ +############################################################ +# NOTE: This macro has been submitted for inclusion into # +# GNU Autoconf as AC_PROG_SED. When it is available in # +# a released version of Autoconf we should remove this # +# macro and use it instead. # +############################################################ + +m4_defun([AC_PROG_SED], +[AC_MSG_CHECKING([for a sed that does not truncate output]) +AC_CACHE_VAL(lt_cv_path_SED, +[# Loop through the user's path and test for sed and gsed. +# Then use that list of sed's as ones to test for truncation. +as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for lt_ac_prog in sed gsed; do + for ac_exec_ext in '' $ac_executable_extensions; do + if $as_executable_p "$as_dir/$lt_ac_prog$ac_exec_ext"; then + lt_ac_sed_list="$lt_ac_sed_list $as_dir/$lt_ac_prog$ac_exec_ext" + fi + done + done +done +IFS=$as_save_IFS +lt_ac_max=0 +lt_ac_count=0 +# Add /usr/xpg4/bin/sed as it is typically found on Solaris +# along with /bin/sed that truncates output. +for lt_ac_sed in $lt_ac_sed_list /usr/xpg4/bin/sed; do + test ! -f "$lt_ac_sed" && continue + cat /dev/null > conftest.in + lt_ac_count=0 + echo $ECHO_N "0123456789$ECHO_C" >conftest.in + # Check for GNU sed and select it if it is found. + if "$lt_ac_sed" --version 2>&1 < /dev/null | grep 'GNU' > /dev/null; then + lt_cv_path_SED=$lt_ac_sed + break + fi + while true; do + cat conftest.in conftest.in >conftest.tmp + mv conftest.tmp conftest.in + cp conftest.in conftest.nl + echo >>conftest.nl + $lt_ac_sed -e 's/a$//' < conftest.nl >conftest.out || break + cmp -s conftest.out conftest.nl || break + # 10000 chars as input seems more than enough + test 10 -lt "$lt_ac_count" && break + lt_ac_count=`expr $lt_ac_count + 1` + if test "$lt_ac_count" -gt "$lt_ac_max"; then + lt_ac_max=$lt_ac_count + lt_cv_path_SED=$lt_ac_sed + fi + done +done +]) +SED=$lt_cv_path_SED +AC_SUBST([SED]) +AC_MSG_RESULT([$SED]) +])#AC_PROG_SED +])#m4_ifndef + +# Old name: +AU_ALIAS([LT_AC_PROG_SED], [AC_PROG_SED]) +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([LT_AC_PROG_SED], []) + + +# _LT_CHECK_SHELL_FEATURES +# ------------------------ +# Find out whether the shell is Bourne or XSI compatible, +# or has some other useful features. +m4_defun([_LT_CHECK_SHELL_FEATURES], +[if ( (MAIL=60; unset MAIL) || exit) >/dev/null 2>&1; then + lt_unset=unset +else + lt_unset=false +fi +_LT_DECL([], [lt_unset], [0], [whether the shell understands "unset"])dnl + +# test EBCDIC or ASCII +case `echo X|tr X '\101'` in + A) # ASCII based system + # \n is not interpreted correctly by Solaris 8 /usr/ucb/tr + lt_SP2NL='tr \040 \012' + lt_NL2SP='tr \015\012 \040\040' + ;; + *) # EBCDIC based system + lt_SP2NL='tr \100 \n' + lt_NL2SP='tr \r\n \100\100' + ;; +esac +_LT_DECL([SP2NL], [lt_SP2NL], [1], [turn spaces into newlines])dnl +_LT_DECL([NL2SP], [lt_NL2SP], [1], [turn newlines into spaces])dnl +])# _LT_CHECK_SHELL_FEATURES + + +# _LT_PATH_CONVERSION_FUNCTIONS +# ----------------------------- +# Determine what file name conversion functions should be used by +# func_to_host_file (and, implicitly, by func_to_host_path). These are needed +# for certain cross-compile configurations and native mingw. 
+m4_defun([_LT_PATH_CONVERSION_FUNCTIONS], +[AC_REQUIRE([AC_CANONICAL_HOST])dnl +AC_REQUIRE([AC_CANONICAL_BUILD])dnl +AC_MSG_CHECKING([how to convert $build file names to $host format]) +AC_CACHE_VAL(lt_cv_to_host_file_cmd, +[case $host in + *-*-mingw* ) + case $build in + *-*-mingw* ) # actually msys + lt_cv_to_host_file_cmd=func_convert_file_msys_to_w32 + ;; + *-*-cygwin* ) + lt_cv_to_host_file_cmd=func_convert_file_cygwin_to_w32 + ;; + * ) # otherwise, assume *nix + lt_cv_to_host_file_cmd=func_convert_file_nix_to_w32 + ;; + esac + ;; + *-*-cygwin* ) + case $build in + *-*-mingw* ) # actually msys + lt_cv_to_host_file_cmd=func_convert_file_msys_to_cygwin + ;; + *-*-cygwin* ) + lt_cv_to_host_file_cmd=func_convert_file_noop + ;; + * ) # otherwise, assume *nix + lt_cv_to_host_file_cmd=func_convert_file_nix_to_cygwin + ;; + esac + ;; + * ) # unhandled hosts (and "normal" native builds) + lt_cv_to_host_file_cmd=func_convert_file_noop + ;; +esac +]) +to_host_file_cmd=$lt_cv_to_host_file_cmd +AC_MSG_RESULT([$lt_cv_to_host_file_cmd]) +_LT_DECL([to_host_file_cmd], [lt_cv_to_host_file_cmd], + [0], [convert $build file names to $host format])dnl + +AC_MSG_CHECKING([how to convert $build file names to toolchain format]) +AC_CACHE_VAL(lt_cv_to_tool_file_cmd, +[#assume ordinary cross tools, or native build. +lt_cv_to_tool_file_cmd=func_convert_file_noop +case $host in + *-*-mingw* ) + case $build in + *-*-mingw* ) # actually msys + lt_cv_to_tool_file_cmd=func_convert_file_msys_to_w32 + ;; + esac + ;; +esac +]) +to_tool_file_cmd=$lt_cv_to_tool_file_cmd +AC_MSG_RESULT([$lt_cv_to_tool_file_cmd]) +_LT_DECL([to_tool_file_cmd], [lt_cv_to_tool_file_cmd], + [0], [convert $build files to toolchain format])dnl +])# _LT_PATH_CONVERSION_FUNCTIONS diff --git a/m4/ltoptions.m4 b/m4/ltoptions.m4 new file mode 100644 index 0000000..94b0829 --- /dev/null +++ b/m4/ltoptions.m4 @@ -0,0 +1,437 @@ +# Helper functions for option handling. -*- Autoconf -*- +# +# Copyright (C) 2004-2005, 2007-2009, 2011-2015 Free Software +# Foundation, Inc. +# Written by Gary V. Vaughan, 2004 +# +# This file is free software; the Free Software Foundation gives +# unlimited permission to copy and/or distribute it, with or without +# modifications, as long as this notice is preserved. + +# serial 8 ltoptions.m4 + +# This is to help aclocal find these macros, as it can't see m4_define. +AC_DEFUN([LTOPTIONS_VERSION], [m4_if([1])]) + + +# _LT_MANGLE_OPTION(MACRO-NAME, OPTION-NAME) +# ------------------------------------------ +m4_define([_LT_MANGLE_OPTION], +[[_LT_OPTION_]m4_bpatsubst($1__$2, [[^a-zA-Z0-9_]], [_])]) + + +# _LT_SET_OPTION(MACRO-NAME, OPTION-NAME) +# --------------------------------------- +# Set option OPTION-NAME for macro MACRO-NAME, and if there is a +# matching handler defined, dispatch to it. Other OPTION-NAMEs are +# saved as a flag. +m4_define([_LT_SET_OPTION], +[m4_define(_LT_MANGLE_OPTION([$1], [$2]))dnl +m4_ifdef(_LT_MANGLE_DEFUN([$1], [$2]), + _LT_MANGLE_DEFUN([$1], [$2]), + [m4_warning([Unknown $1 option '$2'])])[]dnl +]) + + +# _LT_IF_OPTION(MACRO-NAME, OPTION-NAME, IF-SET, [IF-NOT-SET]) +# ------------------------------------------------------------ +# Execute IF-SET if OPTION is set, IF-NOT-SET otherwise. +m4_define([_LT_IF_OPTION], +[m4_ifdef(_LT_MANGLE_OPTION([$1], [$2]), [$3], [$4])]) + + +# _LT_UNLESS_OPTIONS(MACRO-NAME, OPTION-LIST, IF-NOT-SET) +# ------------------------------------------------------- +# Execute IF-NOT-SET unless all options in OPTION-LIST for MACRO-NAME +# are set. 
+m4_define([_LT_UNLESS_OPTIONS], +[m4_foreach([_LT_Option], m4_split(m4_normalize([$2])), + [m4_ifdef(_LT_MANGLE_OPTION([$1], _LT_Option), + [m4_define([$0_found])])])[]dnl +m4_ifdef([$0_found], [m4_undefine([$0_found])], [$3 +])[]dnl +]) + + +# _LT_SET_OPTIONS(MACRO-NAME, OPTION-LIST) +# ---------------------------------------- +# OPTION-LIST is a space-separated list of Libtool options associated +# with MACRO-NAME. If any OPTION has a matching handler declared with +# LT_OPTION_DEFINE, dispatch to that macro; otherwise complain about +# the unknown option and exit. +m4_defun([_LT_SET_OPTIONS], +[# Set options +m4_foreach([_LT_Option], m4_split(m4_normalize([$2])), + [_LT_SET_OPTION([$1], _LT_Option)]) + +m4_if([$1],[LT_INIT],[ + dnl + dnl Simply set some default values (i.e off) if boolean options were not + dnl specified: + _LT_UNLESS_OPTIONS([LT_INIT], [dlopen], [enable_dlopen=no + ]) + _LT_UNLESS_OPTIONS([LT_INIT], [win32-dll], [enable_win32_dll=no + ]) + dnl + dnl If no reference was made to various pairs of opposing options, then + dnl we run the default mode handler for the pair. For example, if neither + dnl 'shared' nor 'disable-shared' was passed, we enable building of shared + dnl archives by default: + _LT_UNLESS_OPTIONS([LT_INIT], [shared disable-shared], [_LT_ENABLE_SHARED]) + _LT_UNLESS_OPTIONS([LT_INIT], [static disable-static], [_LT_ENABLE_STATIC]) + _LT_UNLESS_OPTIONS([LT_INIT], [pic-only no-pic], [_LT_WITH_PIC]) + _LT_UNLESS_OPTIONS([LT_INIT], [fast-install disable-fast-install], + [_LT_ENABLE_FAST_INSTALL]) + _LT_UNLESS_OPTIONS([LT_INIT], [aix-soname=aix aix-soname=both aix-soname=svr4], + [_LT_WITH_AIX_SONAME([aix])]) + ]) +])# _LT_SET_OPTIONS + + +## --------------------------------- ## +## Macros to handle LT_INIT options. ## +## --------------------------------- ## + +# _LT_MANGLE_DEFUN(MACRO-NAME, OPTION-NAME) +# ----------------------------------------- +m4_define([_LT_MANGLE_DEFUN], +[[_LT_OPTION_DEFUN_]m4_bpatsubst(m4_toupper([$1__$2]), [[^A-Z0-9_]], [_])]) + + +# LT_OPTION_DEFINE(MACRO-NAME, OPTION-NAME, CODE) +# ----------------------------------------------- +m4_define([LT_OPTION_DEFINE], +[m4_define(_LT_MANGLE_DEFUN([$1], [$2]), [$3])[]dnl +])# LT_OPTION_DEFINE + + +# dlopen +# ------ +LT_OPTION_DEFINE([LT_INIT], [dlopen], [enable_dlopen=yes +]) + +AU_DEFUN([AC_LIBTOOL_DLOPEN], +[_LT_SET_OPTION([LT_INIT], [dlopen]) +AC_DIAGNOSE([obsolete], +[$0: Remove this warning and the call to _LT_SET_OPTION when you +put the 'dlopen' option into LT_INIT's first parameter.]) +]) + +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_DLOPEN], []) + + +# win32-dll +# --------- +# Declare package support for building win32 dll's. 
+LT_OPTION_DEFINE([LT_INIT], [win32-dll], +[enable_win32_dll=yes + +case $host in +*-*-cygwin* | *-*-mingw* | *-*-pw32* | *-*-cegcc*) + AC_CHECK_TOOL(AS, as, false) + AC_CHECK_TOOL(DLLTOOL, dlltool, false) + AC_CHECK_TOOL(OBJDUMP, objdump, false) + ;; +esac + +test -z "$AS" && AS=as +_LT_DECL([], [AS], [1], [Assembler program])dnl + +test -z "$DLLTOOL" && DLLTOOL=dlltool +_LT_DECL([], [DLLTOOL], [1], [DLL creation program])dnl + +test -z "$OBJDUMP" && OBJDUMP=objdump +_LT_DECL([], [OBJDUMP], [1], [Object dumper program])dnl +])# win32-dll + +AU_DEFUN([AC_LIBTOOL_WIN32_DLL], +[AC_REQUIRE([AC_CANONICAL_HOST])dnl +_LT_SET_OPTION([LT_INIT], [win32-dll]) +AC_DIAGNOSE([obsolete], +[$0: Remove this warning and the call to _LT_SET_OPTION when you +put the 'win32-dll' option into LT_INIT's first parameter.]) +]) + +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_WIN32_DLL], []) + + +# _LT_ENABLE_SHARED([DEFAULT]) +# ---------------------------- +# implement the --enable-shared flag, and supports the 'shared' and +# 'disable-shared' LT_INIT options. +# DEFAULT is either 'yes' or 'no'. If omitted, it defaults to 'yes'. +m4_define([_LT_ENABLE_SHARED], +[m4_define([_LT_ENABLE_SHARED_DEFAULT], [m4_if($1, no, no, yes)])dnl +AC_ARG_ENABLE([shared], + [AS_HELP_STRING([--enable-shared@<:@=PKGS@:>@], + [build shared libraries @<:@default=]_LT_ENABLE_SHARED_DEFAULT[@:>@])], + [p=${PACKAGE-default} + case $enableval in + yes) enable_shared=yes ;; + no) enable_shared=no ;; + *) + enable_shared=no + # Look at the argument we got. We use all the common list separators. + lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for pkg in $enableval; do + IFS=$lt_save_ifs + if test "X$pkg" = "X$p"; then + enable_shared=yes + fi + done + IFS=$lt_save_ifs + ;; + esac], + [enable_shared=]_LT_ENABLE_SHARED_DEFAULT) + + _LT_DECL([build_libtool_libs], [enable_shared], [0], + [Whether or not to build shared libraries]) +])# _LT_ENABLE_SHARED + +LT_OPTION_DEFINE([LT_INIT], [shared], [_LT_ENABLE_SHARED([yes])]) +LT_OPTION_DEFINE([LT_INIT], [disable-shared], [_LT_ENABLE_SHARED([no])]) + +# Old names: +AC_DEFUN([AC_ENABLE_SHARED], +[_LT_SET_OPTION([LT_INIT], m4_if([$1], [no], [disable-])[shared]) +]) + +AC_DEFUN([AC_DISABLE_SHARED], +[_LT_SET_OPTION([LT_INIT], [disable-shared]) +]) + +AU_DEFUN([AM_ENABLE_SHARED], [AC_ENABLE_SHARED($@)]) +AU_DEFUN([AM_DISABLE_SHARED], [AC_DISABLE_SHARED($@)]) + +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AM_ENABLE_SHARED], []) +dnl AC_DEFUN([AM_DISABLE_SHARED], []) + + + +# _LT_ENABLE_STATIC([DEFAULT]) +# ---------------------------- +# implement the --enable-static flag, and support the 'static' and +# 'disable-static' LT_INIT options. +# DEFAULT is either 'yes' or 'no'. If omitted, it defaults to 'yes'. +m4_define([_LT_ENABLE_STATIC], +[m4_define([_LT_ENABLE_STATIC_DEFAULT], [m4_if($1, no, no, yes)])dnl +AC_ARG_ENABLE([static], + [AS_HELP_STRING([--enable-static@<:@=PKGS@:>@], + [build static libraries @<:@default=]_LT_ENABLE_STATIC_DEFAULT[@:>@])], + [p=${PACKAGE-default} + case $enableval in + yes) enable_static=yes ;; + no) enable_static=no ;; + *) + enable_static=no + # Look at the argument we got. We use all the common list separators. 
+ lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for pkg in $enableval; do + IFS=$lt_save_ifs + if test "X$pkg" = "X$p"; then + enable_static=yes + fi + done + IFS=$lt_save_ifs + ;; + esac], + [enable_static=]_LT_ENABLE_STATIC_DEFAULT) + + _LT_DECL([build_old_libs], [enable_static], [0], + [Whether or not to build static libraries]) +])# _LT_ENABLE_STATIC + +LT_OPTION_DEFINE([LT_INIT], [static], [_LT_ENABLE_STATIC([yes])]) +LT_OPTION_DEFINE([LT_INIT], [disable-static], [_LT_ENABLE_STATIC([no])]) + +# Old names: +AC_DEFUN([AC_ENABLE_STATIC], +[_LT_SET_OPTION([LT_INIT], m4_if([$1], [no], [disable-])[static]) +]) + +AC_DEFUN([AC_DISABLE_STATIC], +[_LT_SET_OPTION([LT_INIT], [disable-static]) +]) + +AU_DEFUN([AM_ENABLE_STATIC], [AC_ENABLE_STATIC($@)]) +AU_DEFUN([AM_DISABLE_STATIC], [AC_DISABLE_STATIC($@)]) + +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AM_ENABLE_STATIC], []) +dnl AC_DEFUN([AM_DISABLE_STATIC], []) + + + +# _LT_ENABLE_FAST_INSTALL([DEFAULT]) +# ---------------------------------- +# implement the --enable-fast-install flag, and support the 'fast-install' +# and 'disable-fast-install' LT_INIT options. +# DEFAULT is either 'yes' or 'no'. If omitted, it defaults to 'yes'. +m4_define([_LT_ENABLE_FAST_INSTALL], +[m4_define([_LT_ENABLE_FAST_INSTALL_DEFAULT], [m4_if($1, no, no, yes)])dnl +AC_ARG_ENABLE([fast-install], + [AS_HELP_STRING([--enable-fast-install@<:@=PKGS@:>@], + [optimize for fast installation @<:@default=]_LT_ENABLE_FAST_INSTALL_DEFAULT[@:>@])], + [p=${PACKAGE-default} + case $enableval in + yes) enable_fast_install=yes ;; + no) enable_fast_install=no ;; + *) + enable_fast_install=no + # Look at the argument we got. We use all the common list separators. + lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for pkg in $enableval; do + IFS=$lt_save_ifs + if test "X$pkg" = "X$p"; then + enable_fast_install=yes + fi + done + IFS=$lt_save_ifs + ;; + esac], + [enable_fast_install=]_LT_ENABLE_FAST_INSTALL_DEFAULT) + +_LT_DECL([fast_install], [enable_fast_install], [0], + [Whether or not to optimize for fast installation])dnl +])# _LT_ENABLE_FAST_INSTALL + +LT_OPTION_DEFINE([LT_INIT], [fast-install], [_LT_ENABLE_FAST_INSTALL([yes])]) +LT_OPTION_DEFINE([LT_INIT], [disable-fast-install], [_LT_ENABLE_FAST_INSTALL([no])]) + +# Old names: +AU_DEFUN([AC_ENABLE_FAST_INSTALL], +[_LT_SET_OPTION([LT_INIT], m4_if([$1], [no], [disable-])[fast-install]) +AC_DIAGNOSE([obsolete], +[$0: Remove this warning and the call to _LT_SET_OPTION when you put +the 'fast-install' option into LT_INIT's first parameter.]) +]) + +AU_DEFUN([AC_DISABLE_FAST_INSTALL], +[_LT_SET_OPTION([LT_INIT], [disable-fast-install]) +AC_DIAGNOSE([obsolete], +[$0: Remove this warning and the call to _LT_SET_OPTION when you put +the 'disable-fast-install' option into LT_INIT's first parameter.]) +]) + +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_ENABLE_FAST_INSTALL], []) +dnl AC_DEFUN([AM_DISABLE_FAST_INSTALL], []) + + +# _LT_WITH_AIX_SONAME([DEFAULT]) +# ---------------------------------- +# implement the --with-aix-soname flag, and support the `aix-soname=aix' +# and `aix-soname=both' and `aix-soname=svr4' LT_INIT options. DEFAULT +# is either `aix', `both' or `svr4'. If omitted, it defaults to `aix'. 
+m4_define([_LT_WITH_AIX_SONAME], +[m4_define([_LT_WITH_AIX_SONAME_DEFAULT], [m4_if($1, svr4, svr4, m4_if($1, both, both, aix))])dnl +shared_archive_member_spec= +case $host,$enable_shared in +power*-*-aix[[5-9]]*,yes) + AC_MSG_CHECKING([which variant of shared library versioning to provide]) + AC_ARG_WITH([aix-soname], + [AS_HELP_STRING([--with-aix-soname=aix|svr4|both], + [shared library versioning (aka "SONAME") variant to provide on AIX, @<:@default=]_LT_WITH_AIX_SONAME_DEFAULT[@:>@.])], + [case $withval in + aix|svr4|both) + ;; + *) + AC_MSG_ERROR([Unknown argument to --with-aix-soname]) + ;; + esac + lt_cv_with_aix_soname=$with_aix_soname], + [AC_CACHE_VAL([lt_cv_with_aix_soname], + [lt_cv_with_aix_soname=]_LT_WITH_AIX_SONAME_DEFAULT) + with_aix_soname=$lt_cv_with_aix_soname]) + AC_MSG_RESULT([$with_aix_soname]) + if test aix != "$with_aix_soname"; then + # For the AIX way of multilib, we name the shared archive member + # based on the bitwidth used, traditionally 'shr.o' or 'shr_64.o', + # and 'shr.imp' or 'shr_64.imp', respectively, for the Import File. + # Even when GNU compilers ignore OBJECT_MODE but need '-maix64' flag, + # the AIX toolchain works better with OBJECT_MODE set (default 32). + if test 64 = "${OBJECT_MODE-32}"; then + shared_archive_member_spec=shr_64 + else + shared_archive_member_spec=shr + fi + fi + ;; +*) + with_aix_soname=aix + ;; +esac + +_LT_DECL([], [shared_archive_member_spec], [0], + [Shared archive member basename, for filename based shared library versioning on AIX])dnl +])# _LT_WITH_AIX_SONAME + +LT_OPTION_DEFINE([LT_INIT], [aix-soname=aix], [_LT_WITH_AIX_SONAME([aix])]) +LT_OPTION_DEFINE([LT_INIT], [aix-soname=both], [_LT_WITH_AIX_SONAME([both])]) +LT_OPTION_DEFINE([LT_INIT], [aix-soname=svr4], [_LT_WITH_AIX_SONAME([svr4])]) + + +# _LT_WITH_PIC([MODE]) +# -------------------- +# implement the --with-pic flag, and support the 'pic-only' and 'no-pic' +# LT_INIT options. +# MODE is either 'yes' or 'no'. If omitted, it defaults to 'both'. +m4_define([_LT_WITH_PIC], +[AC_ARG_WITH([pic], + [AS_HELP_STRING([--with-pic@<:@=PKGS@:>@], + [try to use only PIC/non-PIC objects @<:@default=use both@:>@])], + [lt_p=${PACKAGE-default} + case $withval in + yes|no) pic_mode=$withval ;; + *) + pic_mode=default + # Look at the argument we got. We use all the common list separators. 
+ lt_save_ifs=$IFS; IFS=$IFS$PATH_SEPARATOR, + for lt_pkg in $withval; do + IFS=$lt_save_ifs + if test "X$lt_pkg" = "X$lt_p"; then + pic_mode=yes + fi + done + IFS=$lt_save_ifs + ;; + esac], + [pic_mode=m4_default([$1], [default])]) + +_LT_DECL([], [pic_mode], [0], [What type of objects to build])dnl +])# _LT_WITH_PIC + +LT_OPTION_DEFINE([LT_INIT], [pic-only], [_LT_WITH_PIC([yes])]) +LT_OPTION_DEFINE([LT_INIT], [no-pic], [_LT_WITH_PIC([no])]) + +# Old name: +AU_DEFUN([AC_LIBTOOL_PICMODE], +[_LT_SET_OPTION([LT_INIT], [pic-only]) +AC_DIAGNOSE([obsolete], +[$0: Remove this warning and the call to _LT_SET_OPTION when you +put the 'pic-only' option into LT_INIT's first parameter.]) +]) + +dnl aclocal-1.4 backwards compatibility: +dnl AC_DEFUN([AC_LIBTOOL_PICMODE], []) + +## ----------------- ## +## LTDL_INIT Options ## +## ----------------- ## + +m4_define([_LTDL_MODE], []) +LT_OPTION_DEFINE([LTDL_INIT], [nonrecursive], + [m4_define([_LTDL_MODE], [nonrecursive])]) +LT_OPTION_DEFINE([LTDL_INIT], [recursive], + [m4_define([_LTDL_MODE], [recursive])]) +LT_OPTION_DEFINE([LTDL_INIT], [subproject], + [m4_define([_LTDL_MODE], [subproject])]) + +m4_define([_LTDL_TYPE], []) +LT_OPTION_DEFINE([LTDL_INIT], [installable], + [m4_define([_LTDL_TYPE], [installable])]) +LT_OPTION_DEFINE([LTDL_INIT], [convenience], + [m4_define([_LTDL_TYPE], [convenience])]) diff --git a/m4/ltsugar.m4 b/m4/ltsugar.m4 new file mode 100644 index 0000000..48bc934 --- /dev/null +++ b/m4/ltsugar.m4 @@ -0,0 +1,124 @@ +# ltsugar.m4 -- libtool m4 base layer. -*-Autoconf-*- +# +# Copyright (C) 2004-2005, 2007-2008, 2011-2015 Free Software +# Foundation, Inc. +# Written by Gary V. Vaughan, 2004 +# +# This file is free software; the Free Software Foundation gives +# unlimited permission to copy and/or distribute it, with or without +# modifications, as long as this notice is preserved. + +# serial 6 ltsugar.m4 + +# This is to help aclocal find these macros, as it can't see m4_define. +AC_DEFUN([LTSUGAR_VERSION], [m4_if([0.1])]) + + +# lt_join(SEP, ARG1, [ARG2...]) +# ----------------------------- +# Produce ARG1SEPARG2...SEPARGn, omitting [] arguments and their +# associated separator. +# Needed until we can rely on m4_join from Autoconf 2.62, since all earlier +# versions in m4sugar had bugs. +m4_define([lt_join], +[m4_if([$#], [1], [], + [$#], [2], [[$2]], + [m4_if([$2], [], [], [[$2]_])$0([$1], m4_shift(m4_shift($@)))])]) +m4_define([_lt_join], +[m4_if([$#$2], [2], [], + [m4_if([$2], [], [], [[$1$2]])$0([$1], m4_shift(m4_shift($@)))])]) + + +# lt_car(LIST) +# lt_cdr(LIST) +# ------------ +# Manipulate m4 lists. +# These macros are necessary as long as will still need to support +# Autoconf-2.59, which quotes differently. +m4_define([lt_car], [[$1]]) +m4_define([lt_cdr], +[m4_if([$#], 0, [m4_fatal([$0: cannot be called without arguments])], + [$#], 1, [], + [m4_dquote(m4_shift($@))])]) +m4_define([lt_unquote], $1) + + +# lt_append(MACRO-NAME, STRING, [SEPARATOR]) +# ------------------------------------------ +# Redefine MACRO-NAME to hold its former content plus 'SEPARATOR''STRING'. +# Note that neither SEPARATOR nor STRING are expanded; they are appended +# to MACRO-NAME as is (leaving the expansion for when MACRO-NAME is invoked). +# No SEPARATOR is output if MACRO-NAME was previously undefined (different +# than defined and empty). +# +# This macro is needed until we can rely on Autoconf 2.62, since earlier +# versions of m4sugar mistakenly expanded SEPARATOR but not STRING. 
+m4_define([lt_append], +[m4_define([$1], + m4_ifdef([$1], [m4_defn([$1])[$3]])[$2])]) + + + +# lt_combine(SEP, PREFIX-LIST, INFIX, SUFFIX1, [SUFFIX2...]) +# ---------------------------------------------------------- +# Produce a SEP delimited list of all paired combinations of elements of +# PREFIX-LIST with SUFFIX1 through SUFFIXn. Each element of the list +# has the form PREFIXmINFIXSUFFIXn. +# Needed until we can rely on m4_combine added in Autoconf 2.62. +m4_define([lt_combine], +[m4_if(m4_eval([$# > 3]), [1], + [m4_pushdef([_Lt_sep], [m4_define([_Lt_sep], m4_defn([lt_car]))])]]dnl +[[m4_foreach([_Lt_prefix], [$2], + [m4_foreach([_Lt_suffix], + ]m4_dquote(m4_dquote(m4_shift(m4_shift(m4_shift($@)))))[, + [_Lt_sep([$1])[]m4_defn([_Lt_prefix])[$3]m4_defn([_Lt_suffix])])])])]) + + +# lt_if_append_uniq(MACRO-NAME, VARNAME, [SEPARATOR], [UNIQ], [NOT-UNIQ]) +# ----------------------------------------------------------------------- +# Iff MACRO-NAME does not yet contain VARNAME, then append it (delimited +# by SEPARATOR if supplied) and expand UNIQ, else NOT-UNIQ. +m4_define([lt_if_append_uniq], +[m4_ifdef([$1], + [m4_if(m4_index([$3]m4_defn([$1])[$3], [$3$2$3]), [-1], + [lt_append([$1], [$2], [$3])$4], + [$5])], + [lt_append([$1], [$2], [$3])$4])]) + + +# lt_dict_add(DICT, KEY, VALUE) +# ----------------------------- +m4_define([lt_dict_add], +[m4_define([$1($2)], [$3])]) + + +# lt_dict_add_subkey(DICT, KEY, SUBKEY, VALUE) +# -------------------------------------------- +m4_define([lt_dict_add_subkey], +[m4_define([$1($2:$3)], [$4])]) + + +# lt_dict_fetch(DICT, KEY, [SUBKEY]) +# ---------------------------------- +m4_define([lt_dict_fetch], +[m4_ifval([$3], + m4_ifdef([$1($2:$3)], [m4_defn([$1($2:$3)])]), + m4_ifdef([$1($2)], [m4_defn([$1($2)])]))]) + + +# lt_if_dict_fetch(DICT, KEY, [SUBKEY], VALUE, IF-TRUE, [IF-FALSE]) +# ----------------------------------------------------------------- +m4_define([lt_if_dict_fetch], +[m4_if(lt_dict_fetch([$1], [$2], [$3]), [$4], + [$5], + [$6])]) + + +# lt_dict_filter(DICT, [SUBKEY], VALUE, [SEPARATOR], KEY, [...]) +# -------------------------------------------------------------- +m4_define([lt_dict_filter], +[m4_if([$5], [], [], + [lt_join(m4_quote(m4_default([$4], [[, ]])), + lt_unquote(m4_split(m4_normalize(m4_foreach(_Lt_key, lt_car([m4_shiftn(4, $@)]), + [lt_if_dict_fetch([$1], _Lt_key, [$2], [$3], [_Lt_key ])])))))])[]dnl +]) diff --git a/m4/ltversion.m4 b/m4/ltversion.m4 new file mode 100644 index 0000000..fa04b52 --- /dev/null +++ b/m4/ltversion.m4 @@ -0,0 +1,23 @@ +# ltversion.m4 -- version numbers -*- Autoconf -*- +# +# Copyright (C) 2004, 2011-2015 Free Software Foundation, Inc. +# Written by Scott James Remnant, 2004 +# +# This file is free software; the Free Software Foundation gives +# unlimited permission to copy and/or distribute it, with or without +# modifications, as long as this notice is preserved. + +# @configure_input@ + +# serial 4179 ltversion.m4 +# This file is part of GNU Libtool + +m4_define([LT_PACKAGE_VERSION], [2.4.6]) +m4_define([LT_PACKAGE_REVISION], [2.4.6]) + +AC_DEFUN([LTVERSION_VERSION], +[macro_version='2.4.6' +macro_revision='2.4.6' +_LT_DECL(, macro_version, 0, [Which release of libtool.m4 was used?]) +_LT_DECL(, macro_revision, 0) +]) diff --git a/m4/lt~obsolete.m4 b/m4/lt~obsolete.m4 new file mode 100644 index 0000000..c6b26f8 --- /dev/null +++ b/m4/lt~obsolete.m4 @@ -0,0 +1,99 @@ +# lt~obsolete.m4 -- aclocal satisfying obsolete definitions. 
-*-Autoconf-*- +# +# Copyright (C) 2004-2005, 2007, 2009, 2011-2015 Free Software +# Foundation, Inc. +# Written by Scott James Remnant, 2004. +# +# This file is free software; the Free Software Foundation gives +# unlimited permission to copy and/or distribute it, with or without +# modifications, as long as this notice is preserved. + +# serial 5 lt~obsolete.m4 + +# These exist entirely to fool aclocal when bootstrapping libtool. +# +# In the past libtool.m4 has provided macros via AC_DEFUN (or AU_DEFUN), +# which have later been changed to m4_define as they aren't part of the +# exported API, or moved to Autoconf or Automake where they belong. +# +# The trouble is, aclocal is a bit thick. It'll see the old AC_DEFUN +# in /usr/share/aclocal/libtool.m4 and remember it, then when it sees us +# using a macro with the same name in our local m4/libtool.m4 it'll +# pull the old libtool.m4 in (it doesn't see our shiny new m4_define +# and doesn't know about Autoconf macros at all.) +# +# So we provide this file, which has a silly filename so it's always +# included after everything else. This provides aclocal with the +# AC_DEFUNs it wants, but when m4 processes it, it doesn't do anything +# because those macros already exist, or will be overwritten later. +# We use AC_DEFUN over AU_DEFUN for compatibility with aclocal-1.6. +# +# Anytime we withdraw an AC_DEFUN or AU_DEFUN, remember to add it here. +# Yes, that means every name once taken will need to remain here until +# we give up compatibility with versions before 1.7, at which point +# we need to keep only those names which we still refer to. + +# This is to help aclocal find these macros, as it can't see m4_define. +AC_DEFUN([LTOBSOLETE_VERSION], [m4_if([1])]) + +m4_ifndef([AC_LIBTOOL_LINKER_OPTION], [AC_DEFUN([AC_LIBTOOL_LINKER_OPTION])]) +m4_ifndef([AC_PROG_EGREP], [AC_DEFUN([AC_PROG_EGREP])]) +m4_ifndef([_LT_AC_PROG_ECHO_BACKSLASH], [AC_DEFUN([_LT_AC_PROG_ECHO_BACKSLASH])]) +m4_ifndef([_LT_AC_SHELL_INIT], [AC_DEFUN([_LT_AC_SHELL_INIT])]) +m4_ifndef([_LT_AC_SYS_LIBPATH_AIX], [AC_DEFUN([_LT_AC_SYS_LIBPATH_AIX])]) +m4_ifndef([_LT_PROG_LTMAIN], [AC_DEFUN([_LT_PROG_LTMAIN])]) +m4_ifndef([_LT_AC_TAGVAR], [AC_DEFUN([_LT_AC_TAGVAR])]) +m4_ifndef([AC_LTDL_ENABLE_INSTALL], [AC_DEFUN([AC_LTDL_ENABLE_INSTALL])]) +m4_ifndef([AC_LTDL_PREOPEN], [AC_DEFUN([AC_LTDL_PREOPEN])]) +m4_ifndef([_LT_AC_SYS_COMPILER], [AC_DEFUN([_LT_AC_SYS_COMPILER])]) +m4_ifndef([_LT_AC_LOCK], [AC_DEFUN([_LT_AC_LOCK])]) +m4_ifndef([AC_LIBTOOL_SYS_OLD_ARCHIVE], [AC_DEFUN([AC_LIBTOOL_SYS_OLD_ARCHIVE])]) +m4_ifndef([_LT_AC_TRY_DLOPEN_SELF], [AC_DEFUN([_LT_AC_TRY_DLOPEN_SELF])]) +m4_ifndef([AC_LIBTOOL_PROG_CC_C_O], [AC_DEFUN([AC_LIBTOOL_PROG_CC_C_O])]) +m4_ifndef([AC_LIBTOOL_SYS_HARD_LINK_LOCKS], [AC_DEFUN([AC_LIBTOOL_SYS_HARD_LINK_LOCKS])]) +m4_ifndef([AC_LIBTOOL_OBJDIR], [AC_DEFUN([AC_LIBTOOL_OBJDIR])]) +m4_ifndef([AC_LTDL_OBJDIR], [AC_DEFUN([AC_LTDL_OBJDIR])]) +m4_ifndef([AC_LIBTOOL_PROG_LD_HARDCODE_LIBPATH], [AC_DEFUN([AC_LIBTOOL_PROG_LD_HARDCODE_LIBPATH])]) +m4_ifndef([AC_LIBTOOL_SYS_LIB_STRIP], [AC_DEFUN([AC_LIBTOOL_SYS_LIB_STRIP])]) +m4_ifndef([AC_PATH_MAGIC], [AC_DEFUN([AC_PATH_MAGIC])]) +m4_ifndef([AC_PROG_LD_GNU], [AC_DEFUN([AC_PROG_LD_GNU])]) +m4_ifndef([AC_PROG_LD_RELOAD_FLAG], [AC_DEFUN([AC_PROG_LD_RELOAD_FLAG])]) +m4_ifndef([AC_DEPLIBS_CHECK_METHOD], [AC_DEFUN([AC_DEPLIBS_CHECK_METHOD])]) +m4_ifndef([AC_LIBTOOL_PROG_COMPILER_NO_RTTI], [AC_DEFUN([AC_LIBTOOL_PROG_COMPILER_NO_RTTI])]) +m4_ifndef([AC_LIBTOOL_SYS_GLOBAL_SYMBOL_PIPE], 
[AC_DEFUN([AC_LIBTOOL_SYS_GLOBAL_SYMBOL_PIPE])]) +m4_ifndef([AC_LIBTOOL_PROG_COMPILER_PIC], [AC_DEFUN([AC_LIBTOOL_PROG_COMPILER_PIC])]) +m4_ifndef([AC_LIBTOOL_PROG_LD_SHLIBS], [AC_DEFUN([AC_LIBTOOL_PROG_LD_SHLIBS])]) +m4_ifndef([AC_LIBTOOL_POSTDEP_PREDEP], [AC_DEFUN([AC_LIBTOOL_POSTDEP_PREDEP])]) +m4_ifndef([LT_AC_PROG_EGREP], [AC_DEFUN([LT_AC_PROG_EGREP])]) +m4_ifndef([LT_AC_PROG_SED], [AC_DEFUN([LT_AC_PROG_SED])]) +m4_ifndef([_LT_CC_BASENAME], [AC_DEFUN([_LT_CC_BASENAME])]) +m4_ifndef([_LT_COMPILER_BOILERPLATE], [AC_DEFUN([_LT_COMPILER_BOILERPLATE])]) +m4_ifndef([_LT_LINKER_BOILERPLATE], [AC_DEFUN([_LT_LINKER_BOILERPLATE])]) +m4_ifndef([_AC_PROG_LIBTOOL], [AC_DEFUN([_AC_PROG_LIBTOOL])]) +m4_ifndef([AC_LIBTOOL_SETUP], [AC_DEFUN([AC_LIBTOOL_SETUP])]) +m4_ifndef([_LT_AC_CHECK_DLFCN], [AC_DEFUN([_LT_AC_CHECK_DLFCN])]) +m4_ifndef([AC_LIBTOOL_SYS_DYNAMIC_LINKER], [AC_DEFUN([AC_LIBTOOL_SYS_DYNAMIC_LINKER])]) +m4_ifndef([_LT_AC_TAGCONFIG], [AC_DEFUN([_LT_AC_TAGCONFIG])]) +m4_ifndef([AC_DISABLE_FAST_INSTALL], [AC_DEFUN([AC_DISABLE_FAST_INSTALL])]) +m4_ifndef([_LT_AC_LANG_CXX], [AC_DEFUN([_LT_AC_LANG_CXX])]) +m4_ifndef([_LT_AC_LANG_F77], [AC_DEFUN([_LT_AC_LANG_F77])]) +m4_ifndef([_LT_AC_LANG_GCJ], [AC_DEFUN([_LT_AC_LANG_GCJ])]) +m4_ifndef([AC_LIBTOOL_LANG_C_CONFIG], [AC_DEFUN([AC_LIBTOOL_LANG_C_CONFIG])]) +m4_ifndef([_LT_AC_LANG_C_CONFIG], [AC_DEFUN([_LT_AC_LANG_C_CONFIG])]) +m4_ifndef([AC_LIBTOOL_LANG_CXX_CONFIG], [AC_DEFUN([AC_LIBTOOL_LANG_CXX_CONFIG])]) +m4_ifndef([_LT_AC_LANG_CXX_CONFIG], [AC_DEFUN([_LT_AC_LANG_CXX_CONFIG])]) +m4_ifndef([AC_LIBTOOL_LANG_F77_CONFIG], [AC_DEFUN([AC_LIBTOOL_LANG_F77_CONFIG])]) +m4_ifndef([_LT_AC_LANG_F77_CONFIG], [AC_DEFUN([_LT_AC_LANG_F77_CONFIG])]) +m4_ifndef([AC_LIBTOOL_LANG_GCJ_CONFIG], [AC_DEFUN([AC_LIBTOOL_LANG_GCJ_CONFIG])]) +m4_ifndef([_LT_AC_LANG_GCJ_CONFIG], [AC_DEFUN([_LT_AC_LANG_GCJ_CONFIG])]) +m4_ifndef([AC_LIBTOOL_LANG_RC_CONFIG], [AC_DEFUN([AC_LIBTOOL_LANG_RC_CONFIG])]) +m4_ifndef([_LT_AC_LANG_RC_CONFIG], [AC_DEFUN([_LT_AC_LANG_RC_CONFIG])]) +m4_ifndef([AC_LIBTOOL_CONFIG], [AC_DEFUN([AC_LIBTOOL_CONFIG])]) +m4_ifndef([_LT_AC_FILE_LTDLL_C], [AC_DEFUN([_LT_AC_FILE_LTDLL_C])]) +m4_ifndef([_LT_REQUIRED_DARWIN_CHECKS], [AC_DEFUN([_LT_REQUIRED_DARWIN_CHECKS])]) +m4_ifndef([_LT_AC_PROG_CXXCPP], [AC_DEFUN([_LT_AC_PROG_CXXCPP])]) +m4_ifndef([_LT_PREPARE_SED_QUOTE_VARS], [AC_DEFUN([_LT_PREPARE_SED_QUOTE_VARS])]) +m4_ifndef([_LT_PROG_ECHO_BACKSLASH], [AC_DEFUN([_LT_PROG_ECHO_BACKSLASH])]) +m4_ifndef([_LT_PROG_F77], [AC_DEFUN([_LT_PROG_F77])]) +m4_ifndef([_LT_PROG_FC], [AC_DEFUN([_LT_PROG_FC])]) +m4_ifndef([_LT_PROG_CXX], [AC_DEFUN([_LT_PROG_CXX])]) diff --git a/m4/pkg.m4 b/m4/pkg.m4 new file mode 100644 index 0000000..82bea96 --- /dev/null +++ b/m4/pkg.m4 @@ -0,0 +1,275 @@ +dnl pkg.m4 - Macros to locate and utilise pkg-config. -*- Autoconf -*- +dnl serial 11 (pkg-config-0.29.1) +dnl +dnl Copyright © 2004 Scott James Remnant . +dnl Copyright © 2012-2015 Dan Nicholson +dnl +dnl This program is free software; you can redistribute it and/or modify +dnl it under the terms of the GNU General Public License as published by +dnl the Free Software Foundation; either version 2 of the License, or +dnl (at your option) any later version. +dnl +dnl This program is distributed in the hope that it will be useful, but +dnl WITHOUT ANY WARRANTY; without even the implied warranty of +dnl MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +dnl General Public License for more details. 
+dnl +dnl You should have received a copy of the GNU General Public License +dnl along with this program; if not, write to the Free Software +dnl Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA +dnl 02111-1307, USA. +dnl +dnl As a special exception to the GNU General Public License, if you +dnl distribute this file as part of a program that contains a +dnl configuration script generated by Autoconf, you may include it under +dnl the same distribution terms that you use for the rest of that +dnl program. + +dnl PKG_PREREQ(MIN-VERSION) +dnl ----------------------- +dnl Since: 0.29 +dnl +dnl Verify that the version of the pkg-config macros are at least +dnl MIN-VERSION. Unlike PKG_PROG_PKG_CONFIG, which checks the user's +dnl installed version of pkg-config, this checks the developer's version +dnl of pkg.m4 when generating configure. +dnl +dnl To ensure that this macro is defined, also add: +dnl m4_ifndef([PKG_PREREQ], +dnl [m4_fatal([must install pkg-config 0.29 or later before running autoconf/autogen])]) +dnl +dnl See the "Since" comment for each macro you use to see what version +dnl of the macros you require. +m4_defun([PKG_PREREQ], +[m4_define([PKG_MACROS_VERSION], [0.29.1]) +m4_if(m4_version_compare(PKG_MACROS_VERSION, [$1]), -1, + [m4_fatal([pkg.m4 version $1 or higher is required but ]PKG_MACROS_VERSION[ found])]) +])dnl PKG_PREREQ + +dnl PKG_PROG_PKG_CONFIG([MIN-VERSION]) +dnl ---------------------------------- +dnl Since: 0.16 +dnl +dnl Search for the pkg-config tool and set the PKG_CONFIG variable to +dnl first found in the path. Checks that the version of pkg-config found +dnl is at least MIN-VERSION. If MIN-VERSION is not specified, 0.9.0 is +dnl used since that's the first version where most current features of +dnl pkg-config existed. +AC_DEFUN([PKG_PROG_PKG_CONFIG], +[m4_pattern_forbid([^_?PKG_[A-Z_]+$]) +m4_pattern_allow([^PKG_CONFIG(_(PATH|LIBDIR|SYSROOT_DIR|ALLOW_SYSTEM_(CFLAGS|LIBS)))?$]) +m4_pattern_allow([^PKG_CONFIG_(DISABLE_UNINSTALLED|TOP_BUILD_DIR|DEBUG_SPEW)$]) +AC_ARG_VAR([PKG_CONFIG], [path to pkg-config utility]) +AC_ARG_VAR([PKG_CONFIG_PATH], [directories to add to pkg-config's search path]) +AC_ARG_VAR([PKG_CONFIG_LIBDIR], [path overriding pkg-config's built-in search path]) + +if test "x$ac_cv_env_PKG_CONFIG_set" != "xset"; then + AC_PATH_TOOL([PKG_CONFIG], [pkg-config]) +fi +if test -n "$PKG_CONFIG"; then + _pkg_min_version=m4_default([$1], [0.9.0]) + AC_MSG_CHECKING([pkg-config is at least version $_pkg_min_version]) + if $PKG_CONFIG --atleast-pkgconfig-version $_pkg_min_version; then + AC_MSG_RESULT([yes]) + else + AC_MSG_RESULT([no]) + PKG_CONFIG="" + fi +fi[]dnl +])dnl PKG_PROG_PKG_CONFIG + +dnl PKG_CHECK_EXISTS(MODULES, [ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND]) +dnl ------------------------------------------------------------------- +dnl Since: 0.18 +dnl +dnl Check to see whether a particular set of modules exists. Similar to +dnl PKG_CHECK_MODULES(), but does not set variables or print errors. 
+dnl +dnl Please remember that m4 expands AC_REQUIRE([PKG_PROG_PKG_CONFIG]) +dnl only at the first occurence in configure.ac, so if the first place +dnl it's called might be skipped (such as if it is within an "if", you +dnl have to call PKG_CHECK_EXISTS manually +AC_DEFUN([PKG_CHECK_EXISTS], +[AC_REQUIRE([PKG_PROG_PKG_CONFIG])dnl +if test -n "$PKG_CONFIG" && \ + AC_RUN_LOG([$PKG_CONFIG --exists --print-errors "$1"]); then + m4_default([$2], [:]) +m4_ifvaln([$3], [else + $3])dnl +fi]) + +dnl _PKG_CONFIG([VARIABLE], [COMMAND], [MODULES]) +dnl --------------------------------------------- +dnl Internal wrapper calling pkg-config via PKG_CONFIG and setting +dnl pkg_failed based on the result. +m4_define([_PKG_CONFIG], +[if test -n "$$1"; then + pkg_cv_[]$1="$$1" + elif test -n "$PKG_CONFIG"; then + PKG_CHECK_EXISTS([$3], + [pkg_cv_[]$1=`$PKG_CONFIG --[]$2 "$3" 2>/dev/null` + test "x$?" != "x0" && pkg_failed=yes ], + [pkg_failed=yes]) + else + pkg_failed=untried +fi[]dnl +])dnl _PKG_CONFIG + +dnl _PKG_SHORT_ERRORS_SUPPORTED +dnl --------------------------- +dnl Internal check to see if pkg-config supports short errors. +AC_DEFUN([_PKG_SHORT_ERRORS_SUPPORTED], +[AC_REQUIRE([PKG_PROG_PKG_CONFIG]) +if $PKG_CONFIG --atleast-pkgconfig-version 0.20; then + _pkg_short_errors_supported=yes +else + _pkg_short_errors_supported=no +fi[]dnl +])dnl _PKG_SHORT_ERRORS_SUPPORTED + + +dnl PKG_CHECK_MODULES(VARIABLE-PREFIX, MODULES, [ACTION-IF-FOUND], +dnl [ACTION-IF-NOT-FOUND]) +dnl -------------------------------------------------------------- +dnl Since: 0.4.0 +dnl +dnl Note that if there is a possibility the first call to +dnl PKG_CHECK_MODULES might not happen, you should be sure to include an +dnl explicit call to PKG_PROG_PKG_CONFIG in your configure.ac +AC_DEFUN([PKG_CHECK_MODULES], +[AC_REQUIRE([PKG_PROG_PKG_CONFIG])dnl +AC_ARG_VAR([$1][_CFLAGS], [C compiler flags for $1, overriding pkg-config])dnl +AC_ARG_VAR([$1][_LIBS], [linker flags for $1, overriding pkg-config])dnl + +pkg_failed=no +AC_MSG_CHECKING([for $1]) + +_PKG_CONFIG([$1][_CFLAGS], [cflags], [$2]) +_PKG_CONFIG([$1][_LIBS], [libs], [$2]) + +m4_define([_PKG_TEXT], [Alternatively, you may set the environment variables $1[]_CFLAGS +and $1[]_LIBS to avoid the need to call pkg-config. +See the pkg-config man page for more details.]) + +if test $pkg_failed = yes; then + AC_MSG_RESULT([no]) + _PKG_SHORT_ERRORS_SUPPORTED + if test $_pkg_short_errors_supported = yes; then + $1[]_PKG_ERRORS=`$PKG_CONFIG --short-errors --print-errors --cflags --libs "$2" 2>&1` + else + $1[]_PKG_ERRORS=`$PKG_CONFIG --print-errors --cflags --libs "$2" 2>&1` + fi + # Put the nasty error message in config.log where it belongs + echo "$$1[]_PKG_ERRORS" >&AS_MESSAGE_LOG_FD + + m4_default([$4], [AC_MSG_ERROR( +[Package requirements ($2) were not met: + +$$1_PKG_ERRORS + +Consider adjusting the PKG_CONFIG_PATH environment variable if you +installed software in a non-standard prefix. + +_PKG_TEXT])[]dnl + ]) +elif test $pkg_failed = untried; then + AC_MSG_RESULT([no]) + m4_default([$4], [AC_MSG_FAILURE( +[The pkg-config script could not be found or is too old. Make sure it +is in your PATH or set the PKG_CONFIG environment variable to the full +path to pkg-config. 
+ +_PKG_TEXT + +To get pkg-config, see .])[]dnl + ]) +else + $1[]_CFLAGS=$pkg_cv_[]$1[]_CFLAGS + $1[]_LIBS=$pkg_cv_[]$1[]_LIBS + AC_MSG_RESULT([yes]) + $3 +fi[]dnl +])dnl PKG_CHECK_MODULES + + +dnl PKG_CHECK_MODULES_STATIC(VARIABLE-PREFIX, MODULES, [ACTION-IF-FOUND], +dnl [ACTION-IF-NOT-FOUND]) +dnl --------------------------------------------------------------------- +dnl Since: 0.29 +dnl +dnl Checks for existence of MODULES and gathers its build flags with +dnl static libraries enabled. Sets VARIABLE-PREFIX_CFLAGS from --cflags +dnl and VARIABLE-PREFIX_LIBS from --libs. +dnl +dnl Note that if there is a possibility the first call to +dnl PKG_CHECK_MODULES_STATIC might not happen, you should be sure to +dnl include an explicit call to PKG_PROG_PKG_CONFIG in your +dnl configure.ac. +AC_DEFUN([PKG_CHECK_MODULES_STATIC], +[AC_REQUIRE([PKG_PROG_PKG_CONFIG])dnl +_save_PKG_CONFIG=$PKG_CONFIG +PKG_CONFIG="$PKG_CONFIG --static" +PKG_CHECK_MODULES($@) +PKG_CONFIG=$_save_PKG_CONFIG[]dnl +])dnl PKG_CHECK_MODULES_STATIC + + +dnl PKG_INSTALLDIR([DIRECTORY]) +dnl ------------------------- +dnl Since: 0.27 +dnl +dnl Substitutes the variable pkgconfigdir as the location where a module +dnl should install pkg-config .pc files. By default the directory is +dnl $libdir/pkgconfig, but the default can be changed by passing +dnl DIRECTORY. The user can override through the --with-pkgconfigdir +dnl parameter. +AC_DEFUN([PKG_INSTALLDIR], +[m4_pushdef([pkg_default], [m4_default([$1], ['${libdir}/pkgconfig'])]) +m4_pushdef([pkg_description], + [pkg-config installation directory @<:@]pkg_default[@:>@]) +AC_ARG_WITH([pkgconfigdir], + [AS_HELP_STRING([--with-pkgconfigdir], pkg_description)],, + [with_pkgconfigdir=]pkg_default) +AC_SUBST([pkgconfigdir], [$with_pkgconfigdir]) +m4_popdef([pkg_default]) +m4_popdef([pkg_description]) +])dnl PKG_INSTALLDIR + + +dnl PKG_NOARCH_INSTALLDIR([DIRECTORY]) +dnl -------------------------------- +dnl Since: 0.27 +dnl +dnl Substitutes the variable noarch_pkgconfigdir as the location where a +dnl module should install arch-independent pkg-config .pc files. By +dnl default the directory is $datadir/pkgconfig, but the default can be +dnl changed by passing DIRECTORY. The user can override through the +dnl --with-noarch-pkgconfigdir parameter. +AC_DEFUN([PKG_NOARCH_INSTALLDIR], +[m4_pushdef([pkg_default], [m4_default([$1], ['${datadir}/pkgconfig'])]) +m4_pushdef([pkg_description], + [pkg-config arch-independent installation directory @<:@]pkg_default[@:>@]) +AC_ARG_WITH([noarch-pkgconfigdir], + [AS_HELP_STRING([--with-noarch-pkgconfigdir], pkg_description)],, + [with_noarch_pkgconfigdir=]pkg_default) +AC_SUBST([noarch_pkgconfigdir], [$with_noarch_pkgconfigdir]) +m4_popdef([pkg_default]) +m4_popdef([pkg_description]) +])dnl PKG_NOARCH_INSTALLDIR + + +dnl PKG_CHECK_VAR(VARIABLE, MODULE, CONFIG-VARIABLE, +dnl [ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND]) +dnl ------------------------------------------- +dnl Since: 0.28 +dnl +dnl Retrieves the value of the pkg-config variable for the given module. +AC_DEFUN([PKG_CHECK_VAR], +[AC_REQUIRE([PKG_PROG_PKG_CONFIG])dnl +AC_ARG_VAR([$1], [value of $3 for $2, overriding pkg-config])dnl + +_PKG_CONFIG([$1], [variable="][$3]["], [$2]) +AS_VAR_COPY([$1], [pkg_cv_][$1]) + +AS_VAR_IF([$1], [""], [$5], [$4])dnl +])dnl PKG_CHECK_VAR diff --git a/missing b/missing new file mode 100755 index 0000000..f62bbae --- /dev/null +++ b/missing @@ -0,0 +1,215 @@ +#! /bin/sh +# Common wrapper for a few potentially missing GNU programs. 
+ +scriptversion=2013-10-28.13; # UTC + +# Copyright (C) 1996-2014 Free Software Foundation, Inc. +# Originally written by Fran,cois Pinard , 1996. + +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 2, or (at your option) +# any later version. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +# As a special exception to the GNU General Public License, if you +# distribute this file as part of a program that contains a +# configuration script generated by Autoconf, you may include it under +# the same distribution terms that you use for the rest of that program. + +if test $# -eq 0; then + echo 1>&2 "Try '$0 --help' for more information" + exit 1 +fi + +case $1 in + + --is-lightweight) + # Used by our autoconf macros to check whether the available missing + # script is modern enough. + exit 0 + ;; + + --run) + # Back-compat with the calling convention used by older automake. + shift + ;; + + -h|--h|--he|--hel|--help) + echo "\ +$0 [OPTION]... PROGRAM [ARGUMENT]... + +Run 'PROGRAM [ARGUMENT]...', returning a proper advice when this fails due +to PROGRAM being missing or too old. + +Options: + -h, --help display this help and exit + -v, --version output version information and exit + +Supported PROGRAM values: + aclocal autoconf autoheader autom4te automake makeinfo + bison yacc flex lex help2man + +Version suffixes to PROGRAM as well as the prefixes 'gnu-', 'gnu', and +'g' are ignored when checking the name. + +Send bug reports to ." + exit $? + ;; + + -v|--v|--ve|--ver|--vers|--versi|--versio|--version) + echo "missing $scriptversion (GNU Automake)" + exit $? + ;; + + -*) + echo 1>&2 "$0: unknown '$1' option" + echo 1>&2 "Try '$0 --help' for more information" + exit 1 + ;; + +esac + +# Run the given program, remember its exit status. +"$@"; st=$? + +# If it succeeded, we are done. +test $st -eq 0 && exit 0 + +# Also exit now if we it failed (or wasn't found), and '--version' was +# passed; such an option is passed most likely to detect whether the +# program is present and works. +case $2 in --version|--help) exit $st;; esac + +# Exit code 63 means version mismatch. This often happens when the user +# tries to use an ancient version of a tool on a file that requires a +# minimum version. +if test $st -eq 63; then + msg="probably too old" +elif test $st -eq 127; then + # Program was missing. + msg="missing on your system" +else + # Program was found and executed, but failed. Give up. 
+ exit $st +fi + +perl_URL=http://www.perl.org/ +flex_URL=http://flex.sourceforge.net/ +gnu_software_URL=http://www.gnu.org/software + +program_details () +{ + case $1 in + aclocal|automake) + echo "The '$1' program is part of the GNU Automake package:" + echo "<$gnu_software_URL/automake>" + echo "It also requires GNU Autoconf, GNU m4 and Perl in order to run:" + echo "<$gnu_software_URL/autoconf>" + echo "<$gnu_software_URL/m4/>" + echo "<$perl_URL>" + ;; + autoconf|autom4te|autoheader) + echo "The '$1' program is part of the GNU Autoconf package:" + echo "<$gnu_software_URL/autoconf/>" + echo "It also requires GNU m4 and Perl in order to run:" + echo "<$gnu_software_URL/m4/>" + echo "<$perl_URL>" + ;; + esac +} + +give_advice () +{ + # Normalize program name to check for. + normalized_program=`echo "$1" | sed ' + s/^gnu-//; t + s/^gnu//; t + s/^g//; t'` + + printf '%s\n' "'$1' is $msg." + + configure_deps="'configure.ac' or m4 files included by 'configure.ac'" + case $normalized_program in + autoconf*) + echo "You should only need it if you modified 'configure.ac'," + echo "or m4 files included by it." + program_details 'autoconf' + ;; + autoheader*) + echo "You should only need it if you modified 'acconfig.h' or" + echo "$configure_deps." + program_details 'autoheader' + ;; + automake*) + echo "You should only need it if you modified 'Makefile.am' or" + echo "$configure_deps." + program_details 'automake' + ;; + aclocal*) + echo "You should only need it if you modified 'acinclude.m4' or" + echo "$configure_deps." + program_details 'aclocal' + ;; + autom4te*) + echo "You might have modified some maintainer files that require" + echo "the 'autom4te' program to be rebuilt." + program_details 'autom4te' + ;; + bison*|yacc*) + echo "You should only need it if you modified a '.y' file." + echo "You may want to install the GNU Bison package:" + echo "<$gnu_software_URL/bison/>" + ;; + lex*|flex*) + echo "You should only need it if you modified a '.l' file." + echo "You may want to install the Fast Lexical Analyzer package:" + echo "<$flex_URL>" + ;; + help2man*) + echo "You should only need it if you modified a dependency" \ + "of a man page." + echo "You may want to install the GNU Help2man package:" + echo "<$gnu_software_URL/help2man/>" + ;; + makeinfo*) + echo "You should only need it if you modified a '.texi' file, or" + echo "any other file indirectly affecting the aspect of the manual." + echo "You might want to install the Texinfo package:" + echo "<$gnu_software_URL/texinfo/>" + echo "The spurious makeinfo call might also be the consequence of" + echo "using a buggy 'make' (AIX, DU, IRIX), in which case you might" + echo "want to install GNU make:" + echo "<$gnu_software_URL/make/>" + ;; + *) + echo "You might have modified some files without having the proper" + echo "tools for further handling them. Check the 'README' file, it" + echo "often tells you about the needed prerequisites for installing" + echo "this package. You may also peek at any GNU archive site, in" + echo "case some other package contains this missing '$1' program." + ;; + esac +} + +give_advice "$1" | sed -e '1s/^/WARNING: /' \ + -e '2,$s/^/ /' >&2 + +# Propagate the correct exit status (expected to be 127 for a program +# not found, 63 for a program that failed due to version mismatch). 
+exit $st + +# Local variables: +# eval: (add-hook 'write-file-hooks 'time-stamp) +# time-stamp-start: "scriptversion=" +# time-stamp-format: "%:y-%02m-%02d.%02H" +# time-stamp-time-zone: "UTC" +# time-stamp-end: "; # UTC" +# End: diff --git a/rulebases/cisco.rulebase b/rulebases/cisco.rulebase new file mode 100644 index 0000000..4ef2aa9 --- /dev/null +++ b/rulebases/cisco.rulebase @@ -0,0 +1,10 @@ +prefix=%date:date-rfc3164% %host:word% %seqnum:number%: %othseq:char-to:\x3a%: %%%tag:char-to:\x3a%: +rule=: Configured from console by %tty:word:% (%ip:ipv4%) +rule=: Authentication failure for %proto:word% req from host %ip:ipv4% +rule=: Interface %interface:char-to:,%, changed state to %state:word% +rule=: Line protocol on Interface %interface:char-to:,%, changed state to %state:word% +rule=: Attempted to connect to %servname:word% from %ip:ipv4% +# too-generic syntaces (like %port:word% below) cause problems. +# Best is to have very specific syntaxes, but as an +# interim solution we may need to backtrack if there is no other way to handle it. +#: %port:word% transmit error diff --git a/rulebases/messages.rulebase b/rulebases/messages.rulebase new file mode 100644 index 0000000..c6929d5 --- /dev/null +++ b/rulebases/messages.rulebase @@ -0,0 +1,9 @@ +prefix=%date:date-rfc3164% %host:word% %tag:char-to:\x3a%: +rule=: restart. +rule=: Bad line received from identity server at %ip:ipv4%: %port:number% +rule=: FTP session closed +rule=: wu-ftpd - TLS settings: control %wuftp-control:char-to:,%, client_cert %wuftp-clcert:char-to:,%, data %wuftp-allow:word% +rule=: User %user:word% timed out after %timeout:number% seconds at %otherdatesyntax:word% %otherdate:date-rfc3164% %otheryear:word% +rule=: getpeername (in.ftpd): Transport endpoint is not connected +# the one below is problematic (and needs some backtracking) +#: %disk:char-to:\x3a%: timeout waiting for DMA diff --git a/rulebases/sample.rulebase b/rulebases/sample.rulebase new file mode 100644 index 0000000..883a776 --- /dev/null +++ b/rulebases/sample.rulebase @@ -0,0 +1,70 @@ +# Some sample rules and strings matching them + +# Prefix sample: +# myhostname: code=23 +prefix=%host:char-to:\x3a%: +rule=prefixed_code:code=%code:number% +# myhostname: name=somename +rule=prefixed_name:name=%name:word% +# Reset prefix to default (empty value): +prefix= + +# Quantity: 555 +rule=tag1:Quantity: %N:number% + +# Weight: 42kg +rule=tag2:Weight: %N:number%%unit:word% +annotate=tag2:+fat="free" + +# %% +rule=tag3,percent:\x25%% +annotate=percent:+percent="100" +annotate=tag3:+whole="whale" +annotate=tag3:+part="wha" + +# literal +rule=tag4,tag5,tag6,tag4:literal +annotate=tag4:+this="that" + +# first field,second field,third field,fourth field +rule=csv:%r1:char-to:,%,%r2:char-to:,%,%r3:char-to:,%,%r4:rest% + +# CSV: field1,,field3 +rule=better-csv:CSV: %f1:char-sep:,%,%f2:char-sep:,%,%f3:char-sep:,% + +# Snow White and the Seven Dwarfs +rule=tale:Snow White and %company:rest% + +# iptables: SRC=192.168.1.134 DST=46.252.161.13 LEN=48 TOS=0x00 PREC=0x00 +rule=ipt:iptables: %dummy:iptables% + +# 2012-10-11 src=127.0.0.1 dst=88.111.222.19 +rule=:%date:date-iso% src=%src:ipv4% dst=%dst:ipv4% + +# Oct 29 09:47:08 server rsyslogd: rsyslogd's groupid changed to 103 +rule=syslog:%date1:date-rfc3164% %host:word% %tag:char-to:\x3a%: %text:rest% + +# Oct 29 09:47:08 +rule=rfc3164:%date1:date-rfc3164% + +# 1985-04-12T19:20:50.52-04:00 +rule=rfc5424:%date1:date-rfc5424% + +# 1985-04-12T19:20:50.52-04:00 testing 123 +rule=rfc5424:%date1:date-rfc5424% %test:word% 
%test2:number%
+
+# quoted_string="Contents of a quoted string cannot include quote marks"
+rule=quote:quoted_string=%quote:quoted-string%
+
+# tokenized words: aaa.org; bbb.com; ccc.net
+rule=tokenized_words:tokenized words: %arr:tokenized:; :char-sep:\x3b%
+
+# tokenized regex: aaa.org; bbb.com; ccc.net
+rule=tokenized_regex:tokenized regex: %arr:tokenized:; :regex:[^; ]+%
+
+# regex: abcdef
+rule=regex:regex: %token:regex:abc.ef%
+
+# host451
+# generates { basename:"host", hostid:451 }
+rule=:%basename:alpha%%hostid:number%
diff --git a/rulebases/syntax.txt b/rulebases/syntax.txt
new file mode 100644
index 0000000..d24637c
--- /dev/null
+++ b/rulebases/syntax.txt
@@ -0,0 +1,144 @@
+WARNING
+=======
+
+This file is somewhat obsolete; for current information, look at the doc/
+directory.
+
+Basic syntax
+============
+
+Each line in a rulebase file is evaluated separately.
+Lines starting with '#' are comments.
+Empty lines are simply skipped; they can be inserted for readability.
+If the line starts with 'rule=', then it contains a rule. This line has the
+following format:
+
+ rule=[<tag1>[,<tag2>...]]:<match description>
+
+Everything before the colon is treated as a comma-separated list of tags, which
+will be attached to a match. After the colon, the match description should be
+given. It consists of string literals and field selectors. String literals
+should match exactly. A field selector has this format:
+
+ %<field name>:<field type>[:<extra data>]%
+
+Percent signs are used to enclose a field selector. If you need to match a literal
+'%', it can be written as '%%' or '\x25'.
+
+The behaviour of a field selector depends on its type, as described below.
+
+If the field name is set to '-', the field is matched but not saved.
+
+Several rules can have a common prefix. You can set it once with this syntax:
+
+ prefix=<prefix match description>
+
+Every following rule will be treated as an addition to this prefix.
+
+The prefix can be reset to its default (empty value) by the line:
+
+ prefix=
+
+Tags of the matched rule are attached to the message and can be used to
+annotate it. Annotation allows fixed fields to be added to the message.
+The syntax is as follows:
+
+ annotate=<tag>:+<field name>="<field value>"
+
+The field value should always be enclosed in double quote marks.
+
+There can be multiple annotations for the same tag.
+
+Field types
+===========
+
+Field type: 'number'
+Matches: One or more decimal digits.
+Extra data: Not used
+Example: %field_name:number%
+
+Field type: 'word'
+Matches: One or more characters, up to the next space (\x20), or
+ up to the end of the line.
+Extra data: Not used
+Example: %field_name:word%
+
+Field type: 'alpha'
+Matches: One or more alphabetic characters, up to the next
+ whitespace, punctuation, decimal digit or control character.
+Extra data: Not used
+Example: %field_name:alpha%
+
+Field type: 'char-to'
+Matches: One or more characters, up to the next character given in
+ extra data.
+Extra data: One character (can be escaped)
+Example: %field_name:char-to:,%
+ %field_name:char-to:\x25%
+
+Field type: 'char-sep'
+Matches: Zero or more characters, up to the next character given in
+ extra data, or up to the end of the line.
+Extra data: One character (can be escaped)
+Example: %field_name:char-sep:,%
+ %field_name:char-sep:\x25%
+
+Field type: 'rest'
+Matches: Zero or more characters, up to the end of the line.
+Extra data: Not used
+Example: %field_name:rest%
+Notes: Should always be at the end of the rule.
+
+Field type: 'quoted-string'
+Matches: Zero or more characters, surrounded by double quote marks.
+Extra data: Not used
+Example: %field_name:quoted-string%
+Notes: Quote marks are stripped from the match.
+ +Field type: 'date-iso' +Matches: Date of format 'YYYY-MM-DD'. +Extra data: Not used +Example: %field-name:date-iso% + +Field type: 'time-24hr' +Matches: Time of format 'HH:MM:SS', where HH is 00..23. +Extra data: Not used +Example: %field_name:time-24hr% + +Field type: 'time-12hr' +Matches: Time of format 'HH:MM:SS', where HH is 00..12. +Extra data: Not used +Example: %field_name:time-12hr% + +Field type: 'ipv4' +Matches: IPv4 address, in dot-decimal notation (AAA.BBB.CCC.DDD). +Extra data: Not used +Example: %field_name:ipv4% + +Field type: 'date-rfc3164' +Matches: Valid date/time in RFC3164 format, i.e.: 'Oct 29 09:47:08' +Extra data: Not used +Example: %field_name:date-rfc3164% +Notes: This parser implements several quirks to match malformed + timestamps from some devices. + +Field type: 'date-rfc5424' +Matches: Valid date/time in RFC5424 format, i.e.: + '1985-04-12T19:20:50.52-04:00' +Extra data: Not used +Example: %field_name:date-rfc5424% +Notes: Slightly different formats are allowed. + +Field type: 'iptables' +Matches: Name=value pairs, separated by spaces, as in Netfilter log + messages. +Extra data: Not used +Example: %-:iptables% +Notes: Name of the selector is not used; names from the line are + used instead. This selector always matches everything till + end of the line. Cannot match zero characters. + +Examples +======== + +Look at sample.rulebase for example rules and matching lines. diff --git a/src/Makefile.am b/src/Makefile.am new file mode 100644 index 0000000..397e2d8 --- /dev/null +++ b/src/Makefile.am @@ -0,0 +1,70 @@ +# Uncomment for debugging +DEBUG = -g +PTHREADS_CFLAGS = -pthread + +#CFLAGS += $(DEBUG) + +# we need to clean the normalizer up once we have reached a decent +# milestone (latest at initial release!) +bin_PROGRAMS = lognormalizer +lognormalizer_SOURCES = lognormalizer.c +lognormalizer_CPPFLAGS = -I$(top_srcdir) $(WARN_CFLAGS) $(JSON_C_CFLAGS) $(LIBESTR_CFLAGS) +lognormalizer_LDADD = $(JSON_C_LIBS) $(LIBLOGNORM_LIBS) $(LIBESTR_LIBS) ../compat/compat.la +lognormalizer_DEPENDENCIES = liblognorm.la + +check_PROGRAMS = ln_test +ln_test_SOURCES = $(lognormalizer_SOURCES) +ln_test_CPPFLAGS = $(lognormalizer_CPPFLAGS) +ln_test_LDADD = $(lognormalizer_LDADD) +ln_test_DEPENDENCIES = $(lognormalizer_DEPENDENCIES) +ln_test_LDFLAGS = -no-install + +lib_LTLIBRARIES = liblognorm.la + +liblognorm_la_SOURCES = \ + liblognorm.c \ + pdag.c \ + annot.c \ + samp.c \ + lognorm.c \ + parser.c \ + enc_syslog.c \ + enc_csv.c \ + enc_xml.c + +# Users violently requested that v2 shall be able to understand v1 +# rulebases. As both are very very different, we now include the +# full v1 engine for this purpose. This here is what does this. 
+# see also: https://github.com/rsyslog/liblognorm/issues/103 +liblognorm_la_SOURCES += \ + v1_liblognorm.c \ + v1_parser.c \ + v1_ptree.c \ + v1_samp.c + +liblognorm_la_CPPFLAGS = $(JSON_C_CFLAGS) $(WARN_CFLAGS) $(LIBESTR_CFLAGS) $(PCRE_CFLAGS) +liblognorm_la_LIBADD = $(rt_libs) $(JSON_C_LIBS) $(LIBESTR_LIBS) $(PCRE_LIBS) -lestr +# info on version-info: +# http://www.gnu.org/software/libtool/manual/html_node/Updating-version-info.html +# Note: v2 now starts at version 5, as v1 previously also had 4 +liblognorm_la_LDFLAGS = -version-info 6:0:1 + +EXTRA_DIST = \ + internal.h \ + liblognorm.h \ + lognorm.h \ + pdag.h \ + annot.h \ + samp.h \ + enc.h \ + parser.h \ + helpers.h + +# and now the old cruft: +EXTRA_DIST += \ + v1_liblognorm.h \ + v1_parser.h \ + v1_samp.h \ + v1_ptree.h + +include_HEADERS = liblognorm.h samp.h lognorm.h pdag.h annot.h enc.h parser.h lognorm-features.h diff --git a/src/Makefile.in b/src/Makefile.in new file mode 100644 index 0000000..2ccdd8b --- /dev/null +++ b/src/Makefile.in @@ -0,0 +1,949 @@ +# Makefile.in generated by automake 1.15 from Makefile.am. +# @configure_input@ + +# Copyright (C) 1994-2014 Free Software Foundation, Inc. + +# This Makefile.in is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY, to the extent permitted by law; without +# even the implied warranty of MERCHANTABILITY or FITNESS FOR A +# PARTICULAR PURPOSE. + +@SET_MAKE@ + + + +VPATH = @srcdir@ +am__is_gnu_make = { \ + if test -z '$(MAKELEVEL)'; then \ + false; \ + elif test -n '$(MAKE_HOST)'; then \ + true; \ + elif test -n '$(MAKE_VERSION)' && test -n '$(CURDIR)'; then \ + true; \ + else \ + false; \ + fi; \ +} +am__make_running_with_option = \ + case $${target_option-} in \ + ?) 
;; \ + *) echo "am__make_running_with_option: internal error: invalid" \ + "target option '$${target_option-}' specified" >&2; \ + exit 1;; \ + esac; \ + has_opt=no; \ + sane_makeflags=$$MAKEFLAGS; \ + if $(am__is_gnu_make); then \ + sane_makeflags=$$MFLAGS; \ + else \ + case $$MAKEFLAGS in \ + *\\[\ \ ]*) \ + bs=\\; \ + sane_makeflags=`printf '%s\n' "$$MAKEFLAGS" \ + | sed "s/$$bs$$bs[$$bs $$bs ]*//g"`;; \ + esac; \ + fi; \ + skip_next=no; \ + strip_trailopt () \ + { \ + flg=`printf '%s\n' "$$flg" | sed "s/$$1.*$$//"`; \ + }; \ + for flg in $$sane_makeflags; do \ + test $$skip_next = yes && { skip_next=no; continue; }; \ + case $$flg in \ + *=*|--*) continue;; \ + -*I) strip_trailopt 'I'; skip_next=yes;; \ + -*I?*) strip_trailopt 'I';; \ + -*O) strip_trailopt 'O'; skip_next=yes;; \ + -*O?*) strip_trailopt 'O';; \ + -*l) strip_trailopt 'l'; skip_next=yes;; \ + -*l?*) strip_trailopt 'l';; \ + -[dEDm]) skip_next=yes;; \ + -[JT]) skip_next=yes;; \ + esac; \ + case $$flg in \ + *$$target_option*) has_opt=yes; break;; \ + esac; \ + done; \ + test $$has_opt = yes +am__make_dryrun = (target_option=n; $(am__make_running_with_option)) +am__make_keepgoing = (target_option=k; $(am__make_running_with_option)) +pkgdatadir = $(datadir)/@PACKAGE@ +pkgincludedir = $(includedir)/@PACKAGE@ +pkglibdir = $(libdir)/@PACKAGE@ +pkglibexecdir = $(libexecdir)/@PACKAGE@ +am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd +install_sh_DATA = $(install_sh) -c -m 644 +install_sh_PROGRAM = $(install_sh) -c +install_sh_SCRIPT = $(install_sh) -c +INSTALL_HEADER = $(INSTALL_DATA) +transform = $(program_transform_name) +NORMAL_INSTALL = : +PRE_INSTALL = : +POST_INSTALL = : +NORMAL_UNINSTALL = : +PRE_UNINSTALL = : +POST_UNINSTALL = : +build_triplet = @build@ +host_triplet = @host@ +bin_PROGRAMS = lognormalizer$(EXEEXT) +check_PROGRAMS = ln_test$(EXEEXT) +subdir = src +ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 +am__aclocal_m4_deps = $(top_srcdir)/m4/libtool.m4 \ + $(top_srcdir)/m4/ltoptions.m4 $(top_srcdir)/m4/ltsugar.m4 \ + $(top_srcdir)/m4/ltversion.m4 $(top_srcdir)/m4/lt~obsolete.m4 \ + $(top_srcdir)/m4/pkg.m4 $(top_srcdir)/configure.ac +am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \ + $(ACLOCAL_M4) +DIST_COMMON = $(srcdir)/Makefile.am $(include_HEADERS) \ + $(am__DIST_COMMON) +mkinstalldirs = $(install_sh) -d +CONFIG_HEADER = $(top_builddir)/config.h +CONFIG_CLEAN_FILES = lognorm-features.h +CONFIG_CLEAN_VPATH_FILES = +am__vpath_adj_setup = srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`; +am__vpath_adj = case $$p in \ + $(srcdir)/*) f=`echo "$$p" | sed "s|^$$srcdirstrip/||"`;; \ + *) f=$$p;; \ + esac; +am__strip_dir = f=`echo $$p | sed -e 's|^.*/||'`; +am__install_max = 40 +am__nobase_strip_setup = \ + srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*|]/\\\\&/g'` +am__nobase_strip = \ + for p in $$list; do echo "$$p"; done | sed -e "s|$$srcdirstrip/||" +am__nobase_list = $(am__nobase_strip_setup); \ + for p in $$list; do echo "$$p $$p"; done | \ + sed "s| $$srcdirstrip/| |;"' / .*\//!s/ .*/ ./; s,\( .*\)/[^/]*$$,\1,' | \ + $(AWK) 'BEGIN { files["."] = "" } { files[$$2] = files[$$2] " " $$1; \ + if (++n[$$2] == $(am__install_max)) \ + { print $$2, files[$$2]; n[$$2] = 0; files[$$2] = "" } } \ + END { for (dir in files) print dir, files[dir] }' +am__base_list = \ + sed '$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;s/\n/ /g' | \ + sed '$$!N;$$!N;$$!N;$$!N;s/\n/ /g' +am__uninstall_files_from_dir = { \ + test -z "$$files" \ + || { test ! -d "$$dir" && test ! -f "$$dir" && test ! 
-r "$$dir"; } \ + || { echo " ( cd '$$dir' && rm -f" $$files ")"; \ + $(am__cd) "$$dir" && rm -f $$files; }; \ + } +am__installdirs = "$(DESTDIR)$(libdir)" "$(DESTDIR)$(bindir)" \ + "$(DESTDIR)$(includedir)" +LTLIBRARIES = $(lib_LTLIBRARIES) +am__DEPENDENCIES_1 = +liblognorm_la_DEPENDENCIES = $(am__DEPENDENCIES_1) \ + $(am__DEPENDENCIES_1) $(am__DEPENDENCIES_1) +am_liblognorm_la_OBJECTS = liblognorm_la-liblognorm.lo \ + liblognorm_la-pdag.lo liblognorm_la-annot.lo \ + liblognorm_la-samp.lo liblognorm_la-lognorm.lo \ + liblognorm_la-parser.lo liblognorm_la-enc_syslog.lo \ + liblognorm_la-enc_csv.lo liblognorm_la-enc_xml.lo \ + liblognorm_la-v1_liblognorm.lo liblognorm_la-v1_parser.lo \ + liblognorm_la-v1_ptree.lo liblognorm_la-v1_samp.lo +liblognorm_la_OBJECTS = $(am_liblognorm_la_OBJECTS) +AM_V_lt = $(am__v_lt_@AM_V@) +am__v_lt_ = $(am__v_lt_@AM_DEFAULT_V@) +am__v_lt_0 = --silent +am__v_lt_1 = +liblognorm_la_LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \ + $(liblognorm_la_LDFLAGS) $(LDFLAGS) -o $@ +PROGRAMS = $(bin_PROGRAMS) +am__objects_1 = ln_test-lognormalizer.$(OBJEXT) +am_ln_test_OBJECTS = $(am__objects_1) +ln_test_OBJECTS = $(am_ln_test_OBJECTS) +am__DEPENDENCIES_2 = $(am__DEPENDENCIES_1) $(am__DEPENDENCIES_1) \ + $(am__DEPENDENCIES_1) ../compat/compat.la +ln_test_LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \ + $(ln_test_LDFLAGS) $(LDFLAGS) -o $@ +am_lognormalizer_OBJECTS = lognormalizer-lognormalizer.$(OBJEXT) +lognormalizer_OBJECTS = $(am_lognormalizer_OBJECTS) +AM_V_P = $(am__v_P_@AM_V@) +am__v_P_ = $(am__v_P_@AM_DEFAULT_V@) +am__v_P_0 = false +am__v_P_1 = : +AM_V_GEN = $(am__v_GEN_@AM_V@) +am__v_GEN_ = $(am__v_GEN_@AM_DEFAULT_V@) +am__v_GEN_0 = @echo " GEN " $@; +am__v_GEN_1 = +AM_V_at = $(am__v_at_@AM_V@) +am__v_at_ = $(am__v_at_@AM_DEFAULT_V@) +am__v_at_0 = @ +am__v_at_1 = +DEFAULT_INCLUDES = -I.@am__isrc@ -I$(top_builddir) +depcomp = $(SHELL) $(top_srcdir)/depcomp +am__depfiles_maybe = depfiles +am__mv = mv -f +COMPILE = $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) \ + $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) +LTCOMPILE = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) \ + $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) \ + $(AM_CFLAGS) $(CFLAGS) +AM_V_CC = $(am__v_CC_@AM_V@) +am__v_CC_ = $(am__v_CC_@AM_DEFAULT_V@) +am__v_CC_0 = @echo " CC " $@; +am__v_CC_1 = +CCLD = $(CC) +LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \ + $(AM_LDFLAGS) $(LDFLAGS) -o $@ +AM_V_CCLD = $(am__v_CCLD_@AM_V@) +am__v_CCLD_ = $(am__v_CCLD_@AM_DEFAULT_V@) +am__v_CCLD_0 = @echo " CCLD " $@; +am__v_CCLD_1 = +SOURCES = $(liblognorm_la_SOURCES) $(ln_test_SOURCES) \ + $(lognormalizer_SOURCES) +DIST_SOURCES = $(liblognorm_la_SOURCES) $(ln_test_SOURCES) \ + $(lognormalizer_SOURCES) +am__can_run_installinfo = \ + case $$AM_UPDATE_INFO_DIR in \ + n|no|NO) false;; \ + *) (install-info --version) >/dev/null 2>&1;; \ + esac +HEADERS = $(include_HEADERS) +am__tagged_files = $(HEADERS) $(SOURCES) $(TAGS_FILES) $(LISP) +# Read a list of newline-separated strings from the standard input, +# and print each of them once, without duplicates. Input order is +# *not* preserved. 
+am__uniquify_input = $(AWK) '\ + BEGIN { nonempty = 0; } \ + { items[$$0] = 1; nonempty = 1; } \ + END { if (nonempty) { for (i in items) print i; }; } \ +' +# Make sure the list of sources is unique. This is necessary because, +# e.g., the same source file might be shared among _SOURCES variables +# for different programs/libraries. +am__define_uniq_tagged_files = \ + list='$(am__tagged_files)'; \ + unique=`for i in $$list; do \ + if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \ + done | $(am__uniquify_input)` +ETAGS = etags +CTAGS = ctags +am__DIST_COMMON = $(srcdir)/Makefile.in \ + $(srcdir)/lognorm-features.h.in $(top_srcdir)/depcomp +DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST) +ACLOCAL = @ACLOCAL@ +AMTAR = @AMTAR@ +AM_DEFAULT_VERBOSITY = @AM_DEFAULT_VERBOSITY@ +AR = @AR@ +AUTOCONF = @AUTOCONF@ +AUTOHEADER = @AUTOHEADER@ +AUTOMAKE = @AUTOMAKE@ +AWK = @AWK@ +CC = @CC@ +CCDEPMODE = @CCDEPMODE@ +CFLAGS = @CFLAGS@ +CPP = @CPP@ +CPPFLAGS = @CPPFLAGS@ +CYGPATH_W = @CYGPATH_W@ +DEFS = @DEFS@ +DEPDIR = @DEPDIR@ +DLLTOOL = @DLLTOOL@ +DSYMUTIL = @DSYMUTIL@ +DUMPBIN = @DUMPBIN@ +ECHO_C = @ECHO_C@ +ECHO_N = @ECHO_N@ +ECHO_T = @ECHO_T@ +EGREP = @EGREP@ +EXEEXT = @EXEEXT@ +FEATURE_REGEXP = @FEATURE_REGEXP@ +FGREP = @FGREP@ +GREP = @GREP@ +INSTALL = @INSTALL@ +INSTALL_DATA = @INSTALL_DATA@ +INSTALL_PROGRAM = @INSTALL_PROGRAM@ +INSTALL_SCRIPT = @INSTALL_SCRIPT@ +INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@ +JSON_C_CFLAGS = @JSON_C_CFLAGS@ +JSON_C_LIBS = @JSON_C_LIBS@ +LD = @LD@ +LDFLAGS = @LDFLAGS@ +LIBESTR_CFLAGS = @LIBESTR_CFLAGS@ +LIBESTR_LIBS = @LIBESTR_LIBS@ +LIBLOGNORM_CFLAGS = @LIBLOGNORM_CFLAGS@ +LIBLOGNORM_LIBS = @LIBLOGNORM_LIBS@ +LIBOBJS = @LIBOBJS@ +LIBS = @LIBS@ +LIBTOOL = @LIBTOOL@ +LIPO = @LIPO@ +LN_S = @LN_S@ +LTLIBOBJS = @LTLIBOBJS@ +LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAKEINFO = @MAKEINFO@ +MANIFEST_TOOL = @MANIFEST_TOOL@ +MKDIR_P = @MKDIR_P@ +NM = @NM@ +NMEDIT = @NMEDIT@ +OBJDUMP = @OBJDUMP@ +OBJEXT = @OBJEXT@ +OTOOL = @OTOOL@ +OTOOL64 = @OTOOL64@ +PACKAGE = @PACKAGE@ +PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@ +PACKAGE_NAME = @PACKAGE_NAME@ +PACKAGE_STRING = @PACKAGE_STRING@ +PACKAGE_TARNAME = @PACKAGE_TARNAME@ +PACKAGE_URL = @PACKAGE_URL@ +PACKAGE_VERSION = @PACKAGE_VERSION@ +PATH_SEPARATOR = @PATH_SEPARATOR@ +PCRE_CFLAGS = @PCRE_CFLAGS@ +PCRE_LIBS = @PCRE_LIBS@ +PKG_CONFIG = @PKG_CONFIG@ +PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@ +PKG_CONFIG_PATH = @PKG_CONFIG_PATH@ +RANLIB = @RANLIB@ +SED = @SED@ +SET_MAKE = @SET_MAKE@ +SHELL = @SHELL@ +SPHINXBUILD = @SPHINXBUILD@ +STRIP = @STRIP@ +VALGRIND = @VALGRIND@ +VERSION = @VERSION@ +WARN_CFLAGS = @WARN_CFLAGS@ +WARN_LDFLAGS = @WARN_LDFLAGS@ +WARN_SCANNERFLAGS = @WARN_SCANNERFLAGS@ +abs_builddir = @abs_builddir@ +abs_srcdir = @abs_srcdir@ +abs_top_builddir = @abs_top_builddir@ +abs_top_srcdir = @abs_top_srcdir@ +ac_ct_AR = @ac_ct_AR@ +ac_ct_CC = @ac_ct_CC@ +ac_ct_DUMPBIN = @ac_ct_DUMPBIN@ +am__include = @am__include@ +am__leading_dot = @am__leading_dot@ +am__quote = @am__quote@ +am__tar = @am__tar@ +am__untar = @am__untar@ +bindir = @bindir@ +build = @build@ +build_alias = @build_alias@ +build_cpu = @build_cpu@ +build_os = @build_os@ +build_vendor = @build_vendor@ +builddir = @builddir@ +datadir = @datadir@ +datarootdir = @datarootdir@ +docdir = @docdir@ +dvidir = @dvidir@ +exec_prefix = @exec_prefix@ +host = @host@ +host_alias = @host_alias@ +host_cpu = @host_cpu@ +host_os = @host_os@ +host_vendor = @host_vendor@ +htmldir = @htmldir@ +includedir = @includedir@ +infodir = @infodir@ 
+install_sh = @install_sh@ +libdir = @libdir@ +libexecdir = @libexecdir@ +localedir = @localedir@ +localstatedir = @localstatedir@ +mandir = @mandir@ +mkdir_p = @mkdir_p@ +oldincludedir = @oldincludedir@ +pdfdir = @pdfdir@ +pkg_config_libs_private = @pkg_config_libs_private@ +prefix = @prefix@ +program_transform_name = @program_transform_name@ +psdir = @psdir@ +runstatedir = @runstatedir@ +sbindir = @sbindir@ +sharedstatedir = @sharedstatedir@ +srcdir = @srcdir@ +sysconfdir = @sysconfdir@ +target_alias = @target_alias@ +top_build_prefix = @top_build_prefix@ +top_builddir = @top_builddir@ +top_srcdir = @top_srcdir@ + +# Uncomment for debugging +DEBUG = -g +PTHREADS_CFLAGS = -pthread +lognormalizer_SOURCES = lognormalizer.c +lognormalizer_CPPFLAGS = -I$(top_srcdir) $(WARN_CFLAGS) $(JSON_C_CFLAGS) $(LIBESTR_CFLAGS) +lognormalizer_LDADD = $(JSON_C_LIBS) $(LIBLOGNORM_LIBS) $(LIBESTR_LIBS) ../compat/compat.la +lognormalizer_DEPENDENCIES = liblognorm.la +ln_test_SOURCES = $(lognormalizer_SOURCES) +ln_test_CPPFLAGS = $(lognormalizer_CPPFLAGS) +ln_test_LDADD = $(lognormalizer_LDADD) +ln_test_DEPENDENCIES = $(lognormalizer_DEPENDENCIES) +ln_test_LDFLAGS = -no-install +lib_LTLIBRARIES = liblognorm.la + +# Users violently requested that v2 shall be able to understand v1 +# rulebases. As both are very very different, we now include the +# full v1 engine for this purpose. This here is what does this. +# see also: https://github.com/rsyslog/liblognorm/issues/103 +liblognorm_la_SOURCES = liblognorm.c pdag.c annot.c samp.c lognorm.c \ + parser.c enc_syslog.c enc_csv.c enc_xml.c v1_liblognorm.c \ + v1_parser.c v1_ptree.c v1_samp.c +liblognorm_la_CPPFLAGS = $(JSON_C_CFLAGS) $(WARN_CFLAGS) $(LIBESTR_CFLAGS) $(PCRE_CFLAGS) +liblognorm_la_LIBADD = $(rt_libs) $(JSON_C_LIBS) $(LIBESTR_LIBS) $(PCRE_LIBS) -lestr +# info on version-info: +# http://www.gnu.org/software/libtool/manual/html_node/Updating-version-info.html +# Note: v2 now starts at version 5, as v1 previously also had 4 +liblognorm_la_LDFLAGS = -version-info 6:0:1 + +# and now the old cruft: +EXTRA_DIST = internal.h liblognorm.h lognorm.h pdag.h annot.h samp.h \ + enc.h parser.h helpers.h v1_liblognorm.h v1_parser.h v1_samp.h \ + v1_ptree.h +include_HEADERS = liblognorm.h samp.h lognorm.h pdag.h annot.h enc.h parser.h lognorm-features.h +all: all-am + +.SUFFIXES: +.SUFFIXES: .c .lo .o .obj +$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) + @for dep in $?; do \ + case '$(am__configure_deps)' in \ + *$$dep*) \ + ( cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh ) \ + && { if test -f $@; then exit 0; else break; fi; }; \ + exit 1;; \ + esac; \ + done; \ + echo ' cd $(top_srcdir) && $(AUTOMAKE) --gnu src/Makefile'; \ + $(am__cd) $(top_srcdir) && \ + $(AUTOMAKE) --gnu src/Makefile +Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status + @case '$?' 
in \ + *config.status*) \ + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \ + *) \ + echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \ + cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \ + esac; + +$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh + +$(top_srcdir)/configure: $(am__configure_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(ACLOCAL_M4): $(am__aclocal_m4_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(am__aclocal_m4_deps): +lognorm-features.h: $(top_builddir)/config.status $(srcdir)/lognorm-features.h.in + cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ + +install-libLTLIBRARIES: $(lib_LTLIBRARIES) + @$(NORMAL_INSTALL) + @list='$(lib_LTLIBRARIES)'; test -n "$(libdir)" || list=; \ + list2=; for p in $$list; do \ + if test -f $$p; then \ + list2="$$list2 $$p"; \ + else :; fi; \ + done; \ + test -z "$$list2" || { \ + echo " $(MKDIR_P) '$(DESTDIR)$(libdir)'"; \ + $(MKDIR_P) "$(DESTDIR)$(libdir)" || exit 1; \ + echo " $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=install $(INSTALL) $(INSTALL_STRIP_FLAG) $$list2 '$(DESTDIR)$(libdir)'"; \ + $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=install $(INSTALL) $(INSTALL_STRIP_FLAG) $$list2 "$(DESTDIR)$(libdir)"; \ + } + +uninstall-libLTLIBRARIES: + @$(NORMAL_UNINSTALL) + @list='$(lib_LTLIBRARIES)'; test -n "$(libdir)" || list=; \ + for p in $$list; do \ + $(am__strip_dir) \ + echo " $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=uninstall rm -f '$(DESTDIR)$(libdir)/$$f'"; \ + $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=uninstall rm -f "$(DESTDIR)$(libdir)/$$f"; \ + done + +clean-libLTLIBRARIES: + -test -z "$(lib_LTLIBRARIES)" || rm -f $(lib_LTLIBRARIES) + @list='$(lib_LTLIBRARIES)'; \ + locs=`for p in $$list; do echo $$p; done | \ + sed 's|^[^/]*$$|.|; s|/[^/]*$$||; s|$$|/so_locations|' | \ + sort -u`; \ + test -z "$$locs" || { \ + echo rm -f $${locs}; \ + rm -f $${locs}; \ + } + +liblognorm.la: $(liblognorm_la_OBJECTS) $(liblognorm_la_DEPENDENCIES) $(EXTRA_liblognorm_la_DEPENDENCIES) + $(AM_V_CCLD)$(liblognorm_la_LINK) -rpath $(libdir) $(liblognorm_la_OBJECTS) $(liblognorm_la_LIBADD) $(LIBS) +install-binPROGRAMS: $(bin_PROGRAMS) + @$(NORMAL_INSTALL) + @list='$(bin_PROGRAMS)'; test -n "$(bindir)" || list=; \ + if test -n "$$list"; then \ + echo " $(MKDIR_P) '$(DESTDIR)$(bindir)'"; \ + $(MKDIR_P) "$(DESTDIR)$(bindir)" || exit 1; \ + fi; \ + for p in $$list; do echo "$$p $$p"; done | \ + sed 's/$(EXEEXT)$$//' | \ + while read p p1; do if test -f $$p \ + || test -f $$p1 \ + ; then echo "$$p"; echo "$$p"; else :; fi; \ + done | \ + sed -e 'p;s,.*/,,;n;h' \ + -e 's|.*|.|' \ + -e 'p;x;s,.*/,,;s/$(EXEEXT)$$//;$(transform);s/$$/$(EXEEXT)/' | \ + sed 'N;N;N;s,\n, ,g' | \ + $(AWK) 'BEGIN { files["."] = ""; dirs["."] = 1 } \ + { d=$$3; if (dirs[d] != 1) { print "d", d; dirs[d] = 1 } \ + if ($$2 == $$4) files[d] = files[d] " " $$1; \ + else { print "f", $$3 "/" $$4, $$1; } } \ + END { for (d in files) print "f", d, files[d] }' | \ + while read type dir files; do \ + if test "$$dir" = .; then dir=; else dir=/$$dir; fi; \ + test -z "$$files" || { \ + echo " $(INSTALL_PROGRAM_ENV) $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=install $(INSTALL_PROGRAM) $$files '$(DESTDIR)$(bindir)$$dir'"; \ + $(INSTALL_PROGRAM_ENV) $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=install 
$(INSTALL_PROGRAM) $$files "$(DESTDIR)$(bindir)$$dir" || exit $$?; \ + } \ + ; done + +uninstall-binPROGRAMS: + @$(NORMAL_UNINSTALL) + @list='$(bin_PROGRAMS)'; test -n "$(bindir)" || list=; \ + files=`for p in $$list; do echo "$$p"; done | \ + sed -e 'h;s,^.*/,,;s/$(EXEEXT)$$//;$(transform)' \ + -e 's/$$/$(EXEEXT)/' \ + `; \ + test -n "$$list" || exit 0; \ + echo " ( cd '$(DESTDIR)$(bindir)' && rm -f" $$files ")"; \ + cd "$(DESTDIR)$(bindir)" && rm -f $$files + +clean-binPROGRAMS: + @list='$(bin_PROGRAMS)'; test -n "$$list" || exit 0; \ + echo " rm -f" $$list; \ + rm -f $$list || exit $$?; \ + test -n "$(EXEEXT)" || exit 0; \ + list=`for p in $$list; do echo "$$p"; done | sed 's/$(EXEEXT)$$//'`; \ + echo " rm -f" $$list; \ + rm -f $$list + +clean-checkPROGRAMS: + @list='$(check_PROGRAMS)'; test -n "$$list" || exit 0; \ + echo " rm -f" $$list; \ + rm -f $$list || exit $$?; \ + test -n "$(EXEEXT)" || exit 0; \ + list=`for p in $$list; do echo "$$p"; done | sed 's/$(EXEEXT)$$//'`; \ + echo " rm -f" $$list; \ + rm -f $$list + +ln_test$(EXEEXT): $(ln_test_OBJECTS) $(ln_test_DEPENDENCIES) $(EXTRA_ln_test_DEPENDENCIES) + @rm -f ln_test$(EXEEXT) + $(AM_V_CCLD)$(ln_test_LINK) $(ln_test_OBJECTS) $(ln_test_LDADD) $(LIBS) + +lognormalizer$(EXEEXT): $(lognormalizer_OBJECTS) $(lognormalizer_DEPENDENCIES) $(EXTRA_lognormalizer_DEPENDENCIES) + @rm -f lognormalizer$(EXEEXT) + $(AM_V_CCLD)$(LINK) $(lognormalizer_OBJECTS) $(lognormalizer_LDADD) $(LIBS) + +mostlyclean-compile: + -rm -f *.$(OBJEXT) + +distclean-compile: + -rm -f *.tab.c + +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-annot.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-enc_csv.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-enc_syslog.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-enc_xml.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-liblognorm.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-lognorm.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-parser.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-pdag.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-samp.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-v1_liblognorm.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-v1_parser.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-v1_ptree.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/liblognorm_la-v1_samp.Plo@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ln_test-lognormalizer.Po@am__quote@ +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/lognormalizer-lognormalizer.Po@am__quote@ + +.c.o: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(COMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $< +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(COMPILE) -c -o $@ $< + +.c.obj: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(COMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ `$(CYGPATH_W) '$<'` +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' 
libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(COMPILE) -c -o $@ `$(CYGPATH_W) '$<'` + +.c.lo: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LTCOMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $< +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LTCOMPILE) -c -o $@ $< + +liblognorm_la-liblognorm.lo: liblognorm.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-liblognorm.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-liblognorm.Tpo -c -o liblognorm_la-liblognorm.lo `test -f 'liblognorm.c' || echo '$(srcdir)/'`liblognorm.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-liblognorm.Tpo $(DEPDIR)/liblognorm_la-liblognorm.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='liblognorm.c' object='liblognorm_la-liblognorm.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-liblognorm.lo `test -f 'liblognorm.c' || echo '$(srcdir)/'`liblognorm.c + +liblognorm_la-pdag.lo: pdag.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-pdag.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-pdag.Tpo -c -o liblognorm_la-pdag.lo `test -f 'pdag.c' || echo '$(srcdir)/'`pdag.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-pdag.Tpo $(DEPDIR)/liblognorm_la-pdag.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='pdag.c' object='liblognorm_la-pdag.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-pdag.lo `test -f 'pdag.c' || echo '$(srcdir)/'`pdag.c + +liblognorm_la-annot.lo: annot.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-annot.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-annot.Tpo -c -o liblognorm_la-annot.lo `test -f 'annot.c' || echo '$(srcdir)/'`annot.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-annot.Tpo $(DEPDIR)/liblognorm_la-annot.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='annot.c' object='liblognorm_la-annot.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ 
$(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-annot.lo `test -f 'annot.c' || echo '$(srcdir)/'`annot.c + +liblognorm_la-samp.lo: samp.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-samp.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-samp.Tpo -c -o liblognorm_la-samp.lo `test -f 'samp.c' || echo '$(srcdir)/'`samp.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-samp.Tpo $(DEPDIR)/liblognorm_la-samp.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='samp.c' object='liblognorm_la-samp.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-samp.lo `test -f 'samp.c' || echo '$(srcdir)/'`samp.c + +liblognorm_la-lognorm.lo: lognorm.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-lognorm.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-lognorm.Tpo -c -o liblognorm_la-lognorm.lo `test -f 'lognorm.c' || echo '$(srcdir)/'`lognorm.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-lognorm.Tpo $(DEPDIR)/liblognorm_la-lognorm.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='lognorm.c' object='liblognorm_la-lognorm.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-lognorm.lo `test -f 'lognorm.c' || echo '$(srcdir)/'`lognorm.c + +liblognorm_la-parser.lo: parser.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-parser.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-parser.Tpo -c -o liblognorm_la-parser.lo `test -f 'parser.c' || echo '$(srcdir)/'`parser.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-parser.Tpo $(DEPDIR)/liblognorm_la-parser.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='parser.c' object='liblognorm_la-parser.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-parser.lo `test -f 'parser.c' || echo '$(srcdir)/'`parser.c + +liblognorm_la-enc_syslog.lo: enc_syslog.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC 
$(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-enc_syslog.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-enc_syslog.Tpo -c -o liblognorm_la-enc_syslog.lo `test -f 'enc_syslog.c' || echo '$(srcdir)/'`enc_syslog.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-enc_syslog.Tpo $(DEPDIR)/liblognorm_la-enc_syslog.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='enc_syslog.c' object='liblognorm_la-enc_syslog.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-enc_syslog.lo `test -f 'enc_syslog.c' || echo '$(srcdir)/'`enc_syslog.c + +liblognorm_la-enc_csv.lo: enc_csv.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-enc_csv.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-enc_csv.Tpo -c -o liblognorm_la-enc_csv.lo `test -f 'enc_csv.c' || echo '$(srcdir)/'`enc_csv.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-enc_csv.Tpo $(DEPDIR)/liblognorm_la-enc_csv.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='enc_csv.c' object='liblognorm_la-enc_csv.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-enc_csv.lo `test -f 'enc_csv.c' || echo '$(srcdir)/'`enc_csv.c + +liblognorm_la-enc_xml.lo: enc_xml.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-enc_xml.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-enc_xml.Tpo -c -o liblognorm_la-enc_xml.lo `test -f 'enc_xml.c' || echo '$(srcdir)/'`enc_xml.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-enc_xml.Tpo $(DEPDIR)/liblognorm_la-enc_xml.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='enc_xml.c' object='liblognorm_la-enc_xml.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-enc_xml.lo `test -f 'enc_xml.c' || echo '$(srcdir)/'`enc_xml.c + +liblognorm_la-v1_liblognorm.lo: v1_liblognorm.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-v1_liblognorm.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-v1_liblognorm.Tpo -c -o liblognorm_la-v1_liblognorm.lo `test -f 
'v1_liblognorm.c' || echo '$(srcdir)/'`v1_liblognorm.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-v1_liblognorm.Tpo $(DEPDIR)/liblognorm_la-v1_liblognorm.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='v1_liblognorm.c' object='liblognorm_la-v1_liblognorm.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-v1_liblognorm.lo `test -f 'v1_liblognorm.c' || echo '$(srcdir)/'`v1_liblognorm.c + +liblognorm_la-v1_parser.lo: v1_parser.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-v1_parser.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-v1_parser.Tpo -c -o liblognorm_la-v1_parser.lo `test -f 'v1_parser.c' || echo '$(srcdir)/'`v1_parser.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-v1_parser.Tpo $(DEPDIR)/liblognorm_la-v1_parser.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='v1_parser.c' object='liblognorm_la-v1_parser.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-v1_parser.lo `test -f 'v1_parser.c' || echo '$(srcdir)/'`v1_parser.c + +liblognorm_la-v1_ptree.lo: v1_ptree.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-v1_ptree.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-v1_ptree.Tpo -c -o liblognorm_la-v1_ptree.lo `test -f 'v1_ptree.c' || echo '$(srcdir)/'`v1_ptree.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-v1_ptree.Tpo $(DEPDIR)/liblognorm_la-v1_ptree.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='v1_ptree.c' object='liblognorm_la-v1_ptree.lo' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-v1_ptree.lo `test -f 'v1_ptree.c' || echo '$(srcdir)/'`v1_ptree.c + +liblognorm_la-v1_samp.lo: v1_samp.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT liblognorm_la-v1_samp.lo -MD -MP -MF $(DEPDIR)/liblognorm_la-v1_samp.Tpo -c -o liblognorm_la-v1_samp.lo `test -f 'v1_samp.c' || echo '$(srcdir)/'`v1_samp.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/liblognorm_la-v1_samp.Tpo $(DEPDIR)/liblognorm_la-v1_samp.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='v1_samp.c' object='liblognorm_la-v1_samp.lo' 
libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(liblognorm_la_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o liblognorm_la-v1_samp.lo `test -f 'v1_samp.c' || echo '$(srcdir)/'`v1_samp.c + +ln_test-lognormalizer.o: lognormalizer.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ln_test_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT ln_test-lognormalizer.o -MD -MP -MF $(DEPDIR)/ln_test-lognormalizer.Tpo -c -o ln_test-lognormalizer.o `test -f 'lognormalizer.c' || echo '$(srcdir)/'`lognormalizer.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/ln_test-lognormalizer.Tpo $(DEPDIR)/ln_test-lognormalizer.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='lognormalizer.c' object='ln_test-lognormalizer.o' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ln_test_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o ln_test-lognormalizer.o `test -f 'lognormalizer.c' || echo '$(srcdir)/'`lognormalizer.c + +ln_test-lognormalizer.obj: lognormalizer.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ln_test_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT ln_test-lognormalizer.obj -MD -MP -MF $(DEPDIR)/ln_test-lognormalizer.Tpo -c -o ln_test-lognormalizer.obj `if test -f 'lognormalizer.c'; then $(CYGPATH_W) 'lognormalizer.c'; else $(CYGPATH_W) '$(srcdir)/lognormalizer.c'; fi` +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/ln_test-lognormalizer.Tpo $(DEPDIR)/ln_test-lognormalizer.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='lognormalizer.c' object='ln_test-lognormalizer.obj' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ln_test_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o ln_test-lognormalizer.obj `if test -f 'lognormalizer.c'; then $(CYGPATH_W) 'lognormalizer.c'; else $(CYGPATH_W) '$(srcdir)/lognormalizer.c'; fi` + +lognormalizer-lognormalizer.o: lognormalizer.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(lognormalizer_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT lognormalizer-lognormalizer.o -MD -MP -MF $(DEPDIR)/lognormalizer-lognormalizer.Tpo -c -o lognormalizer-lognormalizer.o `test -f 'lognormalizer.c' || echo '$(srcdir)/'`lognormalizer.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/lognormalizer-lognormalizer.Tpo $(DEPDIR)/lognormalizer-lognormalizer.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='lognormalizer.c' object='lognormalizer-lognormalizer.o' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(lognormalizer_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o lognormalizer-lognormalizer.o `test -f 'lognormalizer.c' || echo '$(srcdir)/'`lognormalizer.c + +lognormalizer-lognormalizer.obj: lognormalizer.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(lognormalizer_CPPFLAGS) $(CPPFLAGS) 
$(AM_CFLAGS) $(CFLAGS) -MT lognormalizer-lognormalizer.obj -MD -MP -MF $(DEPDIR)/lognormalizer-lognormalizer.Tpo -c -o lognormalizer-lognormalizer.obj `if test -f 'lognormalizer.c'; then $(CYGPATH_W) 'lognormalizer.c'; else $(CYGPATH_W) '$(srcdir)/lognormalizer.c'; fi` +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/lognormalizer-lognormalizer.Tpo $(DEPDIR)/lognormalizer-lognormalizer.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='lognormalizer.c' object='lognormalizer-lognormalizer.obj' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(lognormalizer_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o lognormalizer-lognormalizer.obj `if test -f 'lognormalizer.c'; then $(CYGPATH_W) 'lognormalizer.c'; else $(CYGPATH_W) '$(srcdir)/lognormalizer.c'; fi` + +mostlyclean-libtool: + -rm -f *.lo + +clean-libtool: + -rm -rf .libs _libs +install-includeHEADERS: $(include_HEADERS) + @$(NORMAL_INSTALL) + @list='$(include_HEADERS)'; test -n "$(includedir)" || list=; \ + if test -n "$$list"; then \ + echo " $(MKDIR_P) '$(DESTDIR)$(includedir)'"; \ + $(MKDIR_P) "$(DESTDIR)$(includedir)" || exit 1; \ + fi; \ + for p in $$list; do \ + if test -f "$$p"; then d=; else d="$(srcdir)/"; fi; \ + echo "$$d$$p"; \ + done | $(am__base_list) | \ + while read files; do \ + echo " $(INSTALL_HEADER) $$files '$(DESTDIR)$(includedir)'"; \ + $(INSTALL_HEADER) $$files "$(DESTDIR)$(includedir)" || exit $$?; \ + done + +uninstall-includeHEADERS: + @$(NORMAL_UNINSTALL) + @list='$(include_HEADERS)'; test -n "$(includedir)" || list=; \ + files=`for p in $$list; do echo $$p; done | sed -e 's|^.*/||'`; \ + dir='$(DESTDIR)$(includedir)'; $(am__uninstall_files_from_dir) + +ID: $(am__tagged_files) + $(am__define_uniq_tagged_files); mkid -fID $$unique +tags: tags-am +TAGS: tags + +tags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + set x; \ + here=`pwd`; \ + $(am__define_uniq_tagged_files); \ + shift; \ + if test -z "$(ETAGS_ARGS)$$*$$unique"; then :; else \ + test -n "$$unique" || unique=$$empty_fix; \ + if test $$# -gt 0; then \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + "$$@" $$unique; \ + else \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + $$unique; \ + fi; \ + fi +ctags: ctags-am + +CTAGS: ctags +ctags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + $(am__define_uniq_tagged_files); \ + test -z "$(CTAGS_ARGS)$$unique" \ + || $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \ + $$unique + +GTAGS: + here=`$(am__cd) $(top_builddir) && pwd` \ + && $(am__cd) $(top_srcdir) \ + && gtags -i $(GTAGS_ARGS) "$$here" +cscopelist: cscopelist-am + +cscopelist-am: $(am__tagged_files) + list='$(am__tagged_files)'; \ + case "$(srcdir)" in \ + [\\/]* | ?:[\\/]*) sdir="$(srcdir)" ;; \ + *) sdir=$(subdir)/$(srcdir) ;; \ + esac; \ + for i in $$list; do \ + if test -f "$$i"; then \ + echo "$(subdir)/$$i"; \ + else \ + echo "$$sdir/$$i"; \ + fi; \ + done >> $(top_builddir)/cscope.files + +distclean-tags: + -rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags + +distdir: $(DISTFILES) + @srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + list='$(DISTFILES)'; \ + dist_files=`for file in $$list; do echo $$file; done | \ + sed -e "s|^$$srcdirstrip/||;t" \ + -e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \ + case $$dist_files in \ + */*) $(MKDIR_P) `echo "$$dist_files" | \ 
+ sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \ + sort -u` ;; \ + esac; \ + for file in $$dist_files; do \ + if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \ + if test -d $$d/$$file; then \ + dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \ + if test -d "$(distdir)/$$file"; then \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \ + cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \ + else \ + test -f "$(distdir)/$$file" \ + || cp -p $$d/$$file "$(distdir)/$$file" \ + || exit 1; \ + fi; \ + done +check-am: all-am + $(MAKE) $(AM_MAKEFLAGS) $(check_PROGRAMS) +check: check-am +all-am: Makefile $(LTLIBRARIES) $(PROGRAMS) $(HEADERS) +install-binPROGRAMS: install-libLTLIBRARIES + +installdirs: + for dir in "$(DESTDIR)$(libdir)" "$(DESTDIR)$(bindir)" "$(DESTDIR)$(includedir)"; do \ + test -z "$$dir" || $(MKDIR_P) "$$dir"; \ + done +install: install-am +install-exec: install-exec-am +install-data: install-data-am +uninstall: uninstall-am + +install-am: all-am + @$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am + +installcheck: installcheck-am +install-strip: + if test -z '$(STRIP)'; then \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + install; \ + else \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \ + fi +mostlyclean-generic: + +clean-generic: + +distclean-generic: + -test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES) + -test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES) + +maintainer-clean-generic: + @echo "This command is intended for maintainers to use" + @echo "it deletes files that may require special tools to rebuild." 
+clean: clean-am + +clean-am: clean-binPROGRAMS clean-checkPROGRAMS clean-generic \ + clean-libLTLIBRARIES clean-libtool mostlyclean-am + +distclean: distclean-am + -rm -rf ./$(DEPDIR) + -rm -f Makefile +distclean-am: clean-am distclean-compile distclean-generic \ + distclean-tags + +dvi: dvi-am + +dvi-am: + +html: html-am + +html-am: + +info: info-am + +info-am: + +install-data-am: install-includeHEADERS + +install-dvi: install-dvi-am + +install-dvi-am: + +install-exec-am: install-binPROGRAMS install-libLTLIBRARIES + +install-html: install-html-am + +install-html-am: + +install-info: install-info-am + +install-info-am: + +install-man: + +install-pdf: install-pdf-am + +install-pdf-am: + +install-ps: install-ps-am + +install-ps-am: + +installcheck-am: + +maintainer-clean: maintainer-clean-am + -rm -rf ./$(DEPDIR) + -rm -f Makefile +maintainer-clean-am: distclean-am maintainer-clean-generic + +mostlyclean: mostlyclean-am + +mostlyclean-am: mostlyclean-compile mostlyclean-generic \ + mostlyclean-libtool + +pdf: pdf-am + +pdf-am: + +ps: ps-am + +ps-am: + +uninstall-am: uninstall-binPROGRAMS uninstall-includeHEADERS \ + uninstall-libLTLIBRARIES + +.MAKE: check-am install-am install-strip + +.PHONY: CTAGS GTAGS TAGS all all-am check check-am clean \ + clean-binPROGRAMS clean-checkPROGRAMS clean-generic \ + clean-libLTLIBRARIES clean-libtool cscopelist-am ctags \ + ctags-am distclean distclean-compile distclean-generic \ + distclean-libtool distclean-tags distdir dvi dvi-am html \ + html-am info info-am install install-am install-binPROGRAMS \ + install-data install-data-am install-dvi install-dvi-am \ + install-exec install-exec-am install-html install-html-am \ + install-includeHEADERS install-info install-info-am \ + install-libLTLIBRARIES install-man install-pdf install-pdf-am \ + install-ps install-ps-am install-strip installcheck \ + installcheck-am installdirs maintainer-clean \ + maintainer-clean-generic mostlyclean mostlyclean-compile \ + mostlyclean-generic mostlyclean-libtool pdf pdf-am ps ps-am \ + tags tags-am uninstall uninstall-am uninstall-binPROGRAMS \ + uninstall-includeHEADERS uninstall-libLTLIBRARIES + +.PRECIOUS: Makefile + + +# Tell versions [3.59,3.63) of GNU make to not export all variables. +# Otherwise a system limit (for SysV at least) may be exceeded. +.NOEXPORT: diff --git a/src/annot.c b/src/annot.c new file mode 100644 index 0000000..ec01499 --- /dev/null +++ b/src/annot.c @@ -0,0 +1,239 @@ +/** + * @file annot.c + * @brief Implementation of the annotation set object. + * @class ln_annot annot.h + *//* + * Copyright 2011 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. 
+ * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include +#include +#include +#include + +#include "lognorm.h" +#include "samp.h" +#include "annot.h" +#include "internal.h" + +ln_annotSet* +ln_newAnnotSet(ln_ctx ctx) +{ + ln_annotSet *as; + + if((as = calloc(1, sizeof(struct ln_annotSet_s))) == NULL) + goto done; + as->ctx = ctx; +done: return as; +} + + +void +ln_deleteAnnotSet(ln_annotSet *as) +{ + ln_annot *node, *nextnode; + if(as == NULL) + goto done; + + for(node = as->aroot; node != NULL; node = nextnode) { + nextnode = node->next; + ln_deleteAnnot(node); + } + free(as); +done: return; +} + + +ln_annot* +ln_findAnnot(ln_annotSet *as, es_str_t *tag) +{ + ln_annot *annot; + if(as == NULL) { + annot = NULL; + goto done; + } + + for( annot = as->aroot + ; annot != NULL && es_strcmp(annot->tag, tag) + ; annot = annot->next) { + ; /* do nothing, just search... */ + } +done: return annot; +} + + +/** + * Combine two annotations. + * @param[in] annot currently existing and surviving annotation + * @param[in] add annotation to be added. This will be destructed + * as part of the process. + * @returns 0 if ok, something else otherwise + */ +static int +ln_combineAnnot(ln_annot *annot, ln_annot *add) +{ + int r = 0; + ln_annot_op *op, *nextop; + + for(op = add->oproot; op != NULL; op = nextop) { + CHKR(ln_addAnnotOp(annot, op->opc, op->name, op->value)); + nextop = op->next; + free(op); + } + es_deleteStr(add->tag); + free(add); + +done: return r; +} + + +int +ln_addAnnotToSet(ln_annotSet *as, ln_annot *annot) +{ + int r = 0; + ln_annot *aexist; + assert(annot->tag != NULL); + aexist = ln_findAnnot(as, annot->tag); + if(aexist == NULL) { + /* does not yet exist, simply store new annot */ + annot->next = as->aroot; + as->aroot = annot; + } else { /* annotation already exists, combine */ + r = ln_combineAnnot(aexist, annot); + } + return r; +} + + +ln_annot* +ln_newAnnot(es_str_t *tag) +{ + ln_annot *annot; + + if((annot = calloc(1, sizeof(struct ln_annot_s))) == NULL) + goto done; + annot->tag = tag; +done: return annot; +} + + +void +ln_deleteAnnot(ln_annot *annot) +{ + ln_annot_op *op, *nextop; + if(annot == NULL) + goto done; + + es_deleteStr(annot->tag); + for(op = annot->oproot; op != NULL; op = nextop) { + es_deleteStr(op->name); + if(op->value != NULL) + es_deleteStr(op->value); + nextop = op->next; + free(op); + } + free(annot); +done: return; +} + + +int +ln_addAnnotOp(ln_annot *annot, ln_annot_opcode opc, es_str_t *name, es_str_t *value) +{ + int r = -1; + ln_annot_op *node; + + if((node = calloc(1, sizeof(struct ln_annot_op_s))) == NULL) + goto done; + node->opc = opc; + node->name = name; + node->value = value; + + if(annot->oproot != NULL) { + node->next = annot->oproot; + } + annot->oproot = node; + r = 0; + +done: return r; +} + + +/* annotate the event with a specific tag. helper to keep code + * small and easy to follow.
+ */ +static inline int +ln_annotateEventWithTag(ln_ctx ctx, struct json_object *json, es_str_t *tag) +{ + int r=0; + ln_annot *annot; + ln_annot_op *op; + struct json_object *field; + char *cstr; + + if (NULL == (annot = ln_findAnnot(ctx->pas, tag))) + goto done; + for(op = annot->oproot ; op != NULL ; op = op->next) { + if(op->opc == ln_annot_ADD) { + CHKN(cstr = ln_es_str2cstr(&op->value)); + CHKN(field = json_object_new_string(cstr)); + CHKN(cstr = ln_es_str2cstr(&op->name)); + json_object_object_add(json, cstr, field); + } else { + // TODO: implement + } + } + +done: return r; +} + + +int +ln_annotate(ln_ctx ctx, struct json_object *json, struct json_object *tagbucket) +{ + int r = 0; + es_str_t *tag; + struct json_object *tagObj; + const char *tagCstr; + int i; + + ln_dbgprintf(ctx, "ln_annotate called [aroot=%p]", ctx->pas->aroot); + /* shortcut: terminate immediately if nothing to do... */ + if(ctx->pas->aroot == NULL) + goto done; + + /* iterate over tagbucket */ + for (i = json_object_array_length(tagbucket) - 1; i >= 0; i--) { + CHKN(tagObj = json_object_array_get_idx(tagbucket, i)); + CHKN(tagCstr = json_object_get_string(tagObj)); + ln_dbgprintf(ctx, "ln_annotate, current tag %d, cstr %s", i, tagCstr); + CHKN(tag = es_newStrFromCStr(tagCstr, strlen(tagCstr))); + CHKR(ln_annotateEventWithTag(ctx, json, tag)); + es_deleteStr(tag); + } + +done: return r; +} diff --git a/src/annot.h b/src/annot.h new file mode 100644 index 0000000..95fb1b6 --- /dev/null +++ b/src/annot.h @@ -0,0 +1,172 @@ +/** + * @file annot.h + * @brief The annotation set object + * @class ln_annot annot.h + *//* + * Copyright 2011 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is meant to be included by applications using liblognorm. + * For lognorm library files themselves, include "lognorm.h". + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#ifndef LIBLOGNORM_ANNOT_H_INCLUDED +#define LIBLOGNORM_ANNOT_H_INCLUDED +#include + +typedef struct ln_annotSet_s ln_annotSet; +typedef struct ln_annot_s ln_annot; +typedef struct ln_annot_op_s ln_annot_op; +typedef enum {ln_annot_ADD=0, ln_annot_RM=1} ln_annot_opcode; + +/** + * List of annotation operations. + */ +struct ln_annot_op_s { + ln_annot_op *next; + ln_annot_opcode opc; /**< opcode */ + es_str_t *name; + es_str_t *value; +}; + +/** + * annotation object + */ +struct ln_annot_s { + ln_annot *next; /**< used for chaining annotations */ + es_str_t *tag; /**< tag associated for this annotation */ + ln_annot_op *oproot; +}; + +/** + * annotation set object + * + * Note: we do not (yet) use a hash table. 
However, performance should + * be gained by pre-processing rules so that tags directly point into + * the annotation. This is even faster than hash table access. + */ +struct ln_annotSet_s { + ln_annot *aroot; + ln_ctx ctx; /**< save our context for easy dbgprintf et al... */ +}; + +/* Methods */ + + +/** + * Allocates and initializes a new annotation set. + * @memberof ln_annot + * + * @param[in] ctx current library context. This MUST match the + * context of the parent. + * + * @return pointer to new node or NULL on error + */ +ln_annotSet* ln_newAnnotSet(ln_ctx ctx); + + +/** + * Free annotation set and destruct all members. + * @memberof ln_annot + * + * @param[in] as pointer to annotation set to free + */ +void ln_deleteAnnotSet(ln_annotSet *as); + + +/** + * Find annotation inside set based on given tag name. + * @memberof ln_annot + * + * @param[in] as annotation set + * @param[in] tag tag name to look for + * + * @returns NULL if not found, ptr to object otherwise + */ +ln_annot* ln_findAnnot(ln_annotSet *as, es_str_t *tag); + + +/** + * Add annotation to set. + * If an annotation associated with this tag already exists, these + * are combined. If not, a new annotation is added. Note that the + * caller must not access any of the objects passed in to this method + * after it has finished (objects may become deallocated during the + * method). + * @memberof ln_annot + * + * @param[in] as annotation set + * @param[in] annot annotation to add + * + * @returns 0 on success, something else otherwise + */ +int ln_addAnnotToSet(ln_annotSet *as, ln_annot *annot); + + +/** + * Allocates and initializes a new annotation. + * The tag passed in identifies the new annotation. The caller + * no longer owns the tag string after calling this method, so + * it must not access the same copy when the method returns. + * @memberof ln_annot + * + * @param[in] tag tag associated to annot (must not be NULL) + * @return pointer to new node or NULL on error + */ +ln_annot* ln_newAnnot(es_str_t *tag); + + +/** + * Free annotation and destruct all members. + * @memberof ln_annot + * + * @param[in] annot pointer to annot to free + */ +void ln_deleteAnnot(ln_annot *annot); + + +/** + * Add an operation to the annotation set. + * The operation description will be added as entry. + * @memberof ln_annot + * + * @param[in] annot pointer to annot to modify + * @param[in] op operation + * @param[in] name name of field, must NOT be re-used by caller + * @param[in] value value of field, may be NULL (e.g. in remove operation), + * must NOT be re-used by caller + * @returns 0 on success, something else otherwise + */ +int ln_addAnnotOp(ln_annot *anot, ln_annot_opcode opc, es_str_t *name, es_str_t *value); + + +/** + * Annotate an event. + * This adds annotations based on the event's tagbucket. + * @memberof ln_annot + * + * @param[in] ctx current context + * @param[in] event event to annotate (updated with annotations on exit) + * @returns 0 on success, something else otherwise + */ +int ln_annotate(ln_ctx ctx, struct json_object *json, struct json_object *tags); + +#endif /* #ifndef LOGNORM_ANNOT_H_INCLUDED */ diff --git a/src/enc.h b/src/enc.h new file mode 100644 index 0000000..f00c8c9 --- /dev/null +++ b/src/enc.h @@ -0,0 +1,39 @@ +/** + * @file enc.h + * @brief Encoder functions + */ +/* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm.
+ * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ + +#ifndef LIBLOGNORM_ENC_H_INCLUDED +#define LIBLOGNORM_ENC_H_INCLUDED + +int ln_fmtEventToRFC5424(struct json_object *json, es_str_t **str); + +int ln_fmtEventToCSV(struct json_object *json, es_str_t **str, es_str_t *extraData); + +int ln_fmtEventToXML(struct json_object *json, es_str_t **str); + +#endif /* LIBLOGNORM_ENC_H_INCLUDED */ diff --git a/src/enc_csv.c b/src/enc_csv.c new file mode 100644 index 0000000..4f79738 --- /dev/null +++ b/src/enc_csv.c @@ -0,0 +1,220 @@ +/** + * @file enc_csv.c + * Encoder for CSV format. Note: CEE currently think about what a + * CEE-compliant CSV format may look like. As such, the format of + * this output will most probably change once the final decision + * has been made. At this time (2010-12), I do NOT even try to + * stay inline with the discussion. + * + * This file contains code from all related objects that is required in + * order to encode this format. The core idea of putting all of this into + * a single file is that this makes it very straightforward to write + * encoders for different encodings, as all is in one place. + * + */ +/* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2018 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include +#include + +#include + +#include "lognorm.h" +#include "internal.h" +#include "enc.h" + +static char hexdigit[16] = + {'0', '1', '2', '3', '4', '5', '6', '7', '8', + '9', 'A', 'B', 'C', 'D', 'E', 'F' }; + +/* TODO: CSV encoding for Unicode characters is as of RFC4627 not fully + * supported. The algorithm is that we must build the wide character from + * UTF-8 (if char > 127) and build the full 4-octet Unicode character out + * of it. 
Then, this needs to be encoded. Currently, we work on a + * byte-by-byte basis, which simply is incorrect. + * rgerhards, 2010-11-09 + */ +static int +ln_addValue_CSV(const char *buf, es_str_t **str) +{ + int r; + unsigned char c; + es_size_t i; + char numbuf[4]; + int j; + + assert(str != NULL); + assert(*str != NULL); + assert(buf != NULL); + + for(i = 0; i < strlen(buf); i++) { + c = buf[i]; + if((c >= 0x23 && c <= 0x5b) + || (c >= 0x5d /* && c <= 0x10FFFF*/) + || c == 0x20 || c == 0x21) { + /* no need to escape */ + es_addChar(str, c); + } else { + /* we must escape, try RFC4627-defined special sequences first */ + switch(c) { + case '\0': + es_addBuf(str, "\\u0000", 6); + break; + case '\"': + es_addBuf(str, "\\\"", 2); + break; + case '\\': + es_addBuf(str, "\\\\", 2); + break; + case '\010': + es_addBuf(str, "\\b", 2); + break; + case '\014': + es_addBuf(str, "\\f", 2); + break; + case '\n': + es_addBuf(str, "\\n", 2); + break; + case '\r': + es_addBuf(str, "\\r", 2); + break; + case '\t': + es_addBuf(str, "\\t", 2); + break; + default: + /* TODO : proper Unicode encoding (see header comment) */ + for(j = 0 ; j < 4 ; ++j) { + numbuf[3-j] = hexdigit[c % 16]; + c = c / 16; + } + es_addBuf(str, "\\u", 2); + es_addBuf(str, numbuf, 4); + break; + } + } + } + r = 0; + + return r; +} + + +static int +ln_addField_CSV(struct json_object *field, es_str_t **str) +{ + int r, i; + struct json_object *obj; + int needComma = 0; + const char *value; + + assert(field != NULL); + assert(str != NULL); + assert(*str != NULL); + + switch(json_object_get_type(field)) { + case json_type_array: + CHKR(es_addChar(str, '[')); + for (i = json_object_array_length(field) - 1; i >= 0; i--) { + if(needComma) + es_addChar(str, ','); + else + needComma = 1; + CHKN(obj = json_object_array_get_idx(field, i)); + CHKN(value = json_object_get_string(obj)); + CHKR(ln_addValue_CSV(value, str)); + } + CHKR(es_addChar(str, ']')); + break; + case json_type_string: + case json_type_int: + CHKN(value = json_object_get_string(field)); + CHKR(ln_addValue_CSV(value, str)); + break; + case json_type_null: + case json_type_boolean: + case json_type_double: + case json_type_object: + CHKR(es_addBuf(str, "***unsupported type***", sizeof("***unsupported type***")-1)); + break; + default: + CHKR(es_addBuf(str, "***OBJECT***", sizeof("***OBJECT***")-1)); + } + + r = 0; + +done: + return r; +} + + +int +ln_fmtEventToCSV(struct json_object *json, es_str_t **str, es_str_t *extraData) +{ + int r = -1; + int needComma = 0; + struct json_object *field; + char *namelist = NULL, *name, *nn; + + assert(json != NULL); + assert(json_object_is_type(json, json_type_object)); + + if((*str = es_newStr(256)) == NULL) + goto done; + if(extraData == NULL) + goto done; + + CHKN(namelist = es_str2cstr(extraData, NULL)); + + for (name = namelist; name != NULL; name = nn) { + for (nn = name; *nn != '\0' && *nn != ',' && *nn != ' '; nn++) + { /* do nothing */ } + if (*nn == '\0') { + nn = NULL; + } else { + *nn = '\0'; + nn++; + } + json_object_object_get_ex(json, name, &field); + if (needComma) { + CHKR(es_addChar(str, ',')); + } else { + needComma = 1; + } + if (field != NULL) { + CHKR(es_addChar(str, '"')); + ln_addField_CSV(field, str); + CHKR(es_addChar(str, '"')); + } + } + r = 0; +done: + if (namelist != NULL) + free(namelist); + return r; +} diff --git a/src/enc_syslog.c b/src/enc_syslog.c new file mode 100644 index 0000000..04378be --- /dev/null +++ b/src/enc_syslog.c @@ -0,0 +1,209 @@ +/** + * @file enc_syslog.c + * Encoder for syslog format. 
+ * This file contains code from all related objects that is required in + * order to encode syslog format. The core idea of putting all of this into + * a single file is that this makes it very straightforward to write + * encoders for different encodings, as all is in one place. + */ +/* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2016 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include +#include + +#include + +#include "internal.h" +#include "liblognorm.h" +#include "enc.h" + +static int +ln_addValue_Syslog(const char *value, es_str_t **str) +{ + int r; + es_size_t i; + + assert(str != NULL); + assert(*str != NULL); + assert(value != NULL); + + for(i = 0; i < strlen(value); i++) { + switch(value[i]) { + case '\0': + es_addChar(str, '\\'); + es_addChar(str, '0'); + break; + case '\n': + es_addChar(str, '\\'); + es_addChar(str, 'n'); + break; + /* TODO : add rest of control characters here... */ + case ',': /* comma is CEE-reserved for lists */ + es_addChar(str, '\\'); + es_addChar(str, ','); + break; +#if 0 /* alternative encoding for discussion */ + case '^': /* CEE-reserved for lists */ + es_addChar(str, '\\'); + es_addChar(str, '^'); + break; +#endif + /* at this layer ... do we need to think about transport + * encoding at all? Or simply leave it to the transport agent? 
+ */ + case '\\': /* RFC5424 reserved */ + es_addChar(str, '\\'); + es_addChar(str, '\\'); + break; + case ']': /* RFC5424 reserved */ + es_addChar(str, '\\'); + es_addChar(str, ']'); + break; + case '\"': /* RFC5424 reserved */ + es_addChar(str, '\\'); + es_addChar(str, '\"'); + break; + default: + es_addChar(str, value[i]); + break; + } + } + r = 0; + + return r; +} + + +static int +ln_addField_Syslog(char *name, struct json_object *field, es_str_t **str) +{ + int r; + const char *value; + int needComma = 0; + struct json_object *obj; + int i; + + assert(field != NULL); + assert(str != NULL); + assert(*str != NULL); + + CHKR(es_addBuf(str, name, strlen(name))); + CHKR(es_addBuf(str, "=\"", 2)); + switch(json_object_get_type(field)) { + case json_type_array: + for (i = json_object_array_length(field) - 1; i >= 0; i--) { + if(needComma) + es_addChar(str, ','); + else + needComma = 1; + CHKN(obj = json_object_array_get_idx(field, i)); + CHKN(value = json_object_get_string(obj)); + CHKR(ln_addValue_Syslog(value, str)); + } + break; + case json_type_string: + case json_type_int: + CHKN(value = json_object_get_string(field)); + CHKR(ln_addValue_Syslog(value, str)); + break; + case json_type_null: + case json_type_boolean: + case json_type_double: + case json_type_object: + CHKR(es_addBuf(str, "***unsupported type***", sizeof("***unsupported type***")-1)); + break; + default: + CHKR(es_addBuf(str, "***OBJECT***", sizeof("***OBJECT***")-1)); + } + CHKR(es_addChar(str, '\"')); + r = 0; + +done: + return r; +} + + +static inline int +ln_addTags_Syslog(struct json_object *taglist, es_str_t **str) +{ + int r = 0; + struct json_object *tagObj; + int needComma = 0; + const char *tagCstr; + int i; + + assert(json_object_is_type(taglist, json_type_array)); + + CHKR(es_addBuf(str, " event.tags=\"", 13)); + for (i = json_object_array_length(taglist) - 1; i >= 0; i--) { + if(needComma) + es_addChar(str, ','); + else + needComma = 1; + CHKN(tagObj = json_object_array_get_idx(taglist, i)); + CHKN(tagCstr = json_object_get_string(tagObj)); + CHKR(es_addBuf(str, (char*)tagCstr, strlen(tagCstr))); + } + es_addChar(str, '"'); + +done: return r; +} + + +int +ln_fmtEventToRFC5424(struct json_object *json, es_str_t **str) +{ + int r = -1; + struct json_object *tags; + + assert(json != NULL); + assert(json_object_is_type(json, json_type_object)); + if((*str = es_newStr(256)) == NULL) + goto done; + + es_addBuf(str, "[cee@115", 8); + + if(json_object_object_get_ex(json, "event.tags", &tags)) { + CHKR(ln_addTags_Syslog(tags, str)); + } + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + char *const name = (char*)json_object_iter_peek_name(&it); + if (strcmp(name, "event.tags")) { + es_addChar(str, ' '); + ln_addField_Syslog(name, json_object_iter_peek_value(&it), str); + } + json_object_iter_next(&it); + } + es_addChar(str, ']'); + +done: + return r; +} diff --git a/src/enc_xml.c b/src/enc_xml.c new file mode 100644 index 0000000..4cd7276 --- /dev/null +++ b/src/enc_xml.c @@ -0,0 +1,230 @@ +/** + * @file enc-xml.c + * Encoder for XML format. + * + * This file contains code from all related objects that is required in + * order to encode this format. The core idea of putting all of this into + * a single file is that this makes it very straightforward to write + * encoders for different encodings, as all is in one place. 
+ * + */ +/* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2016 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include +#include + +#include "lognorm.h" +#include "internal.h" +#include "enc.h" + +#if 0 +static char hexdigit[16] = + {'0', '1', '2', '3', '4', '5', '6', '7', '8', + '9', 'A', 'B', 'C', 'D', 'E', 'F' }; +#endif + +/* TODO: XML encoding for Unicode characters is as of RFC4627 not fully + * supported. The algorithm is that we must build the wide character from + * UTF-8 (if char > 127) and build the full 4-octet Unicode character out + * of it. Then, this needs to be encoded. Currently, we work on a + * byte-by-byte basis, which simply is incorrect. + * rgerhards, 2010-11-09 + */ +static int +ln_addValue_XML(const char *value, es_str_t **str) +{ + int r; + unsigned char c; + es_size_t i; +#if 0 + char numbuf[4]; + int j; +#endif + + assert(str != NULL); + assert(*str != NULL); + assert(value != NULL); + // TODO: support other types! 
+ es_addBuf(str, "", 7); + + for(i = 0 ; i < strlen(value) ; ++i) { + c = value[i]; + switch(c) { + case '\0': + es_addBuf(str, "�", 5); + break; +#if 0 + case '\n': + es_addBuf(str, " ", 5); + break; + case '\r': + es_addBuf(str, " ", 5); + break; + case '\t': + es_addBuf(str, "&x08;", 5); + break; + case '\"': + es_addBuf(str, """, 6); + break; +#endif + case '<': + es_addBuf(str, "<", 4); + break; + case '&': + es_addBuf(str, "&", 5); + break; +#if 0 + case ',': + es_addBuf(str, "\\,", 2); + break; + case '\'': + es_addBuf(str, "'", 6); + break; +#endif + default: + es_addChar(str, c); +#if 0 + /* TODO : proper Unicode encoding (see header comment) */ + for(j = 0 ; j < 4 ; ++j) { + numbuf[3-j] = hexdigit[c % 16]; + c = c / 16; + } + es_addBuf(str, "\\u", 2); + es_addBuf(str, numbuf, 4); + break; +#endif + } + } + es_addBuf(str, "", 8); + r = 0; + + return r; +} + + +static int +ln_addField_XML(char *name, struct json_object *field, es_str_t **str) +{ + int r; + int i; + const char *value; + struct json_object *obj; + + assert(field != NULL); + assert(str != NULL); + assert(*str != NULL); + + CHKR(es_addBuf(str, "", 2)); + + switch(json_object_get_type(field)) { + case json_type_array: + for (i = json_object_array_length(field) - 1; i >= 0; i--) { + CHKN(obj = json_object_array_get_idx(field, i)); + CHKN(value = json_object_get_string(obj)); + CHKR(ln_addValue_XML(value, str)); + } + break; + case json_type_string: + case json_type_int: + CHKN(value = json_object_get_string(field)); + CHKR(ln_addValue_XML(value, str)); + break; + case json_type_null: + case json_type_boolean: + case json_type_double: + case json_type_object: + CHKR(es_addBuf(str, "***unsupported type***", sizeof("***unsupported type***")-1)); + break; + default: + CHKR(es_addBuf(str, "***OBJECT***", sizeof("***OBJECT***")-1)); + } + + CHKR(es_addBuf(str, "", 8)); + r = 0; + +done: + return r; +} + + +static inline int +ln_addTags_XML(struct json_object *taglist, es_str_t **str) +{ + int r = 0; + struct json_object *tagObj; + const char *tagCstr; + int i; + + CHKR(es_addBuf(str, "", 12)); + for (i = json_object_array_length(taglist) - 1; i >= 0; i--) { + CHKR(es_addBuf(str, "", 5)); + CHKN(tagObj = json_object_array_get_idx(taglist, i)); + CHKN(tagCstr = json_object_get_string(tagObj)); + CHKR(es_addBuf(str, (char*)tagCstr, strlen(tagCstr))); + CHKR(es_addBuf(str, "", 6)); + } + CHKR(es_addBuf(str, "", 13)); + +done: return r; +} + + +int +ln_fmtEventToXML(struct json_object *json, es_str_t **str) +{ + int r = -1; + struct json_object *tags; + + assert(json != NULL); + assert(json_object_is_type(json, json_type_object)); + + if((*str = es_newStr(256)) == NULL) + goto done; + + es_addBuf(str, "", 7); + if(json_object_object_get_ex(json, "event.tags", &tags)) { + CHKR(ln_addTags_XML(tags, str)); + } + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + char *const name = (char*) json_object_iter_peek_name(&it); + if (strcmp(name, "event.tags")) { + ln_addField_XML(name, json_object_iter_peek_value(&it), str); + } + json_object_iter_next(&it); + } + + es_addBuf(str, "", 8); + +done: + return r; +} diff --git a/src/helpers.h b/src/helpers.h new file mode 100644 index 0000000..49bc30c --- /dev/null +++ b/src/helpers.h @@ -0,0 +1,18 @@ +/** + * @file pdag.c + * @brief Implementation of the parse dag object. + * @class ln_pdag pdag.h + *//* + * Copyright 2015 by Rainer Gerhards and Adiscon GmbH. 
+ * + * Released under ASL 2.0. + */ +#ifndef LIBLOGNORM_HELPERS_H +#define LIBLOGNORM_HELPERS_H + +static inline int myisdigit(char c) +{ + return (c >= '0' && c <= '9'); +} + +#endif diff --git a/src/internal.h b/src/internal.h new file mode 100644 index 0000000..92f00a9 --- /dev/null +++ b/src/internal.h @@ -0,0 +1,111 @@ +/** + * @file internal.h + * @brief Internal things just needed for building the library, but + * not to be installed. + *//** + * @mainpage + * Liblognorm is an easy to use and fast samples-based log normalization + * library. + * + * It can be passed a stream of arbitrary log messages, one at a time, and for + * each message it will output well-defined name-value pairs and a set of + * tags describing the message. + * + * For further details, see it's initial announcement available at + * http://blog.gerhards.net/2010/10/introducing-liblognorm.html + * + * The public interface of this library is describe in liblognorm.h. + * + * Liblognorm fully supports Unicode. Like most Linux tools, it operates + * on UTF-8 natively, called "passive mode". This was decided because we + * so can keep the size of data structures small while still supporting + * all of the world's languages (actually more than when we did UCS-2). + * + * At the technical level, we can handle UTF-8 multibyte sequences transparently. + * Liblognorm needs to look at a few US-ASCII characters to do the + * sample base parsing (things to indicate fields), so this is no + * issue. Inside the parse tree, a multibyte sequence can simple be processed + * as if it were a sequence of different characters that make up a their + * own symbol each. In fact, this even allows for somewhat greater parsing + * speed. + *//* + * + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2018 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#ifndef INTERNAL_H_INCLUDED +#define INTERNAL_H_INCLUDED + +/* the jump-misses-init gcc warning is overdoing when we jump to the + * exit of a function to get proper finalization. So let's disable it. + * rgerhards, 2018-04-25 + */ +#pragma GCC diagnostic ignored "-Wjump-misses-init" + +#include "liblognorm.h" + +#include + +/* we need to turn off this warning, as it also comes up in C99 mode, which + * we use. 
+ */ +#pragma GCC diagnostic ignored "-Wdeclaration-after-statement" + +/* support for simple error checking */ + +#define CHKR(x) \ + if((r = (x)) != 0) goto done + +#define CHKN(x) \ + if((x) == NULL) { \ + r = LN_NOMEM; \ + goto done; \ + } + +#define FAIL(e) {r = (e); goto done;} + +static inline char* ln_es_str2cstr(es_str_t **str) +{ + int r = -1; + char *buf; + + if (es_strlen(*str) == (*str)->lenBuf) { + CHKR(es_extendBuf(str, 1)); + } + CHKN(buf = (char*)es_getBufAddr(*str)); + buf[es_strlen(*str)] = '\0'; + return buf; +done: + return NULL; +} + +const char * ln_DataForDisplayCharTo(__attribute__((unused)) ln_ctx ctx, void *const pdata); +const char * ln_DataForDisplayLiteral(__attribute__((unused)) ln_ctx ctx, void *const pdata); +const char * ln_JsonConfLiteral(__attribute__((unused)) ln_ctx ctx, void *const pdata); + +/* here we add some stuff from the compatibility layer */ +#ifndef HAVE_STRNDUP +char * strndup(const char *s, size_t n); +#endif + +#endif /* #ifndef INTERNAL_H_INCLUDED */ diff --git a/src/liblognorm.c b/src/liblognorm.c new file mode 100644 index 0000000..12c057b --- /dev/null +++ b/src/liblognorm.c @@ -0,0 +1,190 @@ +/* This file implements the liblognorm API. + * See header file for descriptions. + * + * liblognorm - a fast samples-based log normalization library + * Copyright 2013 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include + +#include "liblognorm.h" +#include "lognorm.h" +#include "annot.h" +#include "samp.h" +#include "v1_liblognorm.h" +#include "v1_ptree.h" + +#define CHECK_CTX \ + if(ctx->objID != LN_ObjID_CTX) { \ + r = -1; \ + goto done; \ + } + +const char * +ln_version(void) +{ + return VERSION; +} + +int +ln_hasAdvancedStats(void) +{ +#ifdef ADVANCED_STATS + return 1; +#else + return 0; +#endif +} + +ln_ctx +ln_initCtx(void) +{ + ln_ctx ctx; + if((ctx = calloc(1, sizeof(struct ln_ctx_s))) == NULL) + goto done; + +#ifdef HAVE_JSON_GLOBAL_SET_STRING_HASH + json_global_set_string_hash(JSON_C_STR_HASH_PERLLIKE); +#endif +#ifdef HAVE_JSON_GLOBAL_SET_PRINTBUF_INITIAL_SIZE + json_global_set_printbuf_initial_size(2048); +#endif + ctx->objID = LN_ObjID_CTX; + ctx->dbgCB = NULL; + ctx->opts = 0; + + /* we add an root for the empty word, this simplifies parse + * dag handling. 
+ */ + if((ctx->pdag = ln_newPDAG(ctx)) == NULL) { + free(ctx); + ctx = NULL; + goto done; + } + /* same for annotation set */ + if((ctx->pas = ln_newAnnotSet(ctx)) == NULL) { + ln_pdagDelete(ctx->pdag); + free(ctx); + ctx = NULL; + goto done; + } + +done: + return ctx; +} + +void +ln_setCtxOpts(ln_ctx ctx, const unsigned opts) { + ctx->opts |= opts; +} + + +int +ln_exitCtx(ln_ctx ctx) +{ + int r = 0; + + CHECK_CTX; + + ln_dbgprintf(ctx, "exitCtx %p", ctx); + ctx->objID = LN_ObjID_None; /* prevent double free */ + /* support for old cruft */ + if(ctx->ptree != NULL) + ln_deletePTree(ctx->ptree); + /* end support for old cruft */ + if(ctx->pdag != NULL) + ln_pdagDelete(ctx->pdag); + for(int i = 0 ; i < ctx->nTypes ; ++i) { + free((void*)ctx->type_pdags[i].name); + ln_pdagDelete(ctx->type_pdags[i].pdag); + } + free(ctx->type_pdags); + if(ctx->rulePrefix != NULL) + es_deleteStr(ctx->rulePrefix); + if(ctx->pas != NULL) + ln_deleteAnnotSet(ctx->pas); + free(ctx); +done: + return r; +} + + +int +ln_setDebugCB(ln_ctx ctx, void (*cb)(void*, const char*, size_t), void *cookie) +{ + int r = 0; + + CHECK_CTX; + ctx->dbgCB = cb; + ctx->dbgCookie = cookie; +done: + return r; +} + + +int +ln_setErrMsgCB(ln_ctx ctx, void (*cb)(void*, const char*, size_t), void *cookie) +{ + int r = 0; + + CHECK_CTX; + ctx->errmsgCB = cb; + ctx->errmsgCookie = cookie; +done: + return r; +} + +int +ln_loadSamples(ln_ctx ctx, const char *file) +{ + int r = 0; + const char *tofree; + CHECK_CTX; + ctx->conf_file = tofree = strdup(file); + ctx->conf_ln_nbr = 0; + ++ctx->include_level; + r = ln_sampLoad(ctx, file); + --ctx->include_level; + free((void*)tofree); + ctx->conf_file = NULL; +done: + return r; +} + +int +ln_loadSamplesFromString(ln_ctx ctx, const char *string) +{ + int r = 0; + const char *tofree; + CHECK_CTX; + ctx->conf_file = tofree = strdup("--NO-FILE--"); + ctx->conf_ln_nbr = 0; + ++ctx->include_level; + r = ln_sampLoadFromString(ctx, string); + --ctx->include_level; + free((void*)tofree); + ctx->conf_file = NULL; +done: + return r; +} diff --git a/src/liblognorm.h b/src/liblognorm.h new file mode 100644 index 0000000..3b7ebd8 --- /dev/null +++ b/src/liblognorm.h @@ -0,0 +1,267 @@ +/** + * @file liblognorm.h + * @brief The public liblognorm API. + * + * Functions other than those defined here MUST not be called by + * a liblognorm "user" application. + * + * This file is meant to be included by applications using liblognorm. + * For lognorm library files themselves, include "lognorm.h". + *//** + * @mainpage + * Liblognorm is an easy to use and fast samples-based log normalization + * library. + * + * It can be passed a stream of arbitrary log messages, one at a time, and for + * each message it will output well-defined name-value pairs and a set of + * tags describing the message. + * + * For further details, see it's initial announcement available at + * http://blog.gerhards.net/2010/10/introducing-liblognorm.html + * + * The public interface of this library is describe in liblognorm.h. + * + * Liblognorm fully supports Unicode. Like most Linux tools, it operates + * on UTF-8 natively, called "passive mode". This was decided because we + * so can keep the size of data structures small while still supporting + * all of the world's languages (actually more than when we did UCS-2). + * + * At the technical level, we can handle UTF-8 multibyte sequences transparently. + * Liblognorm needs to look at a few US-ASCII characters to do the + * sample base parsing (things to indicate fields), so this is no + * issue. 
Inside the parse tree, a multibyte sequence can simple be processed + * as if it were a sequence of different characters that make up a their + * own symbol each. In fact, this even allows for somewhat greater parsing + * speed. + *//* + * + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2017 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#ifndef LIBLOGNORM_H_INCLUDED +#define LIBLOGNORM_H_INCLUDED +#include /* we need size_t */ +#include + +/* error codes */ +#define LN_NOMEM -1 +#define LN_INVLDFDESCR -1 +#define LN_BADCONFIG -250 +#define LN_BADPARSERSTATE -500 +#define LN_WRONGPARSER -1000 + +#define LN_RB_LINE_TOO_LONG -1001 +#define LN_OVER_SIZE_LIMIT -1002 + +/** + * The library context descriptor. + * This is used to permit multiple independent instances of the + * library to be called within a single program. This is most + * useful for plugin-based architectures. + */ +typedef struct ln_ctx_s* ln_ctx; + +/* API */ +/** + * Return library version string. + * + * Returns the version of the currently used library. + * + * @return Zero-Terminated library version string. + */ +/* Note: this MUST NOT be inline to make sure the actual library + * has the right version, not just what was used to compile! + */ +const char *ln_version(void); + +/** + * Return if library is build with advanced statistics + * activated. + * + * @return 1 if advanced stats are active, 0 if not + */ +int ln_hasAdvancedStats(void); + +/** + * Initialize a library context. + * + * To prevent memory leaks, ln_exitCtx() must be called on a library + * context that is no longer needed. + * + * @return new library context or NULL if an error occured + */ +ln_ctx ln_initCtx(void); + +/** + * Inherit control attributes from a library context. + * + * This does not copy the parse-tree, but does copy + * behaviour-controling attributes such as enableRegex. + * + * Just as with ln_initCtx, ln_exitCtx() must be called on a library + * context that is no longer needed. + * + * @return new library context or NULL if an error occured + */ +ln_ctx ln_inherittedCtx(ln_ctx parent); + +/** + * Discard a library context. + * + * Free's the ressources associated with the given library context. It + * MUST NOT be accessed after calling this function. + * + * @param ctx The context to be discarded. + * + * @return Returns zero on success, something else otherwise. 
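+ *
+ * Illustrative lifecycle sketch (not normative; "rules.rb" and msg are
+ * placeholders, error handling omitted):
+ *
+ *   ln_ctx ctx = ln_initCtx();
+ *   ln_loadSamples(ctx, "rules.rb");
+ *   struct json_object *event = NULL;
+ *   ln_normalize(ctx, msg, strlen(msg), &event);
+ *   // ... use event, then release it with json_object_put(event)
+ *   ln_exitCtx(ctx);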
+ */ +int ln_exitCtx(ln_ctx ctx); + + +/* binary values, so that we can "or" them together */ +#define LN_CTXOPT_ALLOW_REGEX 0x01 /**< permit regex matching */ +#define LN_CTXOPT_ADD_EXEC_PATH 0x02 /**< add exec_path attribute (time-consuming!) */ +#define LN_CTXOPT_ADD_ORIGINALMSG 0x04 /**< always add original message to output + (not just in error case) */ +#define LN_CTXOPT_ADD_RULE 0x08 /**< add mockup rule */ +#define LN_CTXOPT_ADD_RULE_LOCATION 0x10 /**< add rule location (file, lineno) to metadata */ +/** + * Set options on ctx. + * + * @param ctx The context to be modified. + * @param opts a potentially or-ed list of options, see LN_CTXOPT_* + */ +void +ln_setCtxOpts(ln_ctx ctx, unsigned opts); + + +/** + * Set a debug message handler (callback). + * + * Liblognorm can provide helpful information for debugging + * - it's internal processing + * - the way a log message is being normalized + * + * It does so by emiting "interesting" information about its processing + * at various stages. A caller can obtain this information by registering + * an entry point. When done so, liblognorm will call the entry point + * whenever it has something to emit. Note that debugging can be rather + * verbose. + * + * The callback will be called with the following three parameters in that order: + * - the caller-provided cookie + * - a zero-terminated string buffer + * - the length of the string buffer, without the trailing NUL byte + * + * @note + * The provided callback function must not call any liblognorm + * APIs except when specifically flagged as safe for calling by a debug + * callback handler. + * + * @param[in] ctx The library context to apply callback to. + * @param[in] cb The function to be called for debugging + * @param[in] cookie Opaque cookie to be passed down to debug handler. Can be + * used for some state tracking by the caller. This is defined as + * void* to support pointers. To play it safe, a pointer should be + * passed (but advantorous folks may also use an unsigned). + * + * @return Returns zero on success, something else otherwise. + */ +int ln_setDebugCB(ln_ctx ctx, void (*cb)(void*, const char*, size_t), void *cookie); + +/** + * Set a error message handler (callback). + * + * If set, this is used to emit error messages of interest to the user, e.g. + * on failures during rulebase load. It is suggested that a caller uses this + * feedback to aid its users in resolving issues. + * Its semantics are otherwise exactly the same like ln_setDebugCB(). + */ +int ln_setErrMsgCB(ln_ctx ctx, void (*cb)(void*, const char*, size_t), void *cookie); + + +/** + * enable or disable debug mode. + * + * @param[in] ctx context + * @param[in] b boolean 0 - disable debug mode, 1 - enable debug mode + */ +void ln_enableDebug(ln_ctx ctx, int i); + + +/** + * Load a (log) sample file. + * + * The file must contain log samples in syntactically correct format. Samples are added + * to set already loaded in the current context. If there is a sample with duplicate + * semantics, this sample will be ignored. Most importantly, this can \b not be used + * to change tag assignments for a given sample. + * + * @param[in] ctx The library context to apply callback to. + * @param[in] file Name of file to be loaded. + * + * @return Returns zero on success, something else otherwise. + */ +int ln_loadSamples(ln_ctx ctx, const char *file); + +/** + * Load a rulebase via a string. + * + * Note: this can only load v2 samples, v1 is NOT supported. + * + * @param[in] ctx The library context to apply callback to. 
+ * @param[in] string The string with the actual rulebase. + * + * @return Returns zero on success, something else otherwise. + */ +int ln_loadSamplesFromString(ln_ctx ctx, const char *string); + +/** + * Normalize a message. + * + * This is the main library entry point. It is called with a message + * to normalize and will return a normalized in-memory representation + * of it. + * + * If an error occurs, the function returns -1. In that case, an + * in-memory event representation has been generated if event is + * non-NULL. In that case, the event contains further error details in + * normalized form. + * + * @note + * This function works on byte-counted strings and as such is able to + * process NUL bytes if they occur inside the message. On the other hand, + * this means the the correct messages size, \b excluding the NUL byte, + * must be provided. + * + * @param[in] ctx The library context to use. + * @param[in] str The message string (see note above). + * @param[in] strLen The length of the message in bytes. + * @param[out] json_p A new event record or NULL if an error occured. Must be + * destructed if no longer needed. + * + * @return Returns zero on success, something else otherwise. + */ +int ln_normalize(ln_ctx ctx, const char *str, const size_t strLen, struct json_object **json_p); + +#endif /* #ifndef LOGNORM_H_INCLUDED */ diff --git a/src/lognorm-features.h b/src/lognorm-features.h new file mode 100644 index 0000000..16bab15 --- /dev/null +++ b/src/lognorm-features.h @@ -0,0 +1,3 @@ +#if 0 +#define LOGNORM_REGEX_SUPPORTED 1 +#endif diff --git a/src/lognorm-features.h.in b/src/lognorm-features.h.in new file mode 100644 index 0000000..0bcd410 --- /dev/null +++ b/src/lognorm-features.h.in @@ -0,0 +1,3 @@ +#if @FEATURE_REGEXP@ +#define LOGNORM_REGEX_SUPPORTED 1 +#endif diff --git a/src/lognorm.c b/src/lognorm.c new file mode 100644 index 0000000..2c40ea0 --- /dev/null +++ b/src/lognorm.c @@ -0,0 +1,143 @@ +/* liblognorm - a fast samples-based log normalization library + * Copyright 2010 by Rainer Gerhards and Adiscon GmbH. + * + * This file is part of liblognorm. + * + * Released under ASL 2.0 + */ +#include "config.h" +#include +#include +#include +#include + +#include "liblognorm.h" +#include "lognorm.h" + +/* Code taken from rsyslog ASL 2.0 code. + * From varmojfekoj's mail on why he provided rs_strerror_r(): + * There are two problems with strerror_r(): + * I see you've rewritten some of the code which calls it to use only + * the supplied buffer; unfortunately the GNU implementation sometimes + * doesn't use the buffer at all and returns a pointer to some + * immutable string instead, as noted in the man page. + * + * The other problem is that on some systems strerror_r() has a return + * type of int. + * + * So I've written a wrapper function rs_strerror_r(), which should + * take care of all this and be used instead. + */ +static char * +rs_strerror_r(const int errnum, char *const buf, const size_t buflen) { +#ifndef HAVE_STRERROR_R + char *pszErr; + pszErr = strerror(errnum); + snprintf(buf, buflen, "%s", pszErr); +#else +# ifdef STRERROR_R_CHAR_P + char *p = strerror_r(errnum, buf, buflen); + if (p != buf) { + strncpy(buf, p, buflen); + buf[buflen - 1] = '\0'; + } +# else + strerror_r(errnum, buf, buflen); +# endif +#endif + return buf; +} +/** + * Generate some debug message and call the caller provided callback. + * + * Will first check if a user callback is registered. If not, returns + * immediately. + */ +void +ln_dbgprintf(ln_ctx ctx, const char *fmt, ...) 
+{ + va_list ap; + char buf[8*1024]; + size_t lenBuf; + + if(ctx->dbgCB == NULL) + goto done; + + va_start(ap, fmt); + lenBuf = vsnprintf(buf, sizeof(buf), fmt, ap); + va_end(ap); + if(lenBuf >= sizeof(buf)) { + /* prevent buffer overruns and garbagge display */ + buf[sizeof(buf) - 5] = '.'; + buf[sizeof(buf) - 4] = '.'; + buf[sizeof(buf) - 3] = '.'; + buf[sizeof(buf) - 2] = '\n'; + buf[sizeof(buf) - 1] = '\0'; + lenBuf = sizeof(buf) - 1; + } + + ctx->dbgCB(ctx->dbgCookie, buf, lenBuf); +done: return; +} + +/** + * Generate error message and call the caller provided callback. + * eno is the OS errno. If non-zero, the OS error description + * will be added after the user-provided string. + * + * Will first check if a user callback is registered. If not, returns + * immediately. + */ +void +ln_errprintf(const ln_ctx ctx, const int eno, const char *fmt, ...) +{ + va_list ap; + char buf[8*1024]; + char errbuf[1024]; + char finalbuf[9*1024]; + size_t lenBuf; + char *msg; + + if(ctx->errmsgCB == NULL) + goto done; + + va_start(ap, fmt); + lenBuf = vsnprintf(buf, sizeof(buf), fmt, ap); + va_end(ap); + if(lenBuf >= sizeof(buf)) { + /* prevent buffer overrruns and garbagge display */ + buf[sizeof(buf) - 5] = '.'; + buf[sizeof(buf) - 4] = '.'; + buf[sizeof(buf) - 3] = '.'; + buf[sizeof(buf) - 2] = '\n'; + buf[sizeof(buf) - 1] = '\0'; + lenBuf = sizeof(buf) - 1; + } + + if(eno != 0) { + rs_strerror_r(eno, errbuf, sizeof(errbuf)); + lenBuf = snprintf(finalbuf, sizeof(finalbuf), "%s: %s", buf, errbuf); + msg = finalbuf; + } else { + msg = buf; + } + + if(ctx->conf_file != NULL) { + /* error during config processing, add line info */ + const char *const m = strdup(msg); + lenBuf = snprintf(finalbuf, sizeof(finalbuf), "rulebase file %s[%d]: %s", + ctx->conf_file, ctx->conf_ln_nbr, m); + msg = finalbuf; + free((void*) m); + } + + ctx->errmsgCB(ctx->dbgCookie, msg, lenBuf); + ln_dbgprintf(ctx, "%s", msg); +done: return; +} + +void +ln_enableDebug(ln_ctx ctx, int i) +{ + ctx->debug = i & 0x01; +} diff --git a/src/lognorm.h b/src/lognorm.h new file mode 100644 index 0000000..92cb732 --- /dev/null +++ b/src/lognorm.h @@ -0,0 +1,85 @@ +/** + * @file lognorm.h + * @brief Private data structures used by the liblognorm API. + *//* + * + * liblognorm - a fast samples-based log normalization library + * Copyright 2010 by Rainer Gerhards and Adiscon GmbH. + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. 
+ */ +#ifndef LIBLOGNORM_LOGNORM_HINCLUDED +#define LIBLOGNORM_LOGNORM_HINCLUDED +#include /* we need size_t */ +#include "liblognorm.h" +#include "pdag.h" +#include "annot.h" + +/* some limits */ +#define MAX_FIELDNAME_LEN 1024 +#define MAX_TYPENAME_LEN 1024 + +#define LN_ObjID_None 0xFEFE0001 +#define LN_ObjID_CTX 0xFEFE0001 + +struct ln_type_pdag { + const char *name; + ln_pdag *pdag; +}; + +struct ln_ctx_s { + unsigned objID; /**< a magic number to prevent some memory addressing errors */ + void (*dbgCB)(void *cookie, const char *msg, size_t lenMsg); + /**< user-provided debug output callback */ + void *dbgCookie; /**< cookie to be passed to debug callback */ + void (*errmsgCB)(void *cookie, const char *msg, size_t lenMsg); + /**< user-provided error message callback */ + void *errmsgCookie; /**< cookie to be passed to error message callback */ + ln_pdag *pdag; /**< parse dag being used by this context */ + ln_annotSet *pas; /**< associated set of annotations */ + unsigned nNodes; /**< number of nodes in our parse tree */ + unsigned char debug; /**< boolean: are we in debug mode? */ + es_str_t *rulePrefix; /**< work variable for loading rule bases + * this is the prefix string that will be prepended + * to all rules before they are submitted to tree + * building. + */ + unsigned opts; /**< specific options, see LN_CTXOPTS_* defines */ + struct ln_type_pdag *type_pdags; /**< array of our type pdags */ + int nTypes; /**< number of type pdags */ + int version; /**< 1 or 2, depending on rulebase/algo version */ + + /* here follows stuff for the v1 subsystem -- do NOT make any changes + * down here. This is strictly read-only. May also be removed some time in + * the future. + */ + struct ln_ptree *ptree; + /* end old cruft */ + /* things for config processing / error message during it */ + int include_level; /**< 1 for main rulebase file, higher for include levels */ + const char *conf_file; /**< currently open config file or NULL, if none */ + unsigned int conf_ln_nbr; /**< current config file line number */ +}; + +void ln_dbgprintf(ln_ctx ctx, const char *fmt, ...) __attribute__((format(printf, 2, 3))); +void ln_errprintf(ln_ctx ctx, const int eno, const char *fmt, ...) __attribute__((format(printf, 3, 4))); + +#define LN_DBGPRINTF(ctx, ...) if(ctx->dbgCB != NULL) { ln_dbgprintf(ctx, __VA_ARGS__); } +//#define LN_DBGPRINTF(ctx, ...) +#endif /* #ifndef LIBLOGNORM_LOGNORM_HINCLUDED */ diff --git a/src/lognormalizer.c b/src/lognormalizer.c new file mode 100644 index 0000000..ade246c --- /dev/null +++ b/src/lognormalizer.c @@ -0,0 +1,526 @@ +/** + * @file normalizer.c + * @brief A small tool to normalize data. + * + * This is the most basic example demonstrating how to use liblognorm. + * It loads log samples from the files specified on the command line, + * reads to-be-normalized data from stdin and writes the normalized + * form to stdout. Besides being an example, it also carries out useful + * processing. + * + * @author Rainer Gerhards + * + *//* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2016 by Rainer Gerhards and Adiscon GmbH. + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. 
+ * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include + +#include "liblognorm.h" +#include "lognorm.h" +#include "enc.h" + +/* we need to turn off this warning, as it also comes up in C99 mode, which + * we use. + */ +#pragma GCC diagnostic ignored "-Wdeclaration-after-statement" + +static ln_ctx ctx; + +static int verbose = 0; +#define OUTPUT_PARSED_RECS 0x01 +#define OUTPUT_UNPARSED_RECS 0x02 +static int recOutput = OUTPUT_PARSED_RECS | OUTPUT_UNPARSED_RECS; + /**< controls which records to output */ +static int outputSummaryLine = 0; +static int outputNbrUnparsed = 0; +static int addErrLineNbr = 0; /**< add line number info to unparsed events */ +static int flatTags = 0; /**< print event.tags in JSON? */ +static FILE *fpDOT; +static es_str_t *encFmt = NULL; /**< a format string for encoder use */ +static es_str_t *mandatoryTag = NULL; /**< tag which must be given so that mesg will + be output. NULL=all */ +static enum { f_syslog, f_json, f_xml, f_csv, f_raw } outfmt = f_json; + +static void +errCallBack(void __attribute__((unused)) *cookie, const char *msg, + size_t __attribute__((unused)) lenMsg) +{ + fprintf(stderr, "liblognorm error: %s\n", msg); +} + +static void +dbgCallBack(void __attribute__((unused)) *cookie, const char *msg, + size_t __attribute__((unused)) lenMsg) +{ + fprintf(stderr, "liblognorm: %s\n", msg); +} + +static void +complain(const char *errmsg) +{ + fprintf(stderr, "%s\n", errmsg); +} + + +/* rawmsg is, as the name says, the raw message, in case we have + * "raw" formatter requested. 
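+ * For example (file names are placeholders), unparsable lines can be
+ * extracted verbatim with:
+ *     lognormalizer -r rules.rb -e raw -P < messages.log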
+ */ +static void +outputEvent(struct json_object *json, const char *const rawmsg) +{ + char *cstr = NULL; + es_str_t *str = NULL; + + if(outfmt == f_raw) { + printf("%s\n", rawmsg); + return; + } + + switch(outfmt) { + case f_json: + if(!flatTags) { + json_object_object_del(json, "event.tags"); + } + cstr = (char*)json_object_to_json_string(json); + break; + case f_syslog: + ln_fmtEventToRFC5424(json, &str); + break; + case f_xml: + ln_fmtEventToXML(json, &str); + break; + case f_csv: + ln_fmtEventToCSV(json, &str, encFmt); + break; + case f_raw: + fprintf(stderr, "program error: f_raw should not occur " + "here (file %s, line %d)\n", __FILE__, __LINE__); + abort(); + break; + default: + fprintf(stderr, "program error: default case should not occur " + "here (file %s, line %d)\n", __FILE__, __LINE__); + abort(); + break; + } + if (str != NULL) + cstr = es_str2cstr(str, NULL); + if(verbose > 0) fprintf(stderr, "normalized: '%s'\n", cstr); + printf("%s\n", cstr); + if (str != NULL) + free(cstr); + es_deleteStr(str); +} + +/* test if the tag exists */ +static int +eventHasTag(struct json_object *json, const char *tag) +{ + struct json_object *tagbucket, *tagObj; + int i; + const char *tagCstr; + + if (tag == NULL) + return 1; + if (json_object_object_get_ex(json, "event.tags", &tagbucket)) { + if (json_object_get_type(tagbucket) == json_type_array) { + for (i = json_object_array_length(tagbucket) - 1; i >= 0; i--) { + tagObj = json_object_array_get_idx(tagbucket, i); + tagCstr = json_object_get_string(tagObj); + if (!strcmp(tag, tagCstr)) + return 1; + } + } + } + if (verbose > 1) + printf("Mandatory tag '%s' has not been found\n", tag); + return 0; +} + +static void +amendLineNbr(json_object *const json, const int line_nbr) +{ + + if(addErrLineNbr) { + struct json_object *jval; + jval = json_object_new_int(line_nbr); + json_object_object_add(json, "lognormalizer.line_nbr", jval); + } +} + +#define DEFAULT_LINE_SIZE (10 * 1024) + +static char * +read_line(FILE *fp) +{ + size_t line_capacity = DEFAULT_LINE_SIZE; + char *line = NULL; + size_t line_len = 0; + int ch = 0; + do { + ch = fgetc(fp); + if (ch == EOF) break; + if (line == NULL) { + line = malloc(line_capacity); + } else if (line_len == line_capacity) { + line_capacity *= 2; + line = realloc(line, line_capacity); + } + if (line == NULL) { + fprintf(stderr, "Couldn't allocate working-buffer for log-line\n"); + return NULL; + } + line[line_len++] = ch; + } while(ch != '\n'); + + if (line != NULL) { + line[--line_len] = '\0'; + if(line_len > 0 && line[line_len - 1] == '\r') + line[--line_len] = '\0'; + } + return line; +} + +/* normalize input data + */ +static void +normalize(void) +{ + FILE *fp = stdin; + char *line = NULL; + struct json_object *json = NULL; + long long unsigned numParsed = 0; + long long unsigned numUnparsed = 0; + long long unsigned numWrongTag = 0; + char *mandatoryTagCstr = NULL; + int line_nbr = 0; /* must be int to keep compatible with older json-c */ + + if (mandatoryTag != NULL) { + mandatoryTagCstr = es_str2cstr(mandatoryTag, NULL); + } + + while((line = read_line(fp)) != NULL) { + ++line_nbr; + if(verbose > 0) fprintf(stderr, "To normalize: '%s'\n", line); + ln_normalize(ctx, line, strlen(line), &json); + if(json != NULL) { + if(eventHasTag(json, mandatoryTagCstr)) { + struct json_object *dummy; + const int parsed = !json_object_object_get_ex(json, + "unparsed-data", &dummy); + if(parsed) { + numParsed++; + if(recOutput & OUTPUT_PARSED_RECS) { + outputEvent(json, line); + } + } else { + numUnparsed++; + 
amendLineNbr(json, line_nbr); + if(recOutput & OUTPUT_UNPARSED_RECS) { + outputEvent(json, line); + } + } + } else { + numWrongTag++; + } + json_object_put(json); + json = NULL; + } + free(line); + } + if(outputNbrUnparsed && numUnparsed > 0) + fprintf(stderr, "%llu unparsable entries\n", numUnparsed); + if(numWrongTag > 0) + fprintf(stderr, "%llu entries with wrong tag dropped\n", numWrongTag); + if(outputSummaryLine) { + fprintf(stderr, "%llu records processed, %llu parsed, %llu unparsed\n", + numParsed+numUnparsed, numParsed, numUnparsed); + } + free(mandatoryTagCstr); +} + + +/** + * Generate a command file for the GNU DOT tools. + */ +static void +genDOT(void) +{ + es_str_t *str; + + str = es_newStr(1024); + ln_genDotPDAGGraph(ctx->pdag, &str); + fwrite(es_getBufAddr(str), 1, es_strlen(str), fpDOT); +} + +static +void printVersion(void) +{ + fprintf(stderr, "lognormalizer version: " VERSION "\n"); + fprintf(stderr, "liblognorm version: %s\n", ln_version()); + fprintf(stderr, "\tadvanced stats: %s\n", + ln_hasAdvancedStats() ? "available" : "not available"); +} + +static void +handle_generic_option(const char* opt) { + if (strcmp("allowRegex", opt) == 0) { + ln_setCtxOpts(ctx, LN_CTXOPT_ALLOW_REGEX); + } else if (strcmp("addExecPath", opt) == 0) { + ln_setCtxOpts(ctx, LN_CTXOPT_ADD_EXEC_PATH); + } else if (strcmp("addOriginalMsg", opt) == 0) { + ln_setCtxOpts(ctx, LN_CTXOPT_ADD_ORIGINALMSG); + } else if (strcmp("addRule", opt) == 0) { + ln_setCtxOpts(ctx, LN_CTXOPT_ADD_RULE); + } else if (strcmp("addRuleLocation", opt) == 0) { + ln_setCtxOpts(ctx, LN_CTXOPT_ADD_RULE_LOCATION); + } else { + fprintf(stderr, "invalid -o option '%s'\n", opt); + exit(1); + } +} + +static void usage(void) +{ +fprintf(stderr, + "Options:\n" + " -r Rulebase to use. This is required option\n" + " -H print summary line (nbr of msgs Handled)\n" + " -U print number of unparsed messages (only if non-zero)\n" + " -e\n" + " Change output format. By default, json is used\n" + " Raw is exactly like the input. It is useful in combination\n" + " with -p/-P options to extract known good/bad messages\n" + " -E Encoder-specific format (used for CSV, read docs)\n" + " -T Include 'event.tags' in JSON format\n" + " -oallowRegex Allow regexp matching (read docs about performance penalty)\n" + " -oaddRule Add a mockup of the matching rule.\n" + " -oaddRuleLocation Add location of matching rule to metadata\n" + " -oaddExecPath Add exec_path attribute to output\n" + " -oaddOriginalMsg Always add original message to output, not just in error case\n" + " -p Print back only if the message has been parsed succesfully\n" + " -P Print back only if the message has NOT been parsed succesfully\n" + " -L Add source file line number information to unparsed line output\n" + " -t Print back only messages matching the tag\n" + " -v Print debug. 
When used 3 times, prints parse DAG\n" + " -V Print version information\n" + " -d Print DOT file to stdout and exit\n" + " -d Save DOT file to the filename\n" + " -s Print parse dag statistics and exit\n" + " -S Print extended parse dag statistics and exit (includes -s)\n" + " -x Print statistics as dot file (called only)\n" + "\n" + ); +} + +int main(int argc, char *argv[]) +{ + int opt; + char *repository = NULL; + int usedRB = 0; /* 0=no rule; 1=rule from rulebase; 2=rule from string */ + int ret = 0; + FILE *fpStats = NULL; + FILE *fpStatsDOT = NULL; + int extendedStats = 0; + + if((ctx = ln_initCtx()) == NULL) { + complain("Could not initialize liblognorm context"); + ret = 1; + goto exit; + } + + while((opt = getopt(argc, argv, "d:s:S:e:r:R:E:vVpPt:To:hHULx:")) != -1) { + switch (opt) { + case 'V': + printVersion(); + exit(1); + break; + case 'd': /* generate DOT file */ + if(!strcmp(optarg, "")) { + fpDOT = stdout; + } else { + if((fpDOT = fopen(optarg, "w")) == NULL) { + perror(optarg); + complain("Cannot open DOT file"); + ret = 1; + goto exit; + } + } + break; + case 'x': /* generate statistics DOT file */ + if(!strcmp(optarg, "")) { + fpStatsDOT = stdout; + } else { + if((fpStatsDOT = fopen(optarg, "w")) == NULL) { + perror(optarg); + complain("Cannot open statistics DOT file"); + ret = 1; + goto exit; + } + } + break; + case 'S': /* generate pdag statistic file */ + extendedStats = 1; + /* INTENTIONALLY NO BREAK! - KEEP order! */ + /*FALLTHROUGH*/ + case 's': /* generate pdag statistic file */ + if(!strcmp(optarg, "-")) { + fpStats = stdout; + } else { + if((fpStats = fopen(optarg, "w")) == NULL) { + perror(optarg); + complain("Cannot open parser statistics file"); + ret = 1; + goto exit; + } + } + break; + case 'v': + verbose++; + break; + case 'E': /* encoder-specific format string (will be validated by encoder) */ + encFmt = es_newStrFromCStr(optarg, strlen(optarg)); + break; + case 'p': + recOutput = OUTPUT_PARSED_RECS; + break; + case 'P': + recOutput = OUTPUT_UNPARSED_RECS; + break; + case 'H': + outputSummaryLine = 1; + break; + case 'U': + outputNbrUnparsed = 1; + break; + case 'L': + addErrLineNbr = 1; + break; + case 'T': + flatTags = 1; + break; + case 'e': /* encoder to use */ + if(!strcmp(optarg, "json")) { + outfmt = f_json; + } else if(!strcmp(optarg, "xml")) { + outfmt = f_xml; + } else if(!strcmp(optarg, "cee-syslog")) { + outfmt = f_syslog; + } else if(!strcmp(optarg, "csv")) { + outfmt = f_csv; + } else if(!strcmp(optarg, "raw")) { + outfmt = f_raw; + } + break; + case 'r': /* rule base to use */ + if(usedRB != 2) { + repository = optarg; + usedRB = 1; + } else { + usedRB = -1; + } + break; + case 'R': + if(usedRB != 1) { + repository = optarg; + usedRB = 2; + } else { + usedRB = -1; + } + break; + case 't': /* if given, only messages tagged with the argument + are output */ + mandatoryTag = es_newStrFromCStr(optarg, strlen(optarg)); + break; + case 'o': + handle_generic_option(optarg); + break; + case 'h': + default: + usage(); + ret = 1; + goto exit; + break; + } + } + + if(repository == NULL) { + complain("Samples repository or String must be given (-r or -R)"); + ret = 1; + goto exit; + } + + if(usedRB == -1) { + complain("Only use one rulebase (-r or -R)"); + ret = 1; + goto exit; + } + + ln_setErrMsgCB(ctx, errCallBack, NULL); + if(verbose) { + ln_setDebugCB(ctx, dbgCallBack, NULL); + ln_enableDebug(ctx, 1); + } + + if(usedRB == 1) { + if(ln_loadSamples(ctx, repository)) { + fprintf(stderr, "fatal error: cannot load rulebase\n"); + exit(1); + } + } else 
if(usedRB == 2) { + if(ln_loadSamplesFromString(ctx, repository)) { + fprintf(stderr, "fatal error: cannot load rule from String\n"); + exit(1); + } + } + + if(verbose > 0) + fprintf(stderr, "number of tree nodes: %d\n", ctx->nNodes); + + if(fpDOT != NULL) { + genDOT(); + ret=1; + goto exit; + } + + if(verbose > 2) ln_displayPDAG(ctx); + + normalize(); + + if(fpStats != NULL) { + ln_fullPdagStats(ctx, fpStats, extendedStats); + } + + if(fpStatsDOT != NULL) { + ln_fullPDagStatsDOT(ctx, fpStatsDOT); + } + +exit: + if (ctx) ln_exitCtx(ctx); + if (encFmt != NULL) + free(encFmt); + return ret; +} diff --git a/src/parser.c b/src/parser.c new file mode 100644 index 0000000..bcb7b3f --- /dev/null +++ b/src/parser.c @@ -0,0 +1,3453 @@ +/* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2018 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include + +#include "liblognorm.h" +#include "lognorm.h" +#include "internal.h" +#include "parser.h" +#include "samp.h" +#include "helpers.h" + +#ifdef FEATURE_REGEXP +#include +#include +#endif + + +/* how should output values be formatted? */ +enum FMT_MODE { + FMT_AS_STRING = 0, + FMT_AS_NUMBER = 1, + FMT_AS_TIMESTAMP_UX = 2, + FMT_AS_TIMESTAMP_UX_MS = 3 + }; + +/* some helpers */ +static inline int +hParseInt(const unsigned char **buf, size_t *lenBuf) +{ + const unsigned char *p = *buf; + size_t len = *lenBuf; + int i = 0; + + while(len > 0 && myisdigit(*p)) { + i = i * 10 + *p - '0'; + ++p; + --len; + } + + *buf = p; + *lenBuf = len; + return i; +} + +/* parser _parse interface + * + * All parsers receive + * + * @param[in] npb->str the to-be-parsed string + * @param[in] npb->strLen length of the to-be-parsed string + * @param[in] offs an offset into the string + * @param[in] pointer to parser data block + * @param[out] parsed bytes + * @param[out] value ptr to json object containing parsed data + * (can be unused, but if used *value MUST be NULL on entry) + * + * They will try to parse out "their" object from the string. If they + * succeed, they: + * + * return 0 on success and LN_WRONGPARSER if this parser could + * not successfully parse (but all went well otherwise) and something + * else in case of an error. 
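+ *
+ * Illustrative sketch of a minimal parser written against this interface
+ * (not one of the shipped parsers): it accepts a single dash character and
+ * leaves *value untouched.
+ *
+ *   PARSER_Parse(Dash)
+ *       if(*offs < npb->strLen && npb->str[*offs] == '-') {
+ *           *parsed = 1;
+ *           r = 0;
+ *       }
+ *       return r;
+ *   }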
+ */ +#define PARSER_Parse(ParserName) \ +int ln_v2_parse##ParserName( \ + npb_t *const npb, \ + size_t *const offs, \ + __attribute__((unused)) void *const pdata, \ + size_t *parsed, \ + struct json_object **value) \ +{ \ + int r = LN_WRONGPARSER; \ + *parsed = 0; + +#define FAILParser \ + goto parserdone; /* suppress warnings */ \ +parserdone: \ + r = 0; \ + goto done; /* suppress warnings */ \ +done: + +#define ENDFailParser \ + return r; \ +} + + +/* Return printable representation of parser content for + * display purposes. This must not be 100% exact, but provide + * a good indication of what it contains for a human. + * @param[data] data parser data block + * @return pointer to c string, NOT to be freed + */ +#define PARSER_DataForDisplay(ParserName) \ +const char * ln_DataForDisplay##ParserName(__attribute__((unused)) ln_ctx ctx, void *const pdata) + + +/* Return JSON parser config. This is primarily for comparison + * of parser equalness. + * @param[data] data parser data block + * @return pointer to c string, NOT to be freed + */ +#define PARSER_JsonConf(ParserName) \ +const char * ln_JsonConf##ParserName(__attribute__((unused)) ln_ctx ctx, void *const pdata) + + +/* parser constructor + * @param[in] json config json items + * @param[out] data parser data block (to be allocated) + * At minimum, *data must be set to NULL + * @return error status (0 == OK) + */ +#define PARSER_Construct(ParserName) \ +int ln_construct##ParserName( \ + __attribute__((unused)) ln_ctx ctx, \ + __attribute__((unused)) json_object *const json, \ + void **pdata) + +/* parser destructor + * @param[data] data parser data block (to be de-allocated) + */ +#define PARSER_Destruct(ParserName) \ +void ln_destruct##ParserName(__attribute__((unused)) ln_ctx ctx, void *const pdata) + + +/* the following table saves us from computing an additional date to get + * the ordinal day of the year - at least from 1967-2099 + * Note: non-2038+ compliant systems (Solaris) will generate compiler + * warnings on the post 2038-rollover years. 
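+ *
+ * For illustration: for the year 2016 the conversion code below looks up
+ * yearInSecs[2016 - 1967 - 1] = 1451606399 (the last second of 2015),
+ * adds 1 to obtain 1451606400 (2016-01-01T00:00:00Z) and then adds the
+ * day-of-year and time-of-day offsets on top of that.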
+ */ +static const int yearInSec_startYear = 1967; +/* for x in $(seq 1967 2099) ; do + * printf %s', ' $(date --date="Dec 31 ${x} UTC 23:59:59" +%s) + * done |fold -w 70 -s */ +static const time_t yearInSecs[] = { + -63158401, -31536001, -1, 31535999, 63071999, 94694399, 126230399, + 157766399, 189302399, 220924799, 252460799, 283996799, 315532799, + 347155199, 378691199, 410227199, 441763199, 473385599, 504921599, + 536457599, 567993599, 599615999, 631151999, 662687999, 694223999, + 725846399, 757382399, 788918399, 820454399, 852076799, 883612799, + 915148799, 946684799, 978307199, 1009843199, 1041379199, 1072915199, + 1104537599, 1136073599, 1167609599, 1199145599, 1230767999, + 1262303999, 1293839999, 1325375999, 1356998399, 1388534399, + 1420070399, 1451606399, 1483228799, 1514764799, 1546300799, + 1577836799, 1609459199, 1640995199, 1672531199, 1704067199, + 1735689599, 1767225599, 1798761599, 1830297599, 1861919999, + 1893455999, 1924991999, 1956527999, 1988150399, 2019686399, + 2051222399, 2082758399, 2114380799, 2145916799, 2177452799, + 2208988799, 2240611199, 2272147199, 2303683199, 2335219199, + 2366841599, 2398377599, 2429913599, 2461449599, 2493071999, + 2524607999, 2556143999, 2587679999, 2619302399, 2650838399, + 2682374399, 2713910399, 2745532799, 2777068799, 2808604799, + 2840140799, 2871763199, 2903299199, 2934835199, 2966371199, + 2997993599, 3029529599, 3061065599, 3092601599, 3124223999, + 3155759999, 3187295999, 3218831999, 3250454399, 3281990399, + 3313526399, 3345062399, 3376684799, 3408220799, 3439756799, + 3471292799, 3502915199, 3534451199, 3565987199, 3597523199, + 3629145599, 3660681599, 3692217599, 3723753599, 3755375999, + 3786911999, 3818447999, 3849983999, 3881606399, 3913142399, + 3944678399, 3976214399, 4007836799, 4039372799, 4070908799, + 4102444799}; + +/** + * convert syslog timestamp to time_t + * Note: it would be better to use something similar to mktime() here. + * Unfortunately, mktime() semantics are problematic: first of all, it + * works on local time, on the machine's time zone. In syslog, we have + * to deal with multiple time zones at once, so we cannot plainly rely + * on the local zone, and so we cannot rely on mktime(). One solution would + * be to refactor all time-related functions so that they are all guarded + * by a mutex to ensure TZ consistency (which would also enable us to + * change the TZ at will for specific function calls). But that would + * potentially mean a lot of overhead. + * Also, mktime() has some side effects, at least setting of tzname. With + * a refactoring as described above that should probably not be a problem, + * but would also need more work. For some more thoughts on this topic, + * have a look here: + * http://stackoverflow.com/questions/18355101/is-standard-c-mktime-thread-safe-on-linux + * In conclusion, we keep our own code for generating the unix timestamp. 
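+ *
+ * As a worked example (for illustration only): 2017-03-02T10:00:00+01:00
+ * yields 1483228800 (start of 2017) + 60*86400 (day of year) + 36000
+ * (time of day) - 3600 (UTC offset) = 1488445200, i.e. 09:00:00 UTC.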
+ * rgerhards, 2016-03-02 (taken from rsyslog sources) + */ +static time_t +syslogTime2time_t(const int year, const int month, const int day, + const int hour, const int minute, const int second, + const int OffsetHour, const int OffsetMinute, const char OffsetMode) +{ + long MonthInDays, NumberOfYears, NumberOfDays; + int utcOffset; + time_t TimeInUnixFormat; + + if(year < 1970 || year > 2100) { + TimeInUnixFormat = 0; + goto done; + } + + /* Counting how many Days have passed since the 01.01 of the + * selected Year (Month level), according to the selected Month*/ + + switch(month) + { + case 1: + MonthInDays = 0; //until 01 of January + break; + case 2: + MonthInDays = 31; //until 01 of February - leap year handling down below! + break; + case 3: + MonthInDays = 59; //until 01 of March + break; + case 4: + MonthInDays = 90; //until 01 of April + break; + case 5: + MonthInDays = 120; //until 01 of Mai + break; + case 6: + MonthInDays = 151; //until 01 of June + break; + case 7: + MonthInDays = 181; //until 01 of July + break; + case 8: + MonthInDays = 212; //until 01 of August + break; + case 9: + MonthInDays = 243; //until 01 of September + break; + case 10: + MonthInDays = 273; //until 01 of Oktober + break; + case 11: + MonthInDays = 304; //until 01 of November + break; + case 12: + MonthInDays = 334; //until 01 of December + break; + default: /* this cannot happen (and would be a program error) + * but we need the code to keep the compiler silent. + */ + MonthInDays = 0; /* any value fits ;) */ + break; + } + /* adjust for leap years */ + if((year % 100 != 0 && year % 4 == 0) || (year == 2000)) { + if(month > 2) + MonthInDays++; + } + + + /* 1) Counting how many Years have passed since 1970 + 2) Counting how many Days have passed since the 01.01 of the selected Year + (Day level) according to the Selected Month and Day. Last day doesn't count, + it should be until last day + 3) Calculating this period (NumberOfDays) in seconds*/ + + NumberOfYears = year - yearInSec_startYear - 1; + NumberOfDays = MonthInDays + day - 1; + TimeInUnixFormat = (yearInSecs[NumberOfYears] + 1) + NumberOfDays * 86400; + + /*Add Hours, minutes and seconds */ + TimeInUnixFormat += hour*60*60; + TimeInUnixFormat += minute*60; + TimeInUnixFormat += second; + /* do UTC offset */ + utcOffset = OffsetHour*3600 + OffsetMinute*60; + if(OffsetMode == '+') + utcOffset *= -1; /* if timestamp is ahead, we need to "go back" to UTC */ + TimeInUnixFormat += utcOffset; +done: + return TimeInUnixFormat; +} + + +struct data_RFC5424Date { + enum FMT_MODE fmt_mode; +}; +/** + * Parse a TIMESTAMP as specified in RFC5424 (subset of RFC3339). + */ +PARSER_Parse(RFC5424Date) + const unsigned char *pszTS; + struct data_RFC5424Date *const data = (struct data_RFC5424Date*) pdata; + /* variables to temporarily hold time information while we parse */ + int year; + int month; + int day; + int hour; /* 24 hour clock */ + int minute; + int second; + int secfrac; /* fractional seconds (must be 32 bit!) */ + int secfracPrecision; + int OffsetHour; /* UTC offset in hours */ + int OffsetMinute; /* UTC offset in minutes */ + char OffsetMode; + size_t len; + size_t orglen; + /* end variables to temporarily hold time information while we parse */ + + pszTS = (unsigned char*) npb->str + *offs; + len = orglen = npb->strLen - *offs; + + year = hParseInt(&pszTS, &len); + + /* We take the liberty to accept slightly malformed timestamps e.g. in + * the format of 2003-9-1T1:0:0. 
*/ + if(len == 0 || *pszTS++ != '-') goto done; + --len; + month = hParseInt(&pszTS, &len); + if(month < 1 || month > 12) goto done; + + if(len == 0 || *pszTS++ != '-') + goto done; + --len; + day = hParseInt(&pszTS, &len); + if(day < 1 || day > 31) goto done; + + if(len == 0 || *pszTS++ != 'T') goto done; + --len; + + hour = hParseInt(&pszTS, &len); + if(hour < 0 || hour > 23) goto done; + + if(len == 0 || *pszTS++ != ':') + goto done; + --len; + minute = hParseInt(&pszTS, &len); + if(minute < 0 || minute > 59) goto done; + + if(len == 0 || *pszTS++ != ':') goto done; + --len; + second = hParseInt(&pszTS, &len); + if(second < 0 || second > 60) goto done; + + /* Now let's see if we have secfrac */ + if(len > 0 && *pszTS == '.') { + --len; + const unsigned char *pszStart = ++pszTS; + secfrac = hParseInt(&pszTS, &len); + secfracPrecision = (int) (pszTS - pszStart); + } else { + secfracPrecision = 0; + secfrac = 0; + } + + /* check the timezone */ + if(len == 0) goto done; + + if(*pszTS == 'Z') { + OffsetHour = 0; + OffsetMinute = 0; + OffsetMode = '+'; + --len; + pszTS++; /* eat Z */ + } else if((*pszTS == '+') || (*pszTS == '-')) { + OffsetMode = *pszTS; + --len; + pszTS++; + + OffsetHour = hParseInt(&pszTS, &len); + if(OffsetHour < 0 || OffsetHour > 23) + goto done; + + if(len == 0 || *pszTS++ != ':') + goto done; + --len; + OffsetMinute = hParseInt(&pszTS, &len); + if(OffsetMinute < 0 || OffsetMinute > 59) + goto done; + } else { + /* there MUST be TZ information */ + goto done; + } + + if(len > 0) { + if(*pszTS != ' ') /* if it is not a space, it can not be a "good" time */ + goto done; + } + + /* we had success, so update parse pointer */ + *parsed = orglen - len; + + if(value != NULL) { + if(data->fmt_mode == FMT_AS_STRING) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } else { + int64_t timestamp = syslogTime2time_t(year, month, day, + hour, minute, second, OffsetHour, OffsetMinute, OffsetMode); + if(data->fmt_mode == FMT_AS_TIMESTAMP_UX_MS) { + timestamp *= 1000; + /* simulate pow(), do not use math lib! 
*/ + int div = 1; + if(secfracPrecision == 1) { + secfrac *= 100; + } else if(secfracPrecision == 2) { + secfrac *= 10; + } else if(secfracPrecision > 3) { + for(int i = 0 ; i < (secfracPrecision - 3) ; ++i) + div *= 10; + } + timestamp += secfrac / div; + } + *value = json_object_new_int64(timestamp); + } + } + + r = 0; /* success */ +done: + return r; +} +PARSER_Construct(RFC5424Date) +{ + int r = 0; + struct data_RFC5424Date *data = + (struct data_RFC5424Date*) calloc(1, sizeof(struct data_RFC5424Date)); + data->fmt_mode = FMT_AS_STRING; + + if(json == NULL) + goto done; + + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcmp(key, "format")) { + const char *fmtmode = json_object_get_string(val); + if(!strcmp(fmtmode, "timestamp-unix")) { + data->fmt_mode = FMT_AS_TIMESTAMP_UX; + } else if(!strcmp(fmtmode, "timestamp-unix-ms")) { + data->fmt_mode = FMT_AS_TIMESTAMP_UX_MS; + } else if(!strcmp(fmtmode, "string")) { + data->fmt_mode = FMT_AS_STRING; + } else { + ln_errprintf(ctx, 0, "invalid value for date-rfc5424:format %s", + fmtmode); + } + } else { + ln_errprintf(ctx, 0, "invalid param for date-rfc5424 %s", key); + } + json_object_iter_next(&it); + } + +done: + *pdata = data; + return r; +} +PARSER_Destruct(RFC5424Date) +{ + free(pdata); +} + + +struct data_RFC3164Date { + enum FMT_MODE fmt_mode; +}; +/** + * Parse a RFC3164 Date. + */ +PARSER_Parse(RFC3164Date) + const unsigned char *p; + size_t len, orglen; + struct data_RFC3164Date *const data = (struct data_RFC3164Date*) pdata; + /* variables to temporarily hold time information while we parse */ + int year; + int month; + int day; + int hour; /* 24 hour clock */ + int minute; + int second; + + p = (unsigned char*) npb->str + *offs; + orglen = len = npb->strLen - *offs; + /* If we look at the month (Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec), + * we may see the following character sequences occur: + * + * J(an/u(n/l)), Feb, Ma(r/y), A(pr/ug), Sep, Oct, Nov, Dec + * + * We will use this for parsing, as it probably is the + * fastest way to parse it. 
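+ *
+ * For example, "Oct  5 01:02:03" and "oct 05 01:02:03" are both accepted:
+ * month matching is case-insensitive and a one-digit day may be preceded
+ * by an extra blank (see the parsing code below).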
+ */ + if(len < 3) + goto done; + + switch(*p++) + { + case 'j': + case 'J': + if(*p == 'a' || *p == 'A') { + ++p; + if(*p == 'n' || *p == 'N') { + ++p; + month = 1; + } else + goto done; + } else if(*p == 'u' || *p == 'U') { + ++p; + if(*p == 'n' || *p == 'N') { + ++p; + month = 6; + } else if(*p == 'l' || *p == 'L') { + ++p; + month = 7; + } else + goto done; + } else + goto done; + break; + case 'f': + case 'F': + if(*p == 'e' || *p == 'E') { + ++p; + if(*p == 'b' || *p == 'B') { + ++p; + month = 2; + } else + goto done; + } else + goto done; + break; + case 'm': + case 'M': + if(*p == 'a' || *p == 'A') { + ++p; + if(*p == 'r' || *p == 'R') { + ++p; + month = 3; + } else if(*p == 'y' || *p == 'Y') { + ++p; + month = 5; + } else + goto done; + } else + goto done; + break; + case 'a': + case 'A': + if(*p == 'p' || *p == 'P') { + ++p; + if(*p == 'r' || *p == 'R') { + ++p; + month = 4; + } else + goto done; + } else if(*p == 'u' || *p == 'U') { + ++p; + if(*p == 'g' || *p == 'G') { + ++p; + month = 8; + } else + goto done; + } else + goto done; + break; + case 's': + case 'S': + if(*p == 'e' || *p == 'E') { + ++p; + if(*p == 'p' || *p == 'P') { + ++p; + month = 9; + } else + goto done; + } else + goto done; + break; + case 'o': + case 'O': + if(*p == 'c' || *p == 'C') { + ++p; + if(*p == 't' || *p == 'T') { + ++p; + month = 10; + } else + goto done; + } else + goto done; + break; + case 'n': + case 'N': + if(*p == 'o' || *p == 'O') { + + ++p; + if(*p == 'v' || *p == 'V') { + ++p; + month = 11; + } else + goto done; + } else + goto done; + break; + case 'd': + case 'D': + if(*p == 'e' || *p == 'E') { + ++p; + if(*p == 'c' || *p == 'C') { + ++p; + month = 12; + } else + goto done; + } else + goto done; + break; + default: + goto done; + } + + len -= 3; + + /* done month */ + + if(len == 0 || *p++ != ' ') + goto done; + --len; + + /* we accept a slightly malformed timestamp with one-digit days. */ + if(*p == ' ') { + --len; + ++p; + } + + day = hParseInt(&p, &len); + if(day < 1 || day > 31) + goto done; + + if(len == 0 || *p++ != ' ') + goto done; + --len; + + /* time part */ + hour = hParseInt(&p, &len); + if(hour > 1970 && hour < 2100) { + /* if so, we assume this actually is a year. This is a format found + * e.g. in Cisco devices. + * + year = hour; + */ + + /* re-query the hour, this time it must be valid */ + if(len == 0 || *p++ != ' ') + goto done; + --len; + hour = hParseInt(&p, &len); + } + + if(hour < 0 || hour > 23) + goto done; + + if(len == 0 || *p++ != ':') + goto done; + --len; + minute = hParseInt(&p, &len); + if(minute < 0 || minute > 59) + goto done; + + if(len == 0 || *p++ != ':') + goto done; + --len; + second = hParseInt(&p, &len); + if(second < 0 || second > 60) + goto done; + + /* we provide support for an extra ":" after the date. While this is an + * invalid format, it occurs frequently enough (e.g. with Cisco devices) + * to permit it as a valid case. 
-- rgerhards, 2008-09-12 + */ + if(len > 0 && *p == ':') { + ++p; /* just skip past it */ + --len; + } + + /* we had success, so update parse pointer */ + *parsed = orglen - len; + if(value != NULL) { + if(data->fmt_mode == FMT_AS_STRING) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } else { + /* we assume year == current year, so let's obtain current year */ + struct tm tm; + const time_t curr = time(NULL); + gmtime_r(&curr, &tm); + year = tm.tm_year + 1900; + int64_t timestamp = syslogTime2time_t(year, month, day, + hour, minute, second, 0, 0, '+'); + if(data->fmt_mode == FMT_AS_TIMESTAMP_UX_MS) { + /* we do not have more precise info, just bring + * into common format! + */ + timestamp *= 1000; + } + *value = json_object_new_int64(timestamp); + } + } + r = 0; /* success */ +done: + return r; +} +PARSER_Construct(RFC3164Date) +{ + int r = 0; + struct data_RFC3164Date *data = (struct data_RFC3164Date*) calloc(1, sizeof(struct data_RFC3164Date)); + data->fmt_mode = FMT_AS_STRING; + + if(json == NULL) + goto done; + + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcmp(key, "format")) { + const char *fmtmode = json_object_get_string(val); + if(!strcmp(fmtmode, "timestamp-unix")) { + data->fmt_mode = FMT_AS_TIMESTAMP_UX; + } else if(!strcmp(fmtmode, "timestamp-unix-ms")) { + data->fmt_mode = FMT_AS_TIMESTAMP_UX_MS; + } else if(!strcmp(fmtmode, "string")) { + data->fmt_mode = FMT_AS_STRING; + } else { + ln_errprintf(ctx, 0, "invalid value for date-rfc3164:format %s", + fmtmode); + } + } else { + ln_errprintf(ctx, 0, "invalid param for date-rfc3164 %s", key); + } + json_object_iter_next(&it); + } + +done: + *pdata = data; + return r; +} +PARSER_Destruct(RFC3164Date) +{ + free(pdata); +} + + +struct data_Number { + int64_t maxval; + enum FMT_MODE fmt_mode; +}; +/** + * Parse a Number. + * Note that a number is an abstracted concept. We always represent it + * as 64 bits (but may later change our mind if performance dictates so). 
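+ *
+ * For illustration: given the input "4711 bytes" this parser consumes
+ * "4711". With the "format":"number" parameter the resulting JSON value
+ * is the native integer 4711; with the default ("string") it is the
+ * string "4711". If "maxval" is configured and exceeded, the parser
+ * reports LN_WRONGPARSER.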
+ */ +PARSER_Parse(Number) + const char *c; + size_t i; + int64_t val = 0; + struct data_Number *const data = (struct data_Number*) pdata; + + enum FMT_MODE fmt_mode = FMT_AS_STRING; + int64_t maxval = 0; + if(data != NULL) { + fmt_mode = data->fmt_mode; + maxval = data->maxval; + } + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + + for (i = *offs; i < npb->strLen && myisdigit(c[i]); i++) + val = val * 10 + c[i] - '0'; + + if(maxval > 0 && val > maxval) { + LN_DBGPRINTF(npb->ctx, "number parser: val too large (max %" PRIu64 + ", actual %" PRIu64 ")", + maxval, val); + goto done; + } + + if (i == *offs) + goto done; + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + if(fmt_mode == FMT_AS_STRING) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } else { + *value = json_object_new_int64(val); + } + } + r = 0; /* success */ +done: + return r; +} + +PARSER_Construct(Number) +{ + int r = 0; + struct data_Number *data = (struct data_Number*) calloc(1, sizeof(struct data_Number)); + data->fmt_mode = FMT_AS_STRING; + + if(json == NULL) + goto done; + + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcmp(key, "maxval")) { + errno = 0; + data->maxval = json_object_get_int64(val); + if(errno != 0) { + ln_errprintf(ctx, errno, "param 'maxval' must be integer but is: %s", + json_object_to_json_string(val)); + } + } else if(!strcmp(key, "format")) { + const char *fmtmode = json_object_get_string(val); + if(!strcmp(fmtmode, "number")) { + data->fmt_mode = FMT_AS_NUMBER; + } else if(!strcmp(fmtmode, "string")) { + data->fmt_mode = FMT_AS_STRING; + } else { + ln_errprintf(ctx, 0, "invalid value for number:format %s", + fmtmode); + } + } else { + ln_errprintf(ctx, 0, "invalid param for number: %s", key); + } + json_object_iter_next(&it); + } + +done: + *pdata = data; + return r; +} +PARSER_Destruct(Number) +{ + free(pdata); +} + +struct data_Float { + enum FMT_MODE fmt_mode; +}; +/** + * Parse a Real-number in floating-pt form. 
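+ * Illustrative behaviour: "-3.14" and "42" both match; a second '.' terminates
+ * the match, so for "1.2.3" only "1.2" is consumed, and exponent notation is
+ * not supported. With "format":"number" the value is emitted as a JSON double,
+ * otherwise as the original string.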
+ */ +PARSER_Parse(Float) + const char *c; + size_t i; + const struct data_Float *const data = (struct data_Float*) pdata; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + + int isNeg = 0; + double val = 0; + int seen_point = 0; + double frac = 10; + + i = *offs; + + if (c[i] == '-') { + isNeg = 1; + i++; + } + + for (; i < npb->strLen; i++) { + if (c[i] == '.') { + if (seen_point != 0) + break; + seen_point = 1; + } else if (myisdigit(c[i])) { + if(seen_point) { + val += (c[i] - '0') / frac; + frac *= 10; + } else { + val = val * 10 + c[i] - '0'; + } + } else { + break; + } + } + if (i == *offs) + goto done; + + if(isNeg) + val *= -1; + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + if(data->fmt_mode == FMT_AS_STRING) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } else { + char *serialized = strndup(npb->str+(*offs), *parsed); + *value = json_object_new_double_s(val, serialized); + free(serialized); + } + } + r = 0; /* success */ +done: + return r; +} +PARSER_Construct(Float) +{ + int r = 0; + struct data_Float *data = (struct data_Float*) calloc(1, sizeof(struct data_Float)); + data->fmt_mode = FMT_AS_STRING; + + if(json == NULL) + goto done; + + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcmp(key, "format")) { + const char *fmtmode = json_object_get_string(val); + if(!strcmp(fmtmode, "number")) { + data->fmt_mode = FMT_AS_NUMBER; + } else if(!strcmp(fmtmode, "string")) { + data->fmt_mode = FMT_AS_STRING; + } else { + ln_errprintf(ctx, 0, "invalid value for float:format %s", + fmtmode); + } + } else { + ln_errprintf(ctx, 0, "invalid param for float: %s", key); + } + json_object_iter_next(&it); + } + +done: + *pdata = data; + return r; +} +PARSER_Destruct(Float) +{ + free(pdata); +} + + +struct data_HexNumber { + uint64_t maxval; + enum FMT_MODE fmt_mode; +}; +/** + * Parse a hex Number. + * A hex number begins with 0x and contains only hex digits until the terminating + * whitespace. Note that if a non-hex character is deteced inside the number string, + * this is NOT considered to be a number. 
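+ * Illustrative behaviour: "0x1f " (terminated by whitespace) matches and, in
+ * number format, yields 31; "0x1g" does not match, nor does a value that
+ * exceeds a configured "maxval".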
+ */ +PARSER_Parse(HexNumber) + const char *c; + size_t i = *offs; + struct data_HexNumber *const data = (struct data_HexNumber*) pdata; + uint64_t maxval = data->maxval; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + + if(c[i] != '0' || c[i+1] != 'x') + goto done; + + uint64_t val = 0; + for (i += 2 ; i < npb->strLen && isxdigit(c[i]); i++) { + const char digit = tolower(c[i]); + val *= 16; + if(digit >= 'a' && digit <= 'f') + val += digit - 'a' + 10; + else + val += digit - '0'; + } + if (i == *offs || !isspace(c[i])) + goto done; + if(maxval > 0 && val > maxval) { + LN_DBGPRINTF(npb->ctx, "hexnumber parser: val too large (max %" PRIu64 + ", actual %" PRIu64 ")", + maxval, val); + goto done; + } + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + if(data->fmt_mode == FMT_AS_STRING) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } else { + *value = json_object_new_int64((int64_t) val); + } + } + r = 0; /* success */ +done: + return r; +} +PARSER_Construct(HexNumber) +{ + int r = 0; + struct data_HexNumber *data = (struct data_HexNumber*) calloc(1, sizeof(struct data_HexNumber)); + data->fmt_mode = FMT_AS_STRING; + + if(json == NULL) + goto done; + + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcmp(key, "maxval")) { + errno = 0; + data->maxval = json_object_get_int64(val); + if(errno != 0) { + ln_errprintf(ctx, errno, "param 'maxval' must be integer but is: %s", + json_object_to_json_string(val)); + } + } else if(!strcmp(key, "format")) { + const char *fmtmode = json_object_get_string(val); + if(!strcmp(fmtmode, "number")) { + data->fmt_mode = FMT_AS_NUMBER; + } else if(!strcmp(fmtmode, "string")) { + data->fmt_mode = FMT_AS_STRING; + } else { + ln_errprintf(ctx, 0, "invalid value for hexnumber:format %s", + fmtmode); + } + } else { + ln_errprintf(ctx, 0, "invalid param for hexnumber: %s", key); + } + json_object_iter_next(&it); + } + +done: + *pdata = data; + return r; +} +PARSER_Destruct(HexNumber) +{ + free(pdata); +} + + +/** + * Parse a kernel timestamp. + * This is a fixed format, see + * https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux.git/tree/kernel/printk/printk.c?id=refs/tags/v4.0#n1011 + * This is the code that generates it: + * sprintf(buf, "[%5lu.%06lu] ", (unsigned long)ts, rem_nsec / 1000); + * We accept up to 12 digits for ts, everything above that for sure is + * no timestamp. + */ +#define LEN_KERNEL_TIMESTAMP 14 +PARSER_Parse(KernelTimestamp) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + + i = *offs; + if(c[i] != '[' || i+LEN_KERNEL_TIMESTAMP > npb->strLen + || !myisdigit(c[i+1]) + || !myisdigit(c[i+2]) + || !myisdigit(c[i+3]) + || !myisdigit(c[i+4]) + || !myisdigit(c[i+5]) + ) + goto done; + i += 6; + for(int j = 0 ; j < 7 && i < npb->strLen && myisdigit(c[i]) ; ) + ++i, ++j; /* just scan */ + + if(i >= npb->strLen || c[i] != '.') + goto done; + + ++i; /* skip over '.' 
*/ + + if( i+7 > npb->strLen + || !myisdigit(c[i+0]) + || !myisdigit(c[i+1]) + || !myisdigit(c[i+2]) + || !myisdigit(c[i+3]) + || !myisdigit(c[i+4]) + || !myisdigit(c[i+5]) + || c[i+6] != ']' + ) + goto done; + i += 7; + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + +/** + * Parse whitespace. + * This parses all whitespace until the first non-whitespace character + * is found. This is primarily a tool to skip to the next "word" if + * the exact number of whitspace characters (and type of whitespace) + * is not known. The current parsing position MUST be on a whitspace, + * else the parser does not match. + * This parser is also a forward-compatibility tool for the upcoming + * slsa (simple log structure analyser) tool. + */ +PARSER_Parse(Whitespace) + const char *c; + size_t i = *offs; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + + if(!isspace(c[i])) + goto done; + + for (i++ ; i < npb->strLen && isspace(c[i]); i++); + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse a word. + * A word is a SP-delimited entity. The parser always works, except if + * the offset is position on a space upon entry. + */ +PARSER_Parse(Word) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + /* search end of word */ + while(i < npb->strLen && c[i] != ' ') + i++; + + if(i == *offs) + goto done; + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + + +struct data_StringTo { + const char *toFind; + size_t len; +}; +/** + * Parse everything up to a specific string. + * swisskid, 2015-01-21 + */ +PARSER_Parse(StringTo) + const char *c; + size_t i, j, m; + int chkstr; + struct data_StringTo *const data = (struct data_StringTo*) pdata; + const char *const toFind = data->toFind; + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + chkstr = 0; + + /* Total hunt for letter */ + while(chkstr == 0 && i < npb->strLen ) { + i++; + if(c[i] == toFind[0]) { + /* Found the first letter, now find the rest of the string */ + j = 1; + m = i+1; + while(m < npb->strLen && j < data->len ) { + if(c[m] != toFind[j]) + break; + if(j == data->len - 1) { /* full match? 
*/ + chkstr = 1; + break; + } + j++; + m++; + } + } + } + if(i == *offs || i == npb->strLen || chkstr != 1) + goto done; + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + +PARSER_Construct(StringTo) +{ + int r = 0; + struct data_StringTo *data = (struct data_StringTo*) calloc(1, sizeof(struct data_StringTo)); + struct json_object *ed; + + if(json_object_object_get_ex(json, "extradata", &ed) == 0) { + ln_errprintf(ctx, 0, "string-to type needs 'extradata' parameter"); + r = LN_BADCONFIG ; + goto done; + } + data->toFind = strdup(json_object_get_string(ed)); + data->len = strlen(data->toFind); + + *pdata = data; +done: + if(r != 0) + free(data); + return r; +} +PARSER_Destruct(StringTo) +{ + struct data_StringTo *data = (struct data_StringTo*) pdata; + free((void*)data->toFind); + free(pdata); +} + +/** + * Parse a alphabetic word. + * A alpha word is composed of characters for which isalpha returns true. + * The parser dones if there is no alpha character at all. + */ +PARSER_Parse(Alpha) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + /* search end of word */ + while(i < npb->strLen && isalpha(c[i])) + i++; + + if(i == *offs) { + goto done; + } + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + + +struct data_CharTo { + char *term_chars; + size_t n_term_chars; + char *data_for_display; +}; +/** + * Parse everything up to a specific character. + * The character must be the only char inside extra data passed to the parser. + * It is considered a format error if + * a) the to-be-parsed buffer is already positioned on the terminator character + * b) there is no terminator until the end of the buffer + * In those cases, the parsers declares itself as not being successful, in all + * other cases a string is extracted. 
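+ * Illustrative behaviour: with extradata ":" and input "user:root", the text
+ * "user" is extracted; ":root" (terminator in first position) and "user root"
+ * (no terminator at all) do not match.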
+ */ +PARSER_Parse(CharTo) + size_t i; + struct data_CharTo *const data = (struct data_CharTo*) pdata; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + i = *offs; + + /* search end of word */ + int found = 0; + while(i < npb->strLen && !found) { + for(size_t j = 0 ; j < data->n_term_chars ; ++j) { + if(npb->str[i] == data->term_chars[j]) { + found = 1; + break; + } + } + if(!found) + ++i; + } + + if(i == *offs || i == npb->strLen || !found) + goto done; + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; +done: + return r; +} +PARSER_Construct(CharTo) +{ + int r = 0; + LN_DBGPRINTF(ctx, "in parser_construct charTo"); + struct data_CharTo *data = (struct data_CharTo*) calloc(1, sizeof(struct data_CharTo)); + struct json_object *ed; + + if(json_object_object_get_ex(json, "extradata", &ed) == 0) { + ln_errprintf(ctx, 0, "char-to type needs 'extradata' parameter"); + r = LN_BADCONFIG ; + goto done; + } + data->term_chars = strdup(json_object_get_string(ed)); + data->n_term_chars = strlen(data->term_chars); + *pdata = data; +done: + if(r != 0) + free(data); + return r; +} +PARSER_DataForDisplay(CharTo) +{ + struct data_CharTo *data = (struct data_CharTo*) pdata; + if(data->data_for_display == NULL) { + data->data_for_display = malloc(8+data->n_term_chars+2); + if(data->data_for_display != NULL) { + memcpy(data->data_for_display, "char-to{", 8); + size_t i, j; + for(j = 0, i = 8 ; j < data->n_term_chars ; ++j, ++i) { + data->data_for_display[i] = data->term_chars[j]; + } + data->data_for_display[i++] = '}'; + data->data_for_display[i] = '\0'; + } + } + return (data->data_for_display == NULL ) ? "malloc error" : data->data_for_display; +} +PARSER_Destruct(CharTo) +{ + struct data_CharTo *const data = (struct data_CharTo*) pdata; + free(data->data_for_display); + free(data->term_chars); + free(pdata); +} + + + +struct data_Literal { + const char *lit; + const char *json_conf; +}; +/** + * Parse a specific literal. + */ +PARSER_Parse(Literal) + struct data_Literal *const data = (struct data_Literal*) pdata; + const char *const lit = data->lit; + size_t i = *offs; + size_t j; + + for(j = 0 ; i < npb->strLen ; ++j) { + if(lit[j] != npb->str[i]) + break; + ++i; + } + + *parsed = j; /* we must always return how far we parsed! */ + if(lit[j] == '\0') { + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; + } + return r; +} +PARSER_DataForDisplay(Literal) +{ + struct data_Literal *data = (struct data_Literal*) pdata; + return data->lit; +} +PARSER_JsonConf(Literal) +{ + struct data_Literal *data = (struct data_Literal*) pdata; + return data->json_conf; +} +PARSER_Construct(Literal) +{ + int r = 0; + struct data_Literal *data = (struct data_Literal*) calloc(1, sizeof(struct data_Literal)); + struct json_object *text; + + if(json_object_object_get_ex(json, "text", &text) == 0) { + ln_errprintf(ctx, 0, "literal type needs 'text' parameter"); + r = LN_BADCONFIG ; + goto done; + } + data->lit = strdup(json_object_get_string(text)); + data->json_conf = strdup(json_object_to_json_string(json)); + + *pdata = data; +done: + if(r != 0) + free(data); + return r; +} +PARSER_Destruct(Literal) +{ + struct data_Literal *data = (struct data_Literal*) pdata; + free((void*)data->lit); + free((void*)data->json_conf); + free(pdata); +} +/* for path compaction, we need a special handler to combine two + * literal data elements. 
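+ * Illustrative effect: two adjacent literal nodes "foo" and "bar" are merged
+ * into one node carrying "foobar" - the second string is appended to the
+ * (reallocated) first one.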
+ */ +int +ln_combineData_Literal(void *const porg, void *const padd) +{ + struct data_Literal *const __restrict__ org = porg; + struct data_Literal *const __restrict__ add = padd; + int r = 0; + const size_t len = strlen(org->lit); + const size_t add_len = strlen(add->lit); + char *const newlit = (char*)realloc((void*)org->lit, len+add_len+1); + CHKN(newlit); + org->lit = newlit; + memcpy((char*)org->lit+len, add->lit, add_len+1); +done: return r; +} + + +struct data_CharSeparated { + char *term_chars; + size_t n_term_chars; +}; +/** + * Parse everything up to a specific character, or up to the end of string. + * The character must be the only char inside extra data passed to the parser. + * This parser always returns success. + * By nature of the parser, it is required that end of string or the separator + * follows this field in rule. + */ +PARSER_Parse(CharSeparated) + struct data_CharSeparated *const data = (struct data_CharSeparated*) pdata; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + i = *offs; + + /* search end of word */ + int found = 0; + while(i < npb->strLen && !found) { + for(size_t j = 0 ; j < data->n_term_chars ; ++j) { + if(npb->str[i] == data->term_chars[j]) { + found = 1; + break; + } + } + if(!found) + ++i; + } + + /* success, persist */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ + return r; +} +PARSER_Construct(CharSeparated) +{ + int r = 0; + struct data_CharSeparated *data = (struct data_CharSeparated*) calloc(1, sizeof(struct data_CharSeparated)); + struct json_object *ed; + + if(json_object_object_get_ex(json, "extradata", &ed) == 0) { + ln_errprintf(ctx, 0, "char-separated type needs 'extradata' parameter"); + r = LN_BADCONFIG ; + goto done; + } + + data->term_chars = strdup(json_object_get_string(ed)); + data->n_term_chars = strlen(data->term_chars); + *pdata = data; +done: + if(r != 0) + free(data); + return r; +} +PARSER_Destruct(CharSeparated) +{ + struct data_CharSeparated *const data = (struct data_CharSeparated*) pdata; + free(data->term_chars); + free(pdata); +} + + +/** + * Just get everything till the end of string. + */ +PARSER_Parse(Rest) + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + + /* silence the warning about unused variable */ + (void)npb->str; + /* success, persist */ + *parsed = npb->strLen - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; + return r; +} + +/** + * Parse a possibly quoted string. In this initial implementation, escaping of the quote + * char is not supported. A quoted string is one start starts with a double quote, + * has some text (not containing double quotes) and ends with the first double + * quote character seen. The extracted string does NOT include the quote characters. 
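+ * Illustrative behaviour: for '"hello world" rest' the value hello world is
+ * extracted (quotes stripped); unquoted input such as "token rest" falls back
+ * to word semantics and yields token.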
+ * swisskid, 2015-01-21 + */ +PARSER_Parse(OpQuotedString) + const char *c; + size_t i; + char *cstr = NULL; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + if(c[i] != '"') { + while(i < npb->strLen && c[i] != ' ') + i++; + + if(i == *offs) + goto done; + + /* success, persist */ + *parsed = i - *offs; + /* create JSON value to save quoted string contents */ + CHKN(cstr = strndup((char*)c + *offs, *parsed)); + } else { + ++i; + + /* search end of string */ + while(i < npb->strLen && c[i] != '"') + i++; + + if(i == npb->strLen || c[i] != '"') + goto done; + /* success, persist */ + *parsed = i + 1 - *offs; /* "eat" terminal double quote */ + /* create JSON value to save quoted string contents */ + CHKN(cstr = strndup((char*)c + *offs + 1, *parsed - 2)); + } + CHKN(*value = json_object_new_string(cstr)); + + r = 0; /* success */ +done: + free(cstr); + return r; +} + + +/** + * Parse a quoted string. In this initial implementation, escaping of the quote + * char is not supported. A quoted string is one start starts with a double quote, + * has some text (not containing double quotes) and ends with the first double + * quote character seen. The extracted string does NOT include the quote characters. + * rgerhards, 2011-01-14 + */ +PARSER_Parse(QuotedString) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + if(i + 2 > npb->strLen) + goto done; /* needs at least 2 characters */ + + if(c[i] != '"') + goto done; + ++i; + + /* search end of string */ + while(i < npb->strLen && c[i] != '"') + i++; + + if(i == npb->strLen || c[i] != '"') + goto done; + + /* success, persist */ + *parsed = i + 1 - *offs; /* "eat" terminal double quote */ + /* create JSON value to save quoted string contents */ + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse an ISO date, that is YYYY-MM-DD (exactly this format). + * Note: we do manual loop unrolling -- this is fast AND efficient. + * rgerhards, 2011-01-14 + */ +PARSER_Parse(ISODate) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + if(*offs+10 > npb->strLen) + goto done; /* if it is not 10 chars, it can't be an ISO date */ + + /* year */ + if(!myisdigit(c[i])) goto done; + if(!myisdigit(c[i+1])) goto done; + if(!myisdigit(c[i+2])) goto done; + if(!myisdigit(c[i+3])) goto done; + if(c[i+4] != '-') goto done; + /* month */ + if(c[i+5] == '0') { + if(c[i+6] < '1' || c[i+6] > '9') goto done; + } else if(c[i+5] == '1') { + if(c[i+6] < '0' || c[i+6] > '2') goto done; + } else { + goto done; + } + if(c[i+7] != '-') goto done; + /* day */ + if(c[i+8] == '0') { + if(c[i+9] < '1' || c[i+9] > '9') goto done; + } else if(c[i+8] == '1' || c[i+8] == '2') { + if(!myisdigit(c[i+9])) goto done; + } else if(c[i+8] == '3') { + if(c[i+9] != '0' && c[i+9] != '1') goto done; + } else { + goto done; + } + + /* success, persist */ + *parsed = 10; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + +/** + * Parse a Cisco interface spec. 
Sample for such a spec are: + * outside:192.168.52.102/50349 + * inside:192.168.1.15/56543 (192.168.1.112/54543) + * outside:192.168.1.13/50179 (192.168.1.13/50179)(LOCAL\some.user) + * outside:192.168.1.25/41850(LOCAL\RG-867G8-DEL88D879BBFFC8) + * inside:192.168.1.25/53 (192.168.1.25/53) (some.user) + * 192.168.1.15/0(LOCAL\RG-867G8-DEL88D879BBFFC8) + * From this, we conclude the format is: + * [interface:]ip/port [SP (ip2/port2)] [[SP](username)] + * In order to match, this syntax must start on a non-whitespace char + * other than colon. + */ +PARSER_Parse(CiscoInterfaceSpec) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + if(c[i] == ':' || isspace(c[i])) goto done; + + /* first, check if we have an interface. We do this by trying + * to detect if we have an IP. If we have, obviously no interface + * is present. Otherwise, we check if we have a valid interface. + */ + int bHaveInterface = 0; + size_t idxInterface = 0; + size_t lenInterface = 0; + int bHaveIP = 0; + size_t lenIP; + size_t idxIP = i; + if(ln_v2_parseIPv4(npb, &i, NULL, &lenIP, NULL) == 0) { + bHaveIP = 1; + i += lenIP - 1; /* position on delimiter */ + } else { + idxInterface = i; + while(i < npb->strLen) { + if(isspace(c[i])) goto done; + if(c[i] == ':') + break; + ++i; + } + lenInterface = i - idxInterface; + bHaveInterface = 1; + } + if(i == npb->strLen) goto done; + ++i; /* skip over colon */ + + /* we now utilize our other parser helpers */ + if(!bHaveIP) { + idxIP = i; + if(ln_v2_parseIPv4(npb, &i, NULL, &lenIP, NULL) != 0) goto done; + i += lenIP; + } + if(i == npb->strLen || c[i] != '/') goto done; + ++i; /* skip slash */ + const size_t idxPort = i; + size_t lenPort; + if(ln_v2_parseNumber(npb, &i, NULL, &lenPort, NULL) != 0) goto done; + i += lenPort; + if(i == npb->strLen) goto success; + + /* check if optional second ip/port is present + * We assume we must at least have 5 chars [" (::1)"] + */ + int bHaveIP2 = 0; + size_t idxIP2 = 0, lenIP2 = 0; + size_t idxPort2 = 0, lenPort2 = 0; + if(i+5 < npb->strLen && c[i] == ' ' && c[i+1] == '(') { + size_t iTmp = i+2; /* skip over " (" */ + idxIP2 = iTmp; + if(ln_v2_parseIPv4(npb, &iTmp, NULL, &lenIP2, NULL) == 0) { + iTmp += lenIP2; + if(i < npb->strLen || c[iTmp] == '/') { + ++iTmp; /* skip slash */ + idxPort2 = iTmp; + if(ln_v2_parseNumber(npb, &iTmp, NULL, &lenPort2, NULL) == 0) { + iTmp += lenPort2; + if(iTmp < npb->strLen && c[iTmp] == ')') { + i = iTmp + 1; /* match, so use new index */ + bHaveIP2 = 1; + } + } + } + } + } + + /* check if optional username is present + * We assume we must at least have 3 chars ["(n)"] + */ + int bHaveUser = 0; + size_t idxUser = 0; + size_t lenUser = 0; + if( (i+2 < npb->strLen && c[i] == '(' && !isspace(c[i+1]) ) + || (i+3 < npb->strLen && c[i] == ' ' && c[i+1] == '(' && !isspace(c[i+2])) ) { + idxUser = i + ((c[i] == ' ') ? 
2 : 1); /* skip [SP]'(' */ + size_t iTmp = idxUser; + while(iTmp < npb->strLen && !isspace(c[iTmp]) && c[iTmp] != ')') + ++iTmp; /* just scan */ + if(iTmp < npb->strLen && c[iTmp] == ')') { + i = iTmp + 1; /* we have a match, so use new index */ + bHaveUser = 1; + lenUser = iTmp - idxUser; + } + } + + /* all done, save data */ + if(value == NULL) + goto success; + + CHKN(*value = json_object_new_object()); + json_object *json; + if(bHaveInterface) { + CHKN(json = json_object_new_string_len(c+idxInterface, lenInterface)); + json_object_object_add_ex(*value, "interface", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + CHKN(json = json_object_new_string_len(c+idxIP, lenIP)); + json_object_object_add_ex(*value, "ip", json, JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + CHKN(json = json_object_new_string_len(c+idxPort, lenPort)); + json_object_object_add_ex(*value, "port", json, JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + if(bHaveIP2) { + CHKN(json = json_object_new_string_len(c+idxIP2, lenIP2)); + json_object_object_add_ex(*value, "ip2", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + CHKN(json = json_object_new_string_len(c+idxPort2, lenPort2)); + json_object_object_add_ex(*value, "port2", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + if(bHaveUser) { + CHKN(json = json_object_new_string_len(c+idxUser, lenUser)); + json_object_object_add_ex(*value, "user", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + +success: /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ +done: + if(r != 0 && value != NULL && *value != NULL) { + json_object_put(*value); + *value = NULL; /* to be on the save side */ + } + return r; +} + +/** + * Parse a duration. A duration is similar to a timestamp, except that + * it tells about time elapsed. As such, hours can be larger than 23 + * and hours may also be specified by a single digit (this, for example, + * is commonly done in Cisco software). + * Note: we do manual loop unrolling -- this is fast AND efficient. + */ +PARSER_Parse(Duration) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + /* hour is a bit tricky */ + if(!myisdigit(c[i])) goto done; + ++i; + if(myisdigit(c[i])) + ++i; + if(c[i] == ':') + ++i; + else + goto done; + + if(i+5 > npb->strLen) + goto done;/* if it is not 5 chars from here, it can't be us */ + + if(c[i] < '0' || c[i] > '5') goto done; + if(!myisdigit(c[i+1])) goto done; + if(c[i+2] != ':') goto done; + if(c[i+3] < '0' || c[i+3] > '5') goto done; + if(!myisdigit(c[i+4])) goto done; + + /* success, persist */ + *parsed = (i + 5) - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + +/** + * Parse a timestamp in 24hr format (exactly HH:MM:SS). + * Note: we do manual loop unrolling -- this is fast AND efficient. 
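+ * Illustrative behaviour: "23:07:59" matches; "24:00:00" and "7:05:00"
+ * (single-digit hour) do not, as exactly eight characters HH:MM:SS are required.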
+ * rgerhards, 2011-01-14 + */ +PARSER_Parse(Time24hr) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + if(*offs+8 > npb->strLen) + goto done; /* if it is not 8 chars, it can't be us */ + + /* hour */ + if(c[i] == '0' || c[i] == '1') { + if(!myisdigit(c[i+1])) goto done; + } else if(c[i] == '2') { + if(c[i+1] < '0' || c[i+1] > '3') goto done; + } else { + goto done; + } + /* TODO: the code below is a duplicate of 24hr parser - create common function */ + if(c[i+2] != ':') goto done; + if(c[i+3] < '0' || c[i+3] > '5') goto done; + if(!myisdigit(c[i+4])) goto done; + if(c[i+5] != ':') goto done; + if(c[i+6] < '0' || c[i+6] > '5') goto done; + if(!myisdigit(c[i+7])) goto done; + + /* success, persist */ + *parsed = 8; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + +/** + * Parse a timestamp in 12hr format (exactly HH:MM:SS). + * Note: we do manual loop unrolling -- this is fast AND efficient. + * TODO: the code below is a duplicate of 24hr parser - create common function? + * rgerhards, 2011-01-14 + */ +PARSER_Parse(Time12hr) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = npb->str; + i = *offs; + + if(*offs+8 > npb->strLen) + goto done; /* if it is not 8 chars, it can't be us */ + + /* hour */ + if(c[i] == '0') { + if(!myisdigit(c[i+1])) goto done; + } else if(c[i] == '1') { + if(c[i+1] < '0' || c[i+1] > '2') goto done; + } else { + goto done; + } + if(c[i+2] != ':') goto done; + if(c[i+3] < '0' || c[i+3] > '5') goto done; + if(!myisdigit(c[i+4])) goto done; + if(c[i+5] != ':') goto done; + if(c[i+6] < '0' || c[i+6] > '5') goto done; + if(!myisdigit(c[i+7])) goto done; + + /* success, persist */ + *parsed = 8; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + + +/* helper to IPv4 address parser, checks the next set of numbers. + * Syntax 1 to 3 digits, value together not larger than 255. + * @param[in] npb->str parse buffer + * @param[in/out] offs offset into buffer, updated if successful + * @return 0 if OK, 1 otherwise + */ +static int +chkIPv4AddrByte(npb_t *const npb, size_t *offs) +{ + int val = 0; + int r = 1; /* default: done -- simplifies things */ + const char *c; + size_t i = *offs; + + c = npb->str; + if(i == npb->strLen || !myisdigit(c[i])) + goto done; + val = c[i++] - '0'; + if(i < npb->strLen && myisdigit(c[i])) { + val = val * 10 + c[i++] - '0'; + if(i < npb->strLen && myisdigit(c[i])) + val = val * 10 + c[i++] - '0'; + } + if(val > 255) /* cannot be a valid IP address byte! */ + goto done; + + *offs = i; + r = 0; +done: + return r; +} + +/** + * Parser for IPv4 addresses. + */ +PARSER_Parse(IPv4) + const char *c; + size_t i; + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + i = *offs; + if(i + 7 > npb->strLen) { + /* IPv4 addr requires at least 7 characters */ + goto done; + } + c = npb->str; + + /* byte 1*/ + if(chkIPv4AddrByte(npb, &i) != 0) goto done; + if(i == npb->strLen || c[i++] != '.') goto done; + /* byte 2*/ + if(chkIPv4AddrByte(npb, &i) != 0) goto done; + if(i == npb->strLen || c[i++] != '.') goto done; + /* byte 3*/ + if(chkIPv4AddrByte(npb, &i) != 0) goto done; + if(i == npb->strLen || c[i++] != '.') goto done; + /* byte 4 - we do NOT need any char behind it! 
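+	 * (for illustration: an address right at the end of the message,
+	 * e.g. "10.1.2.3", is accepted as well)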
*/ + if(chkIPv4AddrByte(npb, &i) != 0) goto done; + + /* if we reach this point, we found a valid IP address */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + + +/* skip past the IPv6 address block, parse pointer is set to + * first char after the block. Returns an error if already at end + * of string. + * @param[in] npb->str parse buffer + * @param[in/out] offs offset into buffer, updated if successful + * @return 0 if OK, 1 otherwise + */ +static int +skipIPv6AddrBlock(npb_t *const npb, + size_t *const __restrict__ offs) +{ + int j; + if(*offs == npb->strLen) + return 1; + + for(j = 0 ; j < 4 && *offs+j < npb->strLen && isxdigit(npb->str[*offs+j]) ; ++j) + /*just skip*/ ; + + *offs += j; + return 0; +} + +/** + * Parser for IPv6 addresses. + * Bases on RFC4291 Section 2.2. The address must be followed + * by whitespace or end-of-string, else it is not considered + * a valid address. This prevents false positives. + */ +PARSER_Parse(IPv6) + const char *c; + size_t i; + size_t beginBlock; /* last block begin in case we need IPv4 parsing */ + int hasIPv4 = 0; + int nBlocks = 0; /* how many blocks did we already have? */ + int bHad0Abbrev = 0; /* :: already used? */ + + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + i = *offs; + if(i + 2 > npb->strLen) { + /* IPv6 addr requires at least 2 characters ("::") */ + goto done; + } + c = npb->str; + + /* check that first block is non-empty */ + if(! ( isxdigit(c[i]) || (c[i] == ':' && c[i+1] == ':') ) ) + goto done; + + /* try for all potential blocks plus one more (so we see errors!) */ + for(int j = 0 ; j < 9 ; ++j) { + beginBlock = i; + if(skipIPv6AddrBlock(npb, &i) != 0) goto done; + nBlocks++; + if(i == npb->strLen) goto chk_ok; + if(isspace(c[i])) goto chk_ok; + if(c[i] == '.'){ /* IPv4 processing! */ + hasIPv4 = 1; + break; + } + if(c[i] != ':') goto done; + i++; /* "eat" ':' */ + if(i == npb->strLen) goto chk_ok; + /* check for :: */ + if(bHad0Abbrev) { + if(c[i] == ':') goto done; + } else { + if(c[i] == ':') { + bHad0Abbrev = 1; + ++i; + if(i == npb->strLen) goto chk_ok; + } + } + } + + if(hasIPv4) { + size_t ipv4_parsed; + --nBlocks; + /* prevent pure IPv4 address to be recognized */ + if(beginBlock == *offs) goto done; + i = beginBlock; + if(ln_v2_parseIPv4(npb, &i, NULL, &ipv4_parsed, NULL) != 0) + goto done; + i += ipv4_parsed; + } + +chk_ok: /* we are finished parsing, check if things are ok */ + if(nBlocks > 8) goto done; + if(bHad0Abbrev && nBlocks >= 8) goto done; + /* now check if trailing block is missing. Note that i is already + * on next character, so we need to go two back. Two are always + * present, else we would not reach this code here. + */ + if(c[i-1] == ':' && c[i-2] != ':') goto done; + + /* if we reach this point, we found a valid IP address */ + *parsed = i - *offs; + if(value != NULL) { + *value = json_object_new_string_len(npb->str+(*offs), *parsed); + } + r = 0; /* success */ +done: + return r; +} + +/* check if a char is valid inside a name of the iptables motif. + * We try to keep the set as slim as possible, because the iptables + * parser may otherwise create a very broad match (especially the + * inclusion of simple words like "DF" cause grief here). + * Note: we have taken the permitted set from iptables log samples. + * Report bugs if we missed some additional rules. 
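+ * For illustration: in "IN=eth0 OUT= MAC=00:11:22:33:44:55 SRC=10.0.0.1" the
+ * names IN, OUT, MAC and SRC are accepted, while lower- or mixed-case names
+ * are not.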
+ */ +static inline int +isValidIPTablesNameChar(const char c) +{ + /* right now, upper case only is valid */ + return ('A' <= c && c <= 'Z') ? 1 : 0; +} + +/* helper to iptables parser, parses out a a single name=value pair + */ +static int +parseIPTablesNameValue(npb_t *const npb, + size_t *const __restrict__ offs, + struct json_object *const __restrict__ valroot) +{ + int r = LN_WRONGPARSER; + size_t i = *offs; + char *name = NULL; + + const size_t iName = i; + while(i < npb->strLen && isValidIPTablesNameChar(npb->str[i])) + ++i; + if(i == iName || (i < npb->strLen && npb->str[i] != '=' && npb->str[i] != ' ')) + goto done; /* no name at all! */ + + const ssize_t lenName = i - iName; + + ssize_t iVal = -1; + size_t lenVal = i - iVal; + if(i < npb->strLen && npb->str[i] != ' ') { + /* we have a real value (not just a flag name like "DF") */ + ++i; /* skip '=' */ + iVal = i; + while(i < npb->strLen && !isspace(npb->str[i])) + ++i; + lenVal = i - iVal; + } + + /* parsing OK */ + *offs = i; + r = 0; + + if(valroot == NULL) + goto done; + + CHKN(name = malloc(lenName+1)); + memcpy(name, npb->str+iName, lenName); + name[lenName] = '\0'; + json_object *json; + if(iVal == -1) { + json = NULL; + } else { + CHKN(json = json_object_new_string_len(npb->str+iVal, lenVal)); + } + json_object_object_add(valroot, name, json); +done: + free(name); + return r; +} + +/** + * Parser for iptables logs (the structured part). + * This parser is named "v2-iptables" because of a traditional + * parser named "iptables", which we do not want to replace, at + * least right now (we may re-think this before the first release). + * For performance reasons, this works in two stages. In the first + * stage, we only detect if the motif is correct. The second stage is + * only called when we know it is. In it, we go once again over the + * message again and actually extract the data. This is done because + * data extraction is relatively expensive and in most cases we will + * have much more frequent mismatches than matches. + * Note that this motif must have at least one field, otherwise it + * could detect things that are not iptables to be it. Further limits + * may be imposed in the future as we see additional need. + * added 2015-04-30 rgerhards + */ +PARSER_Parse(v2IPTables) + size_t i = *offs; + int nfields = 0; + + /* stage one */ + while(i < npb->strLen) { + CHKR(parseIPTablesNameValue(npb, &i, NULL)); + ++nfields; + /* exactly one SP is permitted between fields */ + if(i < npb->strLen && npb->str[i] == ' ') + ++i; + } + + if(nfields < 2) { + FAIL(LN_WRONGPARSER); + } + + /* success, persist */ + *parsed = i - *offs; + r = 0; + + /* stage two */ + if(value == NULL) + goto done; + + i = *offs; + CHKN(*value = json_object_new_object()); + while(i < npb->strLen) { + CHKR(parseIPTablesNameValue(npb, &i, *value)); + while(i < npb->strLen && isspace(npb->str[i])) + ++i; + } + +done: + if(r != 0 && value != NULL && *value != NULL) { + json_object_put(*value); + *value = NULL; + } + return r; +} + +/** + * Parse JSON. This parser tries to find JSON data inside a message. + * If it finds valid JSON, it will extract it. Extra data after the + * JSON is permitted. + * Note: the json-c JSON parser treats whitespace after the actual + * json to be part of the json. So in essence, any whitespace is + * processed by this parser. We use the same semantics to keep things + * neatly in sync. 
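+ * (For illustration: given '{"pid":42} trailing text', the leading object is
+ * extracted and "trailing text" is left for the parsers that follow.)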
If json-c changes for some reason or we switch to + * an alternate json lib, we probably need to be sure to keep that + * behaviour, and probably emulate it. + * added 2015-04-28 by rgerhards, v1.1.2 + */ +PARSER_Parse(JSON) + const size_t i = *offs; + struct json_tokener *tokener = NULL; + + if(npb->str[i] != '{' && npb->str[i] != ']') { + /* this can't be json, see RFC4627, Sect. 2 + * see this bug in json-c: + * https://github.com/json-c/json-c/issues/181 + * In any case, it's better to do this quick check, + * even if json-c did not have the bug because this + * check here is much faster than calling the parser. + */ + goto done; + } + + if((tokener = json_tokener_new()) == NULL) + goto done; + + struct json_object *const json + = json_tokener_parse_ex(tokener, npb->str+i, (int) (npb->strLen - i)); + + if(json == NULL) + goto done; + + /* success, persist */ + *parsed = (i + tokener->char_offset) - *offs; + r = 0; /* success */ + + if(value == NULL) { + json_object_put(json); + } else { + *value = json; + } + +done: + if(tokener != NULL) + json_tokener_free(tokener); + return r; +} + + +/* check if a char is valid inside a name of a NameValue list + * The set of valid characters may be extended if there is good + * need to do so. We have selected the current set carefully, but + * may have overlooked some cases. + */ +static inline int +isValidNameChar(const char c) +{ + return (isalnum(c) + || c == '.' + || c == '_' + || c == '-' + ) ? 1 : 0; +} +/* helper to NameValue parser, parses out a a single name=value pair + * + * name must be alphanumeric characters, value must be non-whitespace + * characters, if quoted than with symmetric quotes. Supported formats + * - name=value + * - name="value" + * - name='value' + * Note "name=" is valid and means a field with empty value. + * TODO: so far, quote characters are not permitted WITHIN quoted values. + */ +static int +parseNameValue(npb_t *const npb, + size_t *const __restrict__ offs, + struct json_object *const __restrict__ valroot) +{ + int r = LN_WRONGPARSER; + size_t i = *offs; + char *name = NULL; + + const size_t iName = i; + while(i < npb->strLen && isValidNameChar(npb->str[i])) + ++i; + if(i == iName || npb->str[i] != '=') + goto done; /* no name at all! */ + + const size_t lenName = i - iName; + ++i; /* skip '=' */ + + const size_t iVal = i; + while(i < npb->strLen && !isspace(npb->str[i])) + ++i; + const size_t lenVal = i - iVal; + + /* parsing OK */ + *offs = i; + r = 0; + + if(valroot == NULL) + goto done; + + CHKN(name = malloc(lenName+1)); + memcpy(name, npb->str+iName, lenName); + name[lenName] = '\0'; + json_object *json; + CHKN(json = json_object_new_string_len(npb->str+iVal, lenVal)); + json_object_object_add(valroot, name, json); +done: + free(name); + return r; +} + +/** + * Parse CEE syslog. + * This essentially is a JSON parser, with additional restrictions: + * The message must start with "@cee:" and json must immediately follow (whitespace permitted). + * after the JSON, there must be no other non-whitespace characters. + * In other words: the message must consist of a single JSON object, + * only. 
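+ * Illustrative behaviour: '@cee: {"event":"login"}' matches, while
+ * '@cee: [1,2]' (an array) and '@cee: {} trailing' (extra text after the
+ * object) do not.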
+ * added 2015-04-28 by rgerhards, v1.1.2 + */ +PARSER_Parse(CEESyslog) + size_t i = *offs; + struct json_tokener *tokener = NULL; + struct json_object *json = NULL; + + if(npb->strLen < i + 7 || /* "@cee:{}" is minimum text */ + npb->str[i] != '@' || + npb->str[i+1] != 'c' || + npb->str[i+2] != 'e' || + npb->str[i+3] != 'e' || + npb->str[i+4] != ':') + goto done; + + /* skip whitespace */ + for(i += 5 ; i < npb->strLen && isspace(npb->str[i]) ; ++i) + /* just skip */; + + if(i == npb->strLen || npb->str[i] != '{') + goto done; + /* note: we do not permit arrays in CEE mode */ + + if((tokener = json_tokener_new()) == NULL) + goto done; + + json = json_tokener_parse_ex(tokener, npb->str+i, (int) (npb->strLen - i)); + + if(json == NULL) + goto done; + + if(i + tokener->char_offset != npb->strLen) + goto done; + + /* success, persist */ + *parsed = npb->strLen; + r = 0; /* success */ + + if(value != NULL) { + *value = json; + json = NULL; /* do NOT free below! */ + } + +done: + if(tokener != NULL) + json_tokener_free(tokener); + if(json != NULL) + json_object_put(json); + return r; +} + +/** + * Parser for name/value pairs. + * On entry must point to alnum char. All following chars must be + * name/value pairs delimited by whitespace up until the end of string. + * For performance reasons, this works in two stages. In the first + * stage, we only detect if the motif is correct. The second stage is + * only called when we know it is. In it, we go once again over the + * message again and actually extract the data. This is done because + * data extraction is relatively expensive and in most cases we will + * have much more frequent mismatches than matches. + * added 2015-04-25 rgerhards + */ +PARSER_Parse(NameValue) + size_t i = *offs; + + /* stage one */ + while(i < npb->strLen) { + CHKR(parseNameValue(npb, &i, NULL)); + while(i < npb->strLen && isspace(npb->str[i])) + ++i; + } + + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ + + /* stage two */ + if(value == NULL) + goto done; + + i = *offs; + CHKN(*value = json_object_new_object()); + while(i < npb->strLen) { + CHKR(parseNameValue(npb, &i, *value)); + while(i < npb->strLen && isspace(npb->str[i])) + ++i; + } + + /* TODO: fix mem leak if alloc json fails */ + +done: + return r; +} + +/** + * Parse a MAC layer address. + * The standard (IEEE 802) format for printing MAC-48 addresses in + * human-friendly form is six groups of two hexadecimal digits, + * separated by hyphens (-) or colons (:), in transmission order + * (e.g. 01-23-45-67-89-ab or 01:23:45:67:89:ab ). + * This form is also commonly used for EUI-64. + * from: http://en.wikipedia.org/wiki/MAC_address + * + * This parser must start on a hex digit. 
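+ * Illustrative behaviour: "01:23:45:67:89:ab" and "01-23-45-67-89-ab" match;
+ * the delimiter must be used consistently, so "01:23-45:67:89:ab" is rejected.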
+ * added 2015-05-04 by rgerhards, v1.1.2 + */ +PARSER_Parse(MAC48) + size_t i = *offs; + char delim; + + if(npb->strLen < i + 17 || /* this motif has exactly 17 characters */ + !isxdigit(npb->str[i]) || + !isxdigit(npb->str[i+1]) + ) + FAIL(LN_WRONGPARSER); + + if(npb->str[i+2] == ':') + delim = ':'; + else if(npb->str[i+2] == '-') + delim = '-'; + else + FAIL(LN_WRONGPARSER); + + /* first byte ok */ + if(!isxdigit(npb->str[i+3]) || + !isxdigit(npb->str[i+4]) || + npb->str[i+5] != delim || /* 2nd byte ok */ + !isxdigit(npb->str[i+6]) || + !isxdigit(npb->str[i+7]) || + npb->str[i+8] != delim || /* 3rd byte ok */ + !isxdigit(npb->str[i+9]) || + !isxdigit(npb->str[i+10]) || + npb->str[i+11] != delim || /* 4th byte ok */ + !isxdigit(npb->str[i+12]) || + !isxdigit(npb->str[i+13]) || + npb->str[i+14] != delim || /* 5th byte ok */ + !isxdigit(npb->str[i+15]) || + !isxdigit(npb->str[i+16]) /* 6th byte ok */ + ) + FAIL(LN_WRONGPARSER); + + /* success, persist */ + *parsed = 17; + r = 0; /* success */ + + if(value != NULL) { + CHKN(*value = json_object_new_string_len(npb->str+i, 17)); + } + +done: + return r; +} + + +/* This parses the extension value and updates the index + * to point to the end of it. + */ +static int +cefParseExtensionValue(npb_t *const npb, + size_t *__restrict__ iEndVal) +{ + int r = 0; + size_t i = *iEndVal; + size_t iLastWordBegin; + /* first find next unquoted equal sign and record begin of + * last word in front of it - this is the actual end of the + * current name/value pair and the begin of the next one. + */ + int hadSP = 0; + int inEscape = 0; + for(iLastWordBegin = 0 ; i < npb->strLen ; ++i) { + if(inEscape) { + if(npb->str[i] != '=' && + npb->str[i] != '\\' && + npb->str[i] != 'r' && + npb->str[i] != 'n') + FAIL(LN_WRONGPARSER); + inEscape = 0; + } else { + if(npb->str[i] == '=') { + break; + } else if(npb->str[i] == '\\') { + inEscape = 1; + } else if(npb->str[i] == ' ') { + hadSP = 1; + } else { + if(hadSP) { + iLastWordBegin = i; + hadSP = 0; + } + } + } + } + + /* Note: iLastWordBegin can never be at offset zero, because + * the CEF header starts there! + */ + if(i < npb->strLen) { + *iEndVal = (iLastWordBegin == 0) ? i : iLastWordBegin - 1; + } else { + *iEndVal = i; + } +done: + return r; +} + +/* must be positioned on first char of name, returns index + * of end of name. + * Note: ArcSight violates the CEF spec ifself: they generate + * leading underscores in their extension names, which are + * definetly not alphanumeric. We still accept them... + * They also seem to use dots. + */ +static int +cefParseName(npb_t *const npb, + size_t *const __restrict__ i) +{ + int r = 0; + while(*i < npb->strLen && npb->str[*i] != '=') { + if(!(isalnum(npb->str[*i]) || npb->str[*i] == '_' || npb->str[*i] == '.')) + FAIL(LN_WRONGPARSER); + ++(*i); + } +done: + return r; +} + +/* parse CEF extensions. They are basically name=value + * pairs with the ugly exception that values may contain + * spaces but need NOT to be quoted. Thankfully, at least + * names are specified as being alphanumeric without spaces + * in them. So we must add a lookahead parser to check if + * a word is a name (and thus the begin of a new pair) or + * not. This is done by subroutines. 
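+ * For illustration: in "msg=failed login for user dst=10.0.0.1" the value of
+ * msg is "failed login for user", because "dst" is the last word before the
+ * next unquoted '=' and therefore starts the following pair.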
+ */ +static int +cefParseExtensions(npb_t *const npb, + size_t *const __restrict__ offs, + json_object *const __restrict__ jroot) +{ + int r = 0; + size_t i = *offs; + size_t iName, lenName; + size_t iValue, lenValue; + char *name = NULL; + char *value = NULL; + + while(i < npb->strLen) { + while(i < npb->strLen && npb->str[i] == ' ') + ++i; + iName = i; + CHKR(cefParseName(npb, &i)); + if(i+1 >= npb->strLen || npb->str[i] != '=') + FAIL(LN_WRONGPARSER); + lenName = i - iName; + ++i; /* skip '=' */ + + iValue = i; + CHKR(cefParseExtensionValue(npb, &i)); + lenValue = i - iValue; + + ++i; /* skip past value */ + + if(jroot != NULL) { + CHKN(name = malloc(sizeof(char) * (lenName + 1))); + memcpy(name, npb->str+iName, lenName); + name[lenName] = '\0'; + CHKN(value = malloc(sizeof(char) * (lenValue + 1))); + /* copy value but escape it */ + size_t iDst = 0; + for(size_t iSrc = 0 ; iSrc < lenValue ; ++iSrc) { + if(npb->str[iValue+iSrc] == '\\') { + ++iSrc; /* we know the next char must exist! */ + switch(npb->str[iValue+iSrc]) { + case '=': value[iDst] = '='; + break; + case 'n': value[iDst] = '\n'; + break; + case 'r': value[iDst] = '\r'; + break; + case '\\': value[iDst] = '\\'; + break; + default: break; + } + } else { + value[iDst] = npb->str[iValue+iSrc]; + } + ++iDst; + } + value[iDst] = '\0'; + json_object *json; + CHKN(json = json_object_new_string(value)); + json_object_object_add(jroot, name, json); + free(name); name = NULL; + free(value); value = NULL; + } + } + + *offs = npb->strLen; /* this parser consume everything or fails */ + +done: + free(name); + free(value); + return r; +} + +/* gets a CEF header field. Must be positioned on the + * first char after the '|' in front of field. + * Note that '|' may be escaped as "\|", which also means + * we need to supprot "\\" (see CEF spec for details). + * We return the string in *val, if val is non-null. In + * that case we allocate memory that the caller must free. + * This is necessary because there are potentially escape + * sequences inside the string. + */ +static int +cefGetHdrField(npb_t *const npb, + size_t *const __restrict__ offs, + char **val) +{ + int r = 0; + size_t i = *offs; + assert(npb->str[i] != '|'); + while(i < npb->strLen && npb->str[i] != '|') { + if(npb->str[i] == '\\') { + ++i; /* skip esc char */ + if(npb->str[i] != '\\' && npb->str[i] != '|') + FAIL(LN_WRONGPARSER); + } + ++i; /* scan to next delimiter */ + } + + if(npb->str[i] != '|') + FAIL(LN_WRONGPARSER); + + const size_t iBegin = *offs; + /* success, persist */ + *offs = i + 1; + + if(val == NULL) { + r = 0; + goto done; + } + + const size_t len = i - iBegin; + CHKN(*val = malloc(len + 1)); + size_t iDst = 0; + for(size_t iSrc = 0 ; iSrc < len ; ++iSrc) { + if(npb->str[iBegin+iSrc] == '\\') + ++iSrc; /* we already checked above that this is OK! */ + (*val)[iDst++] = npb->str[iBegin+iSrc]; + } + (*val)[iDst] = 0; + r = 0; +done: + return r; +} + +/** + * Parser for ArcSight Common Event Format (CEF) version 0. 
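+ * Illustrative behaviour: a record such as
+ *   CEF:0|Vendor|Product|1.0|sig-42|Some Name|5|src=10.0.0.1 msg=hi
+ * yields the header fields DeviceVendor, DeviceProduct, DeviceVersion,
+ * SignatureID, Name and Severity plus an "Extensions" object holding src and msg.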
+ * added 2015-05-05 by rgerhards, v1.1.2 + */ +PARSER_Parse(CEF) + size_t i = *offs; + char *vendor = NULL; + char *product = NULL; + char *version = NULL; + char *sigID = NULL; + char *name = NULL; + char *severity = NULL; + + /* minumum header: "CEF:0|x|x|x|x|x|x|" --> 17 chars */ + if(npb->strLen < i + 17 || + npb->str[i] != 'C' || + npb->str[i+1] != 'E' || + npb->str[i+2] != 'F' || + npb->str[i+3] != ':' || + npb->str[i+4] != '0' || + npb->str[i+5] != '|' + ) FAIL(LN_WRONGPARSER); + + i += 6; /* position on '|' */ + + CHKR(cefGetHdrField(npb, &i, (value == NULL) ? NULL : &vendor)); + CHKR(cefGetHdrField(npb, &i, (value == NULL) ? NULL : &product)); + CHKR(cefGetHdrField(npb, &i, (value == NULL) ? NULL : &version)); + CHKR(cefGetHdrField(npb, &i, (value == NULL) ? NULL : &sigID)); + CHKR(cefGetHdrField(npb, &i, (value == NULL) ? NULL : &name)); + CHKR(cefGetHdrField(npb, &i, (value == NULL) ? NULL : &severity)); + ++i; /* skip over terminal '|' */ + + /* OK, we now know we have a good header. Now, we need + * to process extensions. + * This time, we do NOT pre-process the extension, but rather + * persist them directly to JSON. This is contrary to other + * parsers, but as the CEF header is pretty unique, this time + * it is exteremely unlike we will get a no-match during + * extension processing. Even if so, nothing bad happens, as + * the extracted data is discarded. But the regular case saves + * us processing time and complexity. The only time when we + * cannot directly process it is when the caller asks us not + * to persist the data. So this must be handled differently. + */ + size_t iBeginExtensions = i; + CHKR(cefParseExtensions(npb, &i, NULL)); + + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ + + if(value != NULL) { + CHKN(*value = json_object_new_object()); + json_object *json; + CHKN(json = json_object_new_string(vendor)); + json_object_object_add(*value, "DeviceVendor", json); + CHKN(json = json_object_new_string(product)); + json_object_object_add(*value, "DeviceProduct", json); + CHKN(json = json_object_new_string(version)); + json_object_object_add(*value, "DeviceVersion", json); + CHKN(json = json_object_new_string(sigID)); + json_object_object_add(*value, "SignatureID", json); + CHKN(json = json_object_new_string(name)); + json_object_object_add(*value, "Name", json); + CHKN(json = json_object_new_string(severity)); + json_object_object_add(*value, "Severity", json); + + json_object *jext; + CHKN(jext = json_object_new_object()); + json_object_object_add(*value, "Extensions", jext); + + i = iBeginExtensions; + cefParseExtensions(npb, &i, jext); + } + +done: + if(r != 0 && value != NULL && *value != NULL) { + json_object_put(*value); + value = NULL; + } + free(vendor); + free(product); + free(version); + free(sigID); + free(name); + free(severity); + return r; +} + +/** + * Parser for Checkpoint LEA on-disk format. + * added 2015-06-18 by rgerhards, v1.1.2 + */ +PARSER_Parse(CheckpointLEA) + size_t i = *offs; + size_t iName, lenName; + size_t iValue, lenValue; + int foundFields = 0; + char *name = NULL; + char *val = NULL; + + while(i < npb->strLen) { + while(i < npb->strLen && npb->str[i] == ' ') /* skip leading SP */ + ++i; + if(i == npb->strLen) { /* OK if just trailing space */ + if(foundFields == 0) + FAIL(LN_WRONGPARSER); + break; /* we are done with the loop, all processed */ + } else { + ++foundFields; + } + iName = i; + /* TODO: do a stricter check? ... 
but we don't have a spec */ + while(i < npb->strLen && npb->str[i] != ':') { + ++i; + } + if(i+1 >= npb->strLen || npb->str[i] != ':') + FAIL(LN_WRONGPARSER); + lenName = i - iName; + ++i; /* skip ':' */ + + while(i < npb->strLen && npb->str[i] == ' ') /* skip leading SP */ + ++i; + iValue = i; + while(i < npb->strLen && npb->str[i] != ';') { + ++i; + } + if(i+1 > npb->strLen || npb->str[i] != ';') + FAIL(LN_WRONGPARSER); + lenValue = i - iValue; + ++i; /* skip ';' */ + + if(value != NULL) { + CHKN(name = malloc(sizeof(char) * (lenName + 1))); + memcpy(name, npb->str+iName, lenName); + name[lenName] = '\0'; + CHKN(val = malloc(sizeof(char) * (lenValue + 1))); + memcpy(val, npb->str+iValue, lenValue); + val[lenValue] = '\0'; + if(*value == NULL) + CHKN(*value = json_object_new_object()); + json_object *json; + CHKN(json = json_object_new_string(val)); + json_object_object_add(*value, name, json); + free(name); name = NULL; + free(val); val = NULL; + } + } + + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ + +done: + free(name); + free(val); + if(r != 0 && value != NULL && *value != NULL) { + json_object_put(*value); + value = NULL; + } + return r; +} + + +/* helper to repeat parser constructor: checks that dot field name + * is only present if there is one field inside the "parser" list. + * returns 1 if ok, 0 otherwise. + */ +static int +chkNoDupeDotInParserDefs(ln_ctx ctx, struct json_object *parsers) +{ + int r = 1; + int nParsers = 0; + int nDots = 0; + if(json_object_get_type(parsers) == json_type_array) { + const int maxparsers = json_object_array_length(parsers); + for(int i = 0 ; i < maxparsers ; ++i) { + ++nParsers; + struct json_object *const parser + = json_object_array_get_idx(parsers, i); + struct json_object *fname; + json_object_object_get_ex(parser, "name", &fname); + if(fname != NULL) { + if(!strcmp(json_object_get_string(fname), ".")) + ++nDots; + } + } + } + if(nParsers > 1 && nDots > 0) { + ln_errprintf(ctx, 0, "'repeat' parser supports dot name only " + "if single parser is used in 'parser' part, invalid " + "construct: %s", json_object_get_string(parsers)); + r = 0; + } + return r; +} + +/** + * "repeat" special parser. + */ +PARSER_Parse(Repeat) + struct data_Repeat *const data = (struct data_Repeat*) pdata; + struct ln_pdag *endNode = NULL; + size_t strtoffs = *offs; + size_t lastKnownGood = strtoffs; + struct json_object *json_arr = NULL; + const size_t parsedTo_save = npb->parsedTo; + + do { + struct json_object *parsed_value = json_object_new_object(); + r = ln_normalizeRec(npb, data->parser, strtoffs, 1, + parsed_value, &endNode); + strtoffs = npb->parsedTo; + LN_DBGPRINTF(npb->ctx, "repeat parser returns %d, parsed %zu, json: %s", + r, npb->parsedTo, json_object_to_json_string(parsed_value)); + + if(r != 0) { + json_object_put(parsed_value); + if(data->permitMismatchInParser) { + strtoffs = lastKnownGood; /* go back to final match */ + LN_DBGPRINTF(npb->ctx, "mismatch in repeat, " + "parse ptr back to %zd", strtoffs); + goto success; + } else { + goto done; + } + } + + if(json_arr == NULL) { + json_arr = json_object_new_array(); + } + + /* check for name=".", which means we need to place the + * value only into to array. As we do not have direct + * access to the key, we loop over our result as a work- + * around. 
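+		 * For illustration: with a single parser named ".", an iteration
+		 * result like {".":"a"} contributes just "a" to the array, so the
+		 * final value is e.g. ["a","b"] rather than an array of objects.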
+ */ + struct json_object *toAdd = parsed_value; + struct json_object_iterator it = json_object_iter_begin(parsed_value); + struct json_object_iterator itEnd = json_object_iter_end(parsed_value); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(key[0] == '.' && key[1] == '\0') { + json_object_get(val); /* inc refcount! */ + toAdd = val; + } + json_object_iter_next(&it); + } + + json_object_array_add(json_arr, toAdd); + if(toAdd != parsed_value) + json_object_put(parsed_value); + LN_DBGPRINTF(npb->ctx, "arr: %s", json_object_to_json_string(json_arr)); + + /* now check if we shall continue */ + npb->parsedTo = 0; + lastKnownGood = strtoffs; /* record pos in case of fail in while */ + r = ln_normalizeRec(npb, data->while_cond, strtoffs, 1, NULL, &endNode); + LN_DBGPRINTF(npb->ctx, "repeat while returns %d, parsed %zu", + r, npb->parsedTo); + if(r == 0) + strtoffs = npb->parsedTo; + } while(r == 0); + +success: + /* success, persist */ + *parsed = strtoffs - *offs; + if(value == NULL) { + json_object_put(json_arr); + } else { + *value = json_arr; + } + npb->parsedTo = parsedTo_save; + r = 0; /* success */ +done: + if(r != 0 && json_arr != NULL) { + json_object_put(json_arr); + } + return r; +} +PARSER_Construct(Repeat) +{ + int r = 0; + struct data_Repeat *data = (struct data_Repeat*) calloc(1, sizeof(struct data_Repeat)); + struct ln_pdag *endnode; /* we need this fo ln_pdagAddParser, which updates its param! */ + + if(json == NULL) + goto done; + + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcmp(key, "parser")) { + if(chkNoDupeDotInParserDefs(ctx, val) != 1) { + r = LN_BADCONFIG; + goto done; + } + endnode = data->parser = ln_newPDAG(ctx); + json_object_get(val); /* prevent free in pdagAddParser */ + CHKR(ln_pdagAddParser(ctx, &endnode, val)); + endnode->flags.isTerminal = 1; + } else if(!strcmp(key, "while")) { + endnode = data->while_cond = ln_newPDAG(ctx); + json_object_get(val); /* prevent free in pdagAddParser */ + CHKR(ln_pdagAddParser(ctx, &endnode, val)); + endnode->flags.isTerminal = 1; + } else if(!strcasecmp(key, "option.permitMismatchInParser")) { + data->permitMismatchInParser = json_object_get_boolean(val); + } else { + ln_errprintf(ctx, 0, "invalid param for hexnumber: %s", + json_object_to_json_string(val)); + } + json_object_iter_next(&it); + } + +done: + if(data->parser == NULL || data->while_cond == NULL) { + ln_errprintf(ctx, 0, "repeat parser needs 'parser','while' parameters"); + ln_destructRepeat(ctx, data); + r = LN_BADCONFIG; + } else { + *pdata = data; + } + return r; +} +PARSER_Destruct(Repeat) +{ + struct data_Repeat *const data = (struct data_Repeat*) pdata; + if(data->parser != NULL) + ln_pdagDelete(data->parser); + if(data->while_cond != NULL) + ln_pdagDelete(data->while_cond); + free(pdata); +} + + +/* string escaping modes */ +#define ST_ESC_NONE 0 +#define ST_ESC_BACKSLASH 1 +#define ST_ESC_DOUBLE 2 +#define ST_ESC_BOTH 3 +struct data_String { + enum { ST_QUOTE_AUTO = 0, ST_QUOTE_NONE = 1, ST_QUOTE_REQD = 2 } + quoteMode; + struct { + unsigned strip_quotes : 1; + unsigned esc_md : 2; + } flags; + char qchar_begin; + char qchar_end; + char perm_chars[256]; // TODO: make this 
bit-wise, so we need only 32 bytes +}; +static inline void +stringSetPermittedChar(struct data_String *const data, char c, int val) +{ +#if 0 + const int i = (unsigned) c / 8; + const int shft = (unsigned) c % 8; + const unsigned mask = ~(1 << shft); + perm_arr[i] = (perm_arr[i] & (0xff +#endif + data->perm_chars[(unsigned)c] = val; +} +static inline int +stringIsPermittedChar(struct data_String *const data, char c) +{ + return data->perm_chars[(unsigned)c]; +} +static void +stringAddPermittedCharArr(struct data_String *const data, + const char *const optval) +{ + const size_t nchars = strlen(optval); + for(size_t i = 0 ; i < nchars ; ++i) { + stringSetPermittedChar(data, optval[i], 1); + } +} +static void +stringAddPermittedFromTo(struct data_String *const data, + const unsigned char from, + const unsigned char to) +{ + assert(from <= to); + for(size_t i = from ; i <= to ; ++i) { + stringSetPermittedChar(data, (char) i, 1); + } +} +static inline void +stringAddPermittedChars(struct data_String *const data, + struct json_object *const val) +{ + const char *const optval = json_object_get_string(val); + if(optval == NULL) + return; + stringAddPermittedCharArr(data, optval); +} +static void +stringAddPermittedCharsViaArray(ln_ctx ctx, struct data_String *const data, + struct json_object *const arr) +{ + const int nelem = json_object_array_length(arr); + for(int i = 0 ; i < nelem ; ++i) { + struct json_object *const elem + = json_object_array_get_idx(arr, i); + struct json_object_iterator it = json_object_iter_begin(elem); + struct json_object_iterator itEnd = json_object_iter_end(elem); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcasecmp(key, "chars")) { + stringAddPermittedChars(data, val); + } else if(!strcasecmp(key, "class")) { + const char *const optval = json_object_get_string(val); + if(!strcasecmp(optval, "digit")) { + stringAddPermittedCharArr(data, "0123456789"); + } else if(!strcasecmp(optval, "hexdigit")) { + stringAddPermittedCharArr(data, "0123456789aAbBcCdDeEfF"); + } else if(!strcasecmp(optval, "alpha")) { + stringAddPermittedFromTo(data, 'a', 'z'); + stringAddPermittedFromTo(data, 'A', 'Z'); + } else if(!strcasecmp(optval, "alnum")) { + stringAddPermittedCharArr(data, "0123456789"); + stringAddPermittedFromTo(data, 'a', 'z'); + stringAddPermittedFromTo(data, 'A', 'Z'); + } else { + ln_errprintf(ctx, 0, "invalid character class '%s'", + optval); + } + } + json_object_iter_next(&it); + } + } +} +/** + * generic string parser + */ +PARSER_Parse(String) + assert(npb->str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + struct data_String *const data = (struct data_String*) pdata; + size_t i = *offs; + int bHaveQuotes = 0; + int bHadEndQuote = 0; + int bHadEscape = 0; + + if(i == npb->strLen) goto done; + + if((data->quoteMode == ST_QUOTE_AUTO) && (npb->str[i] == data->qchar_begin)) { + bHaveQuotes = 1; + ++i; + } else if(data->quoteMode == ST_QUOTE_REQD) { + if(npb->str[i] == data->qchar_begin) { + bHaveQuotes = 1; + ++i; + } else { + goto done; + } + } + + /* scan string */ + while(i < npb->strLen) { + if(bHaveQuotes) { + if(npb->str[i] == data->qchar_end) { + if(data->flags.esc_md == ST_ESC_DOUBLE + || data->flags.esc_md == ST_ESC_BOTH) { + /* may be escaped, need to check! 
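+					 * With "double" (or "both") escaping, an end
+					 * quote immediately followed by another end
+					 * quote is literal data, not a terminator.
+					 * Illustrative example with the default '"'
+					 * quote characters and quote stripping:
+					 *   input:  "he said ""hi"""
+					 *   value:  he said "hi"
+					 * Only a lone end quote ends the string; the
+					 * doubled quotes are collapsed later in the
+					 * post-processing step below.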
*/ + if(i+1 < npb->strLen + && npb->str[i+1] == data->qchar_end) { + bHadEscape = 1; + ++i; + } else { /* not escaped -> terminal */ + bHadEndQuote = 1; + break; + } + } else { + bHadEndQuote = 1; + break; + } + } + } + + if( npb->str[i] == '\\' + && i+1 < npb->strLen + && (data->flags.esc_md == ST_ESC_BACKSLASH + || data->flags.esc_md == ST_ESC_BOTH) ) { + bHadEscape = 1; + i++; /* skip esc char */ + } + + /* terminating conditions */ + if(!bHaveQuotes && npb->str[i] == ' ') + break; + if(!stringIsPermittedChar(data, npb->str[i])) + break; + i++; + } + + if(bHaveQuotes && !bHadEndQuote) + goto done; + + if(i == *offs) + goto done; + + const size_t trmChkIdx = (bHaveQuotes) ? i+1 : i; + if(npb->str[trmChkIdx] != ' ' && trmChkIdx != npb->strLen) + goto done; + + /* success, persist */ + *parsed = i - *offs; + if(bHadEndQuote) + ++(*parsed); /* skip quote */ + if(value != NULL) { + size_t strt; + size_t len; + if(bHaveQuotes && data->flags.strip_quotes) { + strt = *offs + 1; + len = *parsed - 2; /* del begin AND end quote! */ + } else { + strt = *offs; + len = *parsed; + } + char *const cstr = strndup(npb->str+strt, len); + CHKN(cstr); + if(bHadEscape) { + /* need to post-process string... */ + for(size_t j = 0 ; cstr[j] != '\0' ; j++) { + if( ( + cstr[j] == data->qchar_end + && cstr[j+1] == data->qchar_end + && (data->flags.esc_md == ST_ESC_DOUBLE + || data->flags.esc_md == ST_ESC_BOTH) + ) + || + ( + cstr[j] == '\\' + && (data->flags.esc_md == ST_ESC_BACKSLASH + || data->flags.esc_md == ST_ESC_BOTH) + ) ) { + /* we need to remove the escape character */ + memmove(cstr+j, cstr+j+1, len-j); + } + } + } + *value = json_object_new_string(cstr); + free(cstr); + } + r = 0; /* success */ +done: + return r; +} + +PARSER_Construct(String) +{ + int r = 0; + struct data_String *const data = (struct data_String*) calloc(1, sizeof(struct data_String)); + data->quoteMode = ST_QUOTE_AUTO; + data->flags.strip_quotes = 1; + data->flags.esc_md = ST_ESC_BOTH; + data->qchar_begin = '"'; + data->qchar_end = '"'; + memset(data->perm_chars, 0xff, sizeof(data->perm_chars)); + + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + const char *key = json_object_iter_peek_name(&it); + struct json_object *const val = json_object_iter_peek_value(&it); + if(!strcasecmp(key, "quoting.mode")) { + const char *const optval = json_object_get_string(val); + if(!strcasecmp(optval, "auto")) { + data->quoteMode = ST_QUOTE_AUTO; + } else if(!strcasecmp(optval, "none")) { + data->quoteMode = ST_QUOTE_NONE; + } else if(!strcasecmp(optval, "required")) { + data->quoteMode = ST_QUOTE_REQD; + } else { + ln_errprintf(ctx, 0, "invalid quoting.mode for string parser: %s", + optval); + r = LN_BADCONFIG; + goto done; + } + } else if(!strcasecmp(key, "quoting.escape.mode")) { + const char *const optval = json_object_get_string(val); + if(!strcasecmp(optval, "none")) { + data->flags.esc_md = ST_ESC_NONE; + } else if(!strcasecmp(optval, "backslash")) { + data->flags.esc_md = ST_ESC_BACKSLASH; + } else if(!strcasecmp(optval, "double")) { + data->flags.esc_md = ST_ESC_DOUBLE; + } else if(!strcasecmp(optval, "both")) { + data->flags.esc_md = ST_ESC_BOTH; + } else { + ln_errprintf(ctx, 0, "invalid quoting.escape.mode for string " + "parser: %s", optval); + r = LN_BADCONFIG; + goto done; + } + } else if(!strcasecmp(key, "quoting.char.begin")) { + const char *const optval = json_object_get_string(val); + if(strlen(optval) 
!= 1) { + ln_errprintf(ctx, 0, "quoting.char.begin must " + "be exactly one character but is: '%s'", optval); + r = LN_BADCONFIG; + goto done; + } + data->qchar_begin = *optval; + } else if(!strcasecmp(key, "quoting.char.end")) { + const char *const optval = json_object_get_string(val); + if(strlen(optval) != 1) { + ln_errprintf(ctx, 0, "quoting.char.end must " + "be exactly one character but is: '%s'", optval); + r = LN_BADCONFIG; + goto done; + } + data->qchar_end = *optval; + } else if(!strcasecmp(key, "matching.permitted")) { + memset(data->perm_chars, 0x00, sizeof(data->perm_chars)); + if(json_object_is_type(val, json_type_string)) { + stringAddPermittedChars(data, val); + } else if(json_object_is_type(val, json_type_array)) { + stringAddPermittedCharsViaArray(ctx, data, val); + } else { + ln_errprintf(ctx, 0, "matching.permitted is invalid " + "object type, given as '%s", + json_object_to_json_string(val)); + } + } else { + ln_errprintf(ctx, 0, "invalid param for hexnumber: %s", + json_object_to_json_string(val)); + } + json_object_iter_next(&it); + } + + if(data->quoteMode == ST_QUOTE_NONE) + data->flags.esc_md = ST_ESC_NONE; + *pdata = data; +done: + if(r != 0) { + free(data); + } + return r; +} +PARSER_Destruct(String) +{ + free(pdata); +} diff --git a/src/parser.h b/src/parser.h new file mode 100644 index 0000000..38be62d --- /dev/null +++ b/src/parser.h @@ -0,0 +1,94 @@ +/* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2015 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#ifndef LIBLOGNORM_PARSER_H_INCLUDED +#define LIBLOGNORM_PARSER_H_INCLUDED +#include "pdag.h" + +/** + * Parser interface + * @param[in] str the to-be-parsed string + * @param[in] strLen length of the to-be-parsed string + * @param[in] offs an offset into the string + * @param[out] parsed bytes + * @param[out] ptr to json object containing parsed data (can be unused) + * if NULL on input, object is NOT persisted + * @return 0 on success, something else otherwise + */ +// TODO #warning check how to handle "value" - does it need to be set to NULL? 
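+/* As a sketch of how the macros below expand (shown for the "number"
+ * parser; every declared parser follows the same pattern):
+ *
+ *   PARSERDEF(Number);
+ *
+ * declares the three entry points of a parser with instance data:
+ *
+ *   int  ln_constructNumber(ln_ctx ctx, json_object *const json, void **pdata);
+ *   int  ln_v2_parseNumber(npb_t *npb, size_t *offs, void *const,
+ *                          size_t *parsed, struct json_object **value);
+ *   void ln_destructNumber(ln_ctx ctx, void *const pdata);
+ *
+ * while PARSERDEF_NO_DATA() declares only the parse function, for parsers
+ * that need no construction/destruction.
+ */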
+ +#define PARSERDEF_NO_DATA(parser) \ + int ln_v2_parse##parser(npb_t *npb, size_t *offs, void *const, size_t *parsed, struct json_object **value) + +#define PARSERDEF(parser) \ + int ln_construct##parser(ln_ctx ctx, json_object *const json, void **pdata); \ + int ln_v2_parse##parser(npb_t *npb, size_t *offs, void *const, size_t *parsed, struct json_object **value); \ + void ln_destruct##parser(ln_ctx ctx, void *const pdata) + +PARSERDEF(RFC5424Date); +PARSERDEF(RFC3164Date); +PARSERDEF(Number); +PARSERDEF(Float); +PARSERDEF(HexNumber); +PARSERDEF_NO_DATA(KernelTimestamp); +PARSERDEF_NO_DATA(Whitespace); +PARSERDEF_NO_DATA(Word); +PARSERDEF(StringTo); +PARSERDEF_NO_DATA(Alpha); +PARSERDEF(Literal); +PARSERDEF(CharTo); +PARSERDEF(CharSeparated); +PARSERDEF(Repeat); +PARSERDEF(String); +PARSERDEF_NO_DATA(Rest); +PARSERDEF_NO_DATA(OpQuotedString); +PARSERDEF_NO_DATA(QuotedString); +PARSERDEF_NO_DATA(ISODate); +PARSERDEF_NO_DATA(Time12hr); +PARSERDEF_NO_DATA(Time24hr); +PARSERDEF_NO_DATA(Duration); +PARSERDEF_NO_DATA(IPv4); +PARSERDEF_NO_DATA(IPv6); +PARSERDEF_NO_DATA(JSON); +PARSERDEF_NO_DATA(CEESyslog); +PARSERDEF_NO_DATA(v2IPTables); +PARSERDEF_NO_DATA(CiscoInterfaceSpec); +PARSERDEF_NO_DATA(MAC48); +PARSERDEF_NO_DATA(CEF); +PARSERDEF_NO_DATA(CheckpointLEA); +PARSERDEF_NO_DATA(NameValue); + +#undef PARSERDEF_NO_DATA + +/* utility functions */ +int ln_combineData_Literal(void *const org, void *const add); + +/* definitions for friends */ +struct data_Repeat { + ln_pdag *parser; + ln_pdag *while_cond; + int permitMismatchInParser; +}; + +#endif /* #ifndef LIBLOGNORM_PARSER_H_INCLUDED */ diff --git a/src/pdag.c b/src/pdag.c new file mode 100644 index 0000000..f8ab1ec --- /dev/null +++ b/src/pdag.c @@ -0,0 +1,1718 @@ +/** + * @file pdag.c + * @brief Implementation of the parse dag object. + * @class ln_pdag pdag.h + *//* + * Copyright 2015 by Rainer Gerhards and Adiscon GmbH. + * + * Released under ASL 2.0. + */ +#include "config.h" +#include +#include +#include +#include +#include +#include +#include + +#include "liblognorm.h" +#include "v1_liblognorm.h" +#include "v1_ptree.h" +#include "lognorm.h" +#include "samp.h" +#include "pdag.h" +#include "annot.h" +#include "internal.h" +#include "parser.h" +#include "helpers.h" + +void ln_displayPDAGComponentAlternative(struct ln_pdag *dag, int level); +void ln_displayPDAGComponent(struct ln_pdag *dag, int level); + +#ifdef ADVANCED_STATS +uint64_t advstats_parsers_called = 0; +uint64_t advstats_parsers_success = 0; +int advstats_max_pathlen = 0; +int advstats_pathlens[ADVSTATS_MAX_ENTITIES]; +int advstats_max_backtracked = 0; +int advstats_backtracks[ADVSTATS_MAX_ENTITIES]; +int advstats_max_parser_calls = 0; +int advstats_parser_calls[ADVSTATS_MAX_ENTITIES]; +int advstats_max_lit_parser_calls = 0; +int advstats_lit_parser_calls[ADVSTATS_MAX_ENTITIES]; +#endif + +/* parser lookup table + * This is a memory- and cache-optimized way of calling parsers. + * VERY IMPORTANT: the initialization must be done EXACTLY in the + * order of parser IDs (also see comment in pdag.h). + * + * Rough guideline for assigning priorites: + * 0 is highest, 255 lowest. 255 should be reserved for things that + * *really* should only be run as last resort --> rest. Also keep in + * mind that the user-assigned priority is put in the upper 24 bits, so + * parser-specific priorities only count when the user has assigned + * no priorities (which is expected to be common) or user-assigned + * priorities are equal for some parsers. 
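+ * As a worked example with values from this file: ln_newParser() combines
+ * the two as ((assignedPrio << 8) & 0xffffff00) | (parserPrio & 0xff), so
+ * with the default user priority of 30000 (DFLT_USR_PARSER_PRIO) a literal
+ * (prio 4) yields 7680004 and "rest" (prio 255) yields 7680255. Parsers are
+ * sorted ascending on this value, so the literal is still tried before
+ * "rest" even when the user assigns no explicit priority.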
+ */ +#ifdef ADVANCED_STATS +#define PARSER_ENTRY_NO_DATA(identifier, parser, prio) \ +{ identifier, prio, NULL, ln_v2_parse##parser, NULL, 0, 0 } +#define PARSER_ENTRY(identifier, parser, prio) \ +{ identifier, prio, ln_construct##parser, ln_v2_parse##parser, ln_destruct##parser, 0, 0 } +#else +#define PARSER_ENTRY_NO_DATA(identifier, parser, prio) \ +{ identifier, prio, NULL, ln_v2_parse##parser, NULL } +#define PARSER_ENTRY(identifier, parser, prio) \ +{ identifier, prio, ln_construct##parser, ln_v2_parse##parser, ln_destruct##parser } +#endif +static struct ln_parser_info parser_lookup_table[] = { + PARSER_ENTRY("literal", Literal, 4), + PARSER_ENTRY("repeat", Repeat, 4), + PARSER_ENTRY("date-rfc3164", RFC3164Date, 8), + PARSER_ENTRY("date-rfc5424", RFC5424Date, 8), + PARSER_ENTRY("number", Number, 16), + PARSER_ENTRY("float", Float, 16), + PARSER_ENTRY("hexnumber", HexNumber, 16), + PARSER_ENTRY_NO_DATA("kernel-timestamp", KernelTimestamp, 16), + PARSER_ENTRY_NO_DATA("whitespace", Whitespace, 4), + PARSER_ENTRY_NO_DATA("ipv4", IPv4, 4), + PARSER_ENTRY_NO_DATA("ipv6", IPv6, 4), + PARSER_ENTRY_NO_DATA("word", Word, 32), + PARSER_ENTRY_NO_DATA("alpha", Alpha, 32), + PARSER_ENTRY_NO_DATA("rest", Rest, 255), + PARSER_ENTRY_NO_DATA("op-quoted-string", OpQuotedString, 64), + PARSER_ENTRY_NO_DATA("quoted-string", QuotedString, 64), + PARSER_ENTRY_NO_DATA("date-iso", ISODate, 8), + PARSER_ENTRY_NO_DATA("time-24hr", Time24hr, 8), + PARSER_ENTRY_NO_DATA("time-12hr", Time12hr, 8), + PARSER_ENTRY_NO_DATA("duration", Duration, 16), + PARSER_ENTRY_NO_DATA("cisco-interface-spec", CiscoInterfaceSpec, 4), + PARSER_ENTRY_NO_DATA("name-value-list", NameValue, 8), + PARSER_ENTRY_NO_DATA("json", JSON, 4), + PARSER_ENTRY_NO_DATA("cee-syslog", CEESyslog, 4), + PARSER_ENTRY_NO_DATA("mac48", MAC48, 16), + PARSER_ENTRY_NO_DATA("cef", CEF, 4), + PARSER_ENTRY_NO_DATA("checkpoint-lea", CheckpointLEA, 4), + PARSER_ENTRY_NO_DATA("v2-iptables", v2IPTables, 4), + PARSER_ENTRY("string-to", StringTo, 32), + PARSER_ENTRY("char-to", CharTo, 32), + PARSER_ENTRY("char-sep", CharSeparated, 32), + PARSER_ENTRY("string", String, 32) +}; +#define NPARSERS (sizeof(parser_lookup_table)/sizeof(struct ln_parser_info)) +#define DFLT_USR_PARSER_PRIO 30000 /**< default priority if user has not specified it */ +static inline const char * +parserName(const prsid_t id) +{ + const char *name; + if(id == PRS_CUSTOM_TYPE) + name = "USER-DEFINED"; + else + name = parser_lookup_table[id].name; + return name; +} + +prsid_t +ln_parserName2ID(const char *const __restrict__ name) +{ + unsigned i; + + for( i = 0 + ; i < sizeof(parser_lookup_table) / sizeof(struct ln_parser_info) + ; ++i) { + if(!strcmp(parser_lookup_table[i].name, name)) { + return i; + } + } + return PRS_INVALID; +} + +/* find type pdag in table. If "bAdd" is set, add it if not + * already present, a new entry will be added. 
+ * Returns NULL on error, ptr to type pdag entry otherwise + */ +struct ln_type_pdag * +ln_pdagFindType(ln_ctx ctx, const char *const __restrict__ name, const int bAdd) +{ + struct ln_type_pdag *td = NULL; + int i; + + LN_DBGPRINTF(ctx, "ln_pdagFindType, name '%s', bAdd: %d, nTypes %d", + name, bAdd, ctx->nTypes); + for(i = 0 ; i < ctx->nTypes ; ++i) { + if(!strcmp(ctx->type_pdags[i].name, name)) { + td = ctx->type_pdags + i; + goto done; + } + } + + if(!bAdd) { + LN_DBGPRINTF(ctx, "custom type '%s' not found", name); + goto done; + } + + /* type does not yet exist -- create entry */ + LN_DBGPRINTF(ctx, "custom type '%s' does not yet exist, adding...", name); + struct ln_type_pdag *newarr; + newarr = realloc(ctx->type_pdags, sizeof(struct ln_type_pdag) * (ctx->nTypes+1)); + if(newarr == NULL) { + LN_DBGPRINTF(ctx, "ln_pdagFindTypeAG: alloc newarr failed"); + goto done; + } + ctx->type_pdags = newarr; + td = ctx->type_pdags + ctx->nTypes; + ++ctx->nTypes; + td->name = strdup(name); + td->pdag = ln_newPDAG(ctx); +done: + return td; +} + +/* we clear some multiple times, but as long as we have no loops + * (dag!) we have no real issue. + */ +static void +ln_pdagComponentClearVisited(struct ln_pdag *const dag) +{ + dag->flags.visited = 0; + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *prs = dag->parsers+i; + ln_pdagComponentClearVisited(prs->node); + } +} +static void +ln_pdagClearVisited(ln_ctx ctx) +{ + for(int i = 0 ; i < ctx->nTypes ; ++i) + ln_pdagComponentClearVisited(ctx->type_pdags[i].pdag); + ln_pdagComponentClearVisited(ctx->pdag); +} + +/** + * Process a parser defintion. Note that a single defintion can potentially + * contain many parser instances. + * @return parser node ptr or NULL (on error) + */ +ln_parser_t* +ln_newParser(ln_ctx ctx, + json_object *prscnf) +{ + ln_parser_t *node = NULL; + json_object *json; + const char *val; + prsid_t prsid; + struct ln_type_pdag *custType = NULL; + const char *name = NULL; + const char *textconf = json_object_to_json_string(prscnf); + int assignedPrio = DFLT_USR_PARSER_PRIO; + int parserPrio; + + json_object_object_get_ex(prscnf, "type", &json); + if(json == NULL) { + ln_errprintf(ctx, 0, "parser type missing in config: %s", + json_object_to_json_string(prscnf)); + goto done; + } + val = json_object_get_string(json); + if(*val == '@') { + prsid = PRS_CUSTOM_TYPE; + custType = ln_pdagFindType(ctx, val, 0); + parserPrio = 16; /* hopefully relatively specific... */ + if(custType == NULL) { + ln_errprintf(ctx, 0, "unknown user-defined type '%s'", val); + goto done; + } + } else { + prsid = ln_parserName2ID(val); + if(prsid == PRS_INVALID) { + ln_errprintf(ctx, 0, "invalid field type '%s'", val); + goto done; + } + parserPrio = parser_lookup_table[prsid].prio; + } + + json_object_object_get_ex(prscnf, "name", &json); + if(json == NULL || !strcmp(json_object_get_string(json), "-")) { + name = NULL; + } else { + name = strdup(json_object_get_string(json)); + } + + json_object_object_get_ex(prscnf, "priority", &json); + if(json != NULL) { + assignedPrio = json_object_get_int(json); + } + LN_DBGPRINTF(ctx, "assigned priority is %d", assignedPrio); + + /* we need to remove already processed items from the config, so + * that we can pass the remaining parameters to the parser. 
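+	 * For instance, given a (made-up) field definition like
+	 *   {"type":"string", "name":"msg", "priority":100, "quoting.mode":"none"}
+	 * the keys "type", "priority" and "name" are deleted here, so the
+	 * string parser's constructor receives only {"quoting.mode":"none"}.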
+ */ + json_object_object_del(prscnf, "type"); + json_object_object_del(prscnf, "priority"); + if(name != NULL) + json_object_object_del(prscnf, "name"); + + /* got all data items */ + if((node = calloc(1, sizeof(ln_parser_t))) == NULL) { + LN_DBGPRINTF(ctx, "lnNewParser: alloc node failed"); + free((void*)name); + goto done; + } + + node->node = NULL; + node->prio = ((assignedPrio << 8) & 0xffffff00) | (parserPrio & 0xff); + node->name = name; + node->prsid = prsid; + node->conf = strdup(textconf); + if(prsid == PRS_CUSTOM_TYPE) { + node->custType = custType; + } else { + if(parser_lookup_table[prsid].construct != NULL) { + parser_lookup_table[prsid].construct(ctx, prscnf, &node->parser_data); + } + } +done: + return node; +} + + +struct ln_pdag* +ln_newPDAG(ln_ctx ctx) +{ + struct ln_pdag *dag; + + if((dag = calloc(1, sizeof(struct ln_pdag))) == NULL) + goto done; + + dag->refcnt = 1; + dag->ctx = ctx; + ctx->nNodes++; +done: return dag; +} + +/* note: we must NOT free the parser itself, because + * it is stored inside a parser table (so no single + * alloc for the parser!). + */ +static void +pdagDeletePrs(ln_ctx ctx, ln_parser_t *const __restrict__ prs) +{ + // TODO: be careful here: once we move to real DAG from tree, we + // cannot simply delete the next node! (refcount? something else?) + if(prs->node != NULL) + ln_pdagDelete(prs->node); + free((void*)prs->name); + free((void*)prs->conf); + if(prs->parser_data != NULL) + parser_lookup_table[prs->prsid].destruct(ctx, prs->parser_data); +} + +void +ln_pdagDelete(struct ln_pdag *const __restrict__ pdag) +{ + if(pdag == NULL) + goto done; + + LN_DBGPRINTF(pdag->ctx, "delete %p[%d]: %s", pdag, pdag->refcnt, pdag->rb_id); + --pdag->refcnt; + if(pdag->refcnt > 0) + goto done; + + if(pdag->tags != NULL) + json_object_put(pdag->tags); + + for(int i = 0 ; i < pdag->nparsers ; ++i) { + pdagDeletePrs(pdag->ctx, pdag->parsers+i); + } + free(pdag->parsers); + free((void*)pdag->rb_id); + free((void*)pdag->rb_file); + free(pdag); +done: return; +} + + +/** + * pdag optimizer step: literal path compaction + * + * We compress as much as possible and evalute the path down to + * the first non-compressable element. Note that we must NOT + * compact those literals that are either terminal nodes OR + * contain names so that the literal is to be parsed out. + */ +static inline int +optLitPathCompact(ln_ctx ctx, ln_parser_t *prs) +{ + int r = 0; + + while(prs != NULL) { + /* note the NOT prefix in the condition below! */ + if(!( prs->prsid == PRS_LITERAL + && prs->name == NULL + && prs->node->flags.isTerminal == 0 + && prs->node->refcnt == 1 + && prs->node->nparsers == 1 + /* we need to do some checks on the child as well */ + && prs->node->parsers[0].prsid == PRS_LITERAL + && prs->node->parsers[0].name == NULL + && prs->node->parsers[0].node->refcnt == 1) + ) + goto done; + + /* ok, we have two compactable literals in a row, let's compact the nodes */ + ln_parser_t *child_prs = prs->node->parsers; + LN_DBGPRINTF(ctx, "opt path compact: add %p to %p", child_prs, prs); + CHKR(ln_combineData_Literal(prs->parser_data, child_prs->parser_data)); + ln_pdag *const node_del = prs->node; + prs->node = child_prs->node; + child_prs->node = NULL; /* remove, else this would be destructed! 
*/ + ln_pdagDelete(node_del); + } +done: + return r; +} + + +static int +qsort_parserCmp(const void *v1, const void *v2) +{ + const ln_parser_t *const p1 = (const ln_parser_t *const) v1; + const ln_parser_t *const p2 = (const ln_parser_t *const) v2; + return p1->prio - p2->prio; +} + +static int +ln_pdagComponentOptimize(ln_ctx ctx, struct ln_pdag *const dag) +{ + int r = 0; + +for(int i = 0 ; i < dag->nparsers ; ++i) { /* TODO: remove when confident enough */ + ln_parser_t *prs = dag->parsers+i; + LN_DBGPRINTF(ctx, "pre sort, parser %d:%s[%d]", i, prs->name, prs->prio); +} + /* first sort parsers in priority order */ + if(dag->nparsers > 1) { + qsort(dag->parsers, dag->nparsers, sizeof(ln_parser_t), qsort_parserCmp); + } +for(int i = 0 ; i < dag->nparsers ; ++i) { /* TODO: remove when confident enough */ + ln_parser_t *prs = dag->parsers+i; + LN_DBGPRINTF(ctx, "post sort, parser %d:%s[%d]", i, prs->name, prs->prio); +} + + /* now on to rest of processing */ + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *prs = dag->parsers+i; + LN_DBGPRINTF(dag->ctx, "optimizing %p: field %d type '%s', name '%s': '%s':", + prs->node, i, parserName(prs->prsid), prs->name, + (prs->prsid == PRS_LITERAL) ? ln_DataForDisplayLiteral(dag->ctx, prs->parser_data) + : "UNKNOWN"); + + optLitPathCompact(ctx, prs); + + ln_pdagComponentOptimize(ctx, prs->node); + } + return r; +} + + +static void +deleteComponentID(struct ln_pdag *const __restrict__ dag) +{ + free((void*)dag->rb_id); + dag->rb_id = NULL; + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *prs = dag->parsers+i; + deleteComponentID(prs->node); + } +} +/* fixes rb_ids for this node as well as it predecessors. + * This is required if the ALTERNATIVE parser type is used, + * which will create component IDs for each of it's invocations. + * As such, we do not only fix the string, but know that all + * children also need fixning. We do this be simply deleting + * all of their rb_ids, as we know they will be visited again. + * Note: if we introduce the same situation by new functionality, + * we may need to review this code here as well. Also note + * that the component ID will not be 100% correct after our fix, + * because that ID could acutally be created by two sets of rules. + * But this is the best we can do. + */ +static void +fixComponentID(struct ln_pdag *const __restrict__ dag, const char *const new) +{ + char *updated; + const char *const curr = dag->rb_id; + int i; + int len = (int) strlen(curr); + for(i = 0 ; i < len ; ++i){ + if(curr[i] != new [i]) + break; + } + if(i >= 1 && curr[i-1] == '%') + --i; + if(asprintf(&updated, "%.*s[%s|%s]", i, curr, curr+i, new+i) == -1) + goto done; + deleteComponentID(dag); + dag->rb_id = updated; +done: return; +} +/** + * Assign human-readable identifiers (names) to each node. These are + * later used in stats, debug output and whereever else this may make + * sense. + */ +static void +ln_pdagComponentSetIDs(ln_ctx ctx, struct ln_pdag *const dag, const char *prefix) +{ + char *id = NULL; + + if(prefix == NULL) + goto done; + if(dag->rb_id == NULL) { + dag->rb_id = strdup(prefix); + } else { + LN_DBGPRINTF(ctx, "rb_id already exists - fixing as good as " + "possible. This happens with ALTERNATIVE parser. 
" + "old: '%s', new: '%s'", + dag->rb_id, prefix); + fixComponentID(dag, prefix); + LN_DBGPRINTF(ctx, "\"fixed\" rb_id: %s", dag->rb_id); + prefix = dag->rb_id; + } + /* now on to rest of processing */ + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *prs = dag->parsers+i; + if(prs->prsid == PRS_LITERAL) { + if(prs->name == NULL) { + if(asprintf(&id, "%s%s", prefix, + ln_DataForDisplayLiteral(dag->ctx, prs->parser_data)) == -1) + goto done; + } else { + if(asprintf(&id, "%s%%%s:%s:%s%%", prefix, + prs->name, + parserName(prs->prsid), + ln_DataForDisplayLiteral(dag->ctx, prs->parser_data)) == -1) + goto done; + } + } else { + if(asprintf(&id, "%s%%%s:%s%%", prefix, + prs->name ? prs->name : "-", + parserName(prs->prsid)) == -1) + goto done; + } + ln_pdagComponentSetIDs(ctx, prs->node, id); + free(id); + } +done: return; +} + +/** + * Optimize the pdag. + * This includes all components. + */ +int +ln_pdagOptimize(ln_ctx ctx) +{ + int r = 0; + + for(int i = 0 ; i < ctx->nTypes ; ++i) { + LN_DBGPRINTF(ctx, "optimizing component %s\n", ctx->type_pdags[i].name); + ln_pdagComponentOptimize(ctx, ctx->type_pdags[i].pdag); + ln_pdagComponentSetIDs(ctx, ctx->type_pdags[i].pdag, ""); + } + + LN_DBGPRINTF(ctx, "optimizing main pdag component"); + ln_pdagComponentOptimize(ctx, ctx->pdag); + LN_DBGPRINTF(ctx, "finished optimizing main pdag component"); + ln_pdagComponentSetIDs(ctx, ctx->pdag, ""); +LN_DBGPRINTF(ctx, "---AFTER OPTIMIZATION------------------"); +ln_displayPDAG(ctx); +LN_DBGPRINTF(ctx, "======================================="); + return r; +} + + +#define LN_INTERN_PDAG_STATS_NPARSERS 100 +/* data structure for pdag statistics */ +struct pdag_stats { + int nodes; + int term_nodes; + int parsers; + int max_nparsers; + int nparsers_cnt[LN_INTERN_PDAG_STATS_NPARSERS]; + int nparsers_100plus; + int *prs_cnt; +}; + +/** + * Recursive step of statistics gatherer. + */ +static int +ln_pdagStatsRec(ln_ctx ctx, struct ln_pdag *const dag, struct pdag_stats *const stats) +{ + if(dag->flags.visited) + return 0; + dag->flags.visited = 1; + stats->nodes++; + if(dag->flags.isTerminal) + stats->term_nodes++; + if(dag->nparsers > stats->max_nparsers) + stats->max_nparsers = dag->nparsers; + if(dag->nparsers >= LN_INTERN_PDAG_STATS_NPARSERS) + stats->nparsers_100plus++; + else + stats->nparsers_cnt[dag->nparsers]++; + stats->parsers += dag->nparsers; + int max_path = 0; + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *prs = dag->parsers+i; + if(prs->prsid != PRS_CUSTOM_TYPE) + stats->prs_cnt[prs->prsid]++; + const int path_len = ln_pdagStatsRec(ctx, prs->node, stats); + if(path_len > max_path) + max_path = path_len; + } + return max_path + 1; +} + + +static void +ln_pdagStatsExtended(ln_ctx ctx, struct ln_pdag *const dag, FILE *const fp, int level) +{ + char indent[2048]; + + if(level > 1023) + level = 1023; + memset(indent, ' ', level * 2); + indent[level * 2] = '\0'; + + if(dag->stats.called > 0) { + fprintf(fp, "%u, %u, %s\n", + dag->stats.called, + dag->stats.backtracked, + dag->rb_id); + } + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *const prs = dag->parsers+i; + if(prs->node->stats.called > 0) { + ln_pdagStatsExtended(ctx, prs->node, fp, level+1); + } + } +} + +/** + * Gather pdag statistics for a *specific* pdag. + * + * Data is sent to given file ptr. 
+ */ +static void +ln_pdagStats(ln_ctx ctx, struct ln_pdag *const dag, FILE *const fp, const int extendedStats) +{ + struct pdag_stats *const stats = calloc(1, sizeof(struct pdag_stats)); + stats->prs_cnt = calloc(NPARSERS, sizeof(int)); + //ln_pdagClearVisited(ctx); + const int longest_path = ln_pdagStatsRec(ctx, dag, stats); + + fprintf(fp, "nodes.............: %4d\n", stats->nodes); + fprintf(fp, "terminal nodes....: %4d\n", stats->term_nodes); + fprintf(fp, "parsers entries...: %4d\n", stats->parsers); + fprintf(fp, "longest path......: %4d\n", longest_path); + + fprintf(fp, "Parser Type Counts:\n"); + for(prsid_t i = 0 ; i < NPARSERS ; ++i) { + if(stats->prs_cnt[i] != 0) + fprintf(fp, "\t%20s: %d\n", parserName(i), stats->prs_cnt[i]); + } + + int pp = 0; + fprintf(fp, "Parsers per Node:\n"); + fprintf(fp, "\tmax:\t%4d\n", stats->max_nparsers); + for(int i = 0 ; i < 100 ; ++i) { + pp += stats->nparsers_cnt[i]; + if(stats->nparsers_cnt[i] != 0) + fprintf(fp, "\t%d:\t%4d\n", i, stats->nparsers_cnt[i]); + } + + free(stats->prs_cnt); + free(stats); + + if(extendedStats) { + fprintf(fp, "Usage Statistics:\n" + "-----------------\n"); + fprintf(fp, "called, backtracked, rule\n"); + ln_pdagComponentClearVisited(dag); + ln_pdagStatsExtended(ctx, dag, fp, 0); + } +} + + +/** + * Gather and output pdag statistics for the full pdag (ctx) + * including all disconnected components (type defs). + * + * Data is sent to given file ptr. + */ +void +ln_fullPdagStats(ln_ctx ctx, FILE *const fp, const int extendedStats) +{ + if(ctx->ptree != NULL) { + /* we need to handle the old cruft */ + ln_fullPTreeStats(ctx, fp, extendedStats); + return; + } + + fprintf(fp, "User-Defined Types\n" + "==================\n"); + fprintf(fp, "number types: %d\n", ctx->nTypes); + for(int i = 0 ; i < ctx->nTypes ; ++i) + fprintf(fp, "type: %s\n", ctx->type_pdags[i].name); + + for(int i = 0 ; i < ctx->nTypes ; ++i) { + fprintf(fp, "\n" + "type PDAG: %s\n" + "----------\n", ctx->type_pdags[i].name); + ln_pdagStats(ctx, ctx->type_pdags[i].pdag, fp, extendedStats); + } + + fprintf(fp, "\n" + "Main PDAG\n" + "=========\n"); + ln_pdagStats(ctx, ctx->pdag, fp, extendedStats); + +#ifdef ADVANCED_STATS + const uint64_t parsers_failed = advstats_parsers_called - advstats_parsers_success; + fprintf(fp, "\n" + "Advanced Runtime Stats\n" + "======================\n"); + fprintf(fp, "These are actual number from analyzing the control flow " + "at runtime.\n"); + fprintf(fp, "Note that literal matching is also done via parsers. 
As such, \n" + "it is expected that fail rates increase with the size of the \n" + "rule base.\n"); + fprintf(fp, "\n"); + fprintf(fp, "Parser Calls:\n"); + fprintf(fp, "total....: %10" PRIu64 "\n", advstats_parsers_called); + fprintf(fp, "succesful: %10" PRIu64 "\n", advstats_parsers_success); + fprintf(fp, "failed...: %10" PRIu64 " [%d%%]\n", + parsers_failed, + (int) ((parsers_failed * 100) / advstats_parsers_called) ); + fprintf(fp, "\nIndividual Parser Calls " + "(never called parsers are not shown):\n"); + for( size_t i = 0 + ; i < sizeof(parser_lookup_table) / sizeof(struct ln_parser_info) + ; ++i) { + if(parser_lookup_table[i].called > 0) { + const uint64_t failed = parser_lookup_table[i].called + - parser_lookup_table[i].success; + fprintf(fp, "%20s: %10" PRIu64 " [%5.2f%%] " + "success: %10" PRIu64 " [%5.1f%%] " + "fail: %10" PRIu64 " [%5.1f%%]" + "\n", + parser_lookup_table[i].name, + parser_lookup_table[i].called, + (float)(parser_lookup_table[i].called * 100) + / advstats_parsers_called, + parser_lookup_table[i].success, + (float)(parser_lookup_table[i].success * 100) + / parser_lookup_table[i].called, + failed, + (float)(failed * 100) + / parser_lookup_table[i].called + ); + } + } + + uint64_t total_len; + uint64_t total_cnt; + fprintf(fp, "\n"); + fprintf(fp, "\n" + "Path Length Statistics\n" + "----------------------\n" + "The regular path length is the number of nodes being visited,\n" + "where each node potentially evaluates several parsers. The\n" + "parser call statistic is the number of parsers called along\n" + "the path. That number is higher, as multiple parsers may be\n" + "called at each node. The number of literal parser calls is\n" + "given explicitely, as they use almost no time to process.\n" + "\n" + ); + total_len = 0; + total_cnt = 0; + fprintf(fp, "Path Length\n"); + for(int i = 0 ; i < ADVSTATS_MAX_ENTITIES ; ++i) { + if(advstats_pathlens[i] > 0 ) { + fprintf(fp, "%3d: %d\n", i, advstats_pathlens[i]); + total_len += i * advstats_pathlens[i]; + total_cnt += advstats_pathlens[i]; + } + } + fprintf(fp, "avg: %f\n", (double) total_len / (double) total_cnt); + fprintf(fp, "max: %d\n", advstats_max_pathlen); + fprintf(fp, "\n"); + + total_len = 0; + total_cnt = 0; + fprintf(fp, "Nbr Backtracked\n"); + for(int i = 0 ; i < ADVSTATS_MAX_ENTITIES ; ++i) { + if(advstats_backtracks[i] > 0 ) { + fprintf(fp, "%3d: %d\n", i, advstats_backtracks[i]); + total_len += i * advstats_backtracks[i]; + total_cnt += advstats_backtracks[i]; + } + } + fprintf(fp, "avg: %f\n", (double) total_len / (double) total_cnt); + fprintf(fp, "max: %d\n", advstats_max_backtracked); + fprintf(fp, "\n"); + + /* we calc some stats while we output */ + total_len = 0; + total_cnt = 0; + fprintf(fp, "Parser Calls\n"); + for(int i = 0 ; i < ADVSTATS_MAX_ENTITIES ; ++i) { + if(advstats_parser_calls[i] > 0 ) { + fprintf(fp, "%3d: %d\n", i, advstats_parser_calls[i]); + total_len += i * advstats_parser_calls[i]; + total_cnt += advstats_parser_calls[i]; + } + } + fprintf(fp, "avg: %f\n", (double) total_len / (double) total_cnt); + fprintf(fp, "max: %d\n", advstats_max_parser_calls); + fprintf(fp, "\n"); + + total_len = 0; + total_cnt = 0; + fprintf(fp, "LITERAL Parser Calls\n"); + for(int i = 0 ; i < ADVSTATS_MAX_ENTITIES ; ++i) { + if(advstats_lit_parser_calls[i] > 0 ) { + fprintf(fp, "%3d: %d\n", i, advstats_lit_parser_calls[i]); + total_len += i * advstats_lit_parser_calls[i]; + total_cnt += advstats_lit_parser_calls[i]; + } + } + fprintf(fp, "avg: %f\n", (double) total_len / (double) total_cnt); + 
fprintf(fp, "max: %d\n", advstats_max_lit_parser_calls); + fprintf(fp, "\n"); +#endif +} + +/** + * Check if the provided dag is a leaf. This means that it + * does not contain any subdags. + * @return 1 if it is a leaf, 0 otherwise + */ +static inline int +isLeaf(struct ln_pdag *dag) +{ + return dag->nparsers == 0 ? 1 : 0; +} + + +/** + * Add a parser instance to the pdag at the current position. + * + * @param[in] ctx + * @param[in] prscnf json parser config *object* (no array!) + * @param[in] pdag current pdag position (to which parser is to be added) + * @param[in/out] nextnode contains point to the next node, either + * an existing one or one newly created. + * + * The nextnode parameter permits to use this function to create + * multiple parsers alternative parsers with a single run. To do so, + * set nextnode=NULL on first call. On successive calls, keep the + * value. If a value is present, we will not accept non-identical + * parsers which point to different nodes - this will result in an + * error. + * + * IMPORTANT: the caller is responsible to update its pdag pointer + * to the nextnode value when he is done adding parsers. + * + * If a parser of the same type with identical data already exists, + * it is "resued", which means the function is effectively used to + * walk the path. This is used during parser construction to + * navigate to new parts of the pdag. + */ +static int +ln_pdagAddParserInstance(ln_ctx ctx, + json_object *const __restrict__ prscnf, + struct ln_pdag *const __restrict__ pdag, + struct ln_pdag **nextnode) +{ + int r; + ln_parser_t *newtab; + LN_DBGPRINTF(ctx, "ln_pdagAddParserInstance: %s, nextnode %p", + json_object_to_json_string(prscnf), *nextnode); + ln_parser_t *const parser = ln_newParser(ctx, prscnf); + CHKN(parser); + LN_DBGPRINTF(ctx, "pdag: %p, parser %p", pdag, parser); + /* check if we already have this parser, if so, merge + */ + int i; + for(i = 0 ; i < pdag->nparsers ; ++i) { + LN_DBGPRINTF(ctx, "parser comparison:\n%s\n%s", pdag->parsers[i].conf, parser->conf); + if( pdag->parsers[i].prsid == parser->prsid + && !strcmp(pdag->parsers[i].conf, parser->conf)) { + // FIXME: the current ->conf object is depending on + // the order of json elements. We should do a JSON + // comparison (a bit more complex). For now, it + // works like we do it now. + // FIXME: if nextnode is set, check we can actually combine, + // else err out + *nextnode = pdag->parsers[i].node; + r = 0; + LN_DBGPRINTF(ctx, "merging with pdag %p", pdag); + pdagDeletePrs(ctx, parser); /* no need for data items */ + goto done; + } + } + /* if we reach this point, we have a new parser type */ + if(*nextnode == NULL) { + CHKN(*nextnode = ln_newPDAG(ctx)); /* we need a new node */ + } else { + (*nextnode)->refcnt++; + } + parser->node = *nextnode; + newtab = realloc(pdag->parsers, (pdag->nparsers+1) * sizeof(ln_parser_t)); + CHKN(newtab); + pdag->parsers = newtab; + memcpy(pdag->parsers+pdag->nparsers, parser, sizeof(ln_parser_t)); + pdag->nparsers++; + + r = 0; + +done: + free(parser); + return r; +} + +static int ln_pdagAddParserInternal(ln_ctx ctx, struct ln_pdag **pdag, const int mode, json_object *const prscnf, +struct ln_pdag **nextnode); + +/** + * add parsers to current pdag. This is used + * to add parsers stored in an array. The mode specifies + * how parsers shall be added. 
+ */ +#define PRS_ADD_MODE_SEQ 0 +#define PRS_ADD_MODE_ALTERNATIVE 1 +static int +ln_pdagAddParsers(ln_ctx ctx, + json_object *const prscnf, + const int mode, + struct ln_pdag **pdag, + struct ln_pdag **p_nextnode) +{ + int r = LN_BADCONFIG; + struct ln_pdag *dag = *pdag; + struct ln_pdag *nextnode = *p_nextnode; + + const int lenarr = json_object_array_length(prscnf); + for(int i = 0 ; i < lenarr ; ++i) { + struct json_object *const curr_prscnf = + json_object_array_get_idx(prscnf, i); + LN_DBGPRINTF(ctx, "parser %d: %s", i, json_object_to_json_string(curr_prscnf)); + if(json_object_get_type(curr_prscnf) == json_type_array) { + struct ln_pdag *local_dag = dag; + CHKR(ln_pdagAddParserInternal(ctx, &local_dag, mode, + curr_prscnf, &nextnode)); + if(mode == PRS_ADD_MODE_SEQ) { + dag = local_dag; + } + } else { + CHKR(ln_pdagAddParserInstance(ctx, curr_prscnf, dag, &nextnode)); + } + if(mode == PRS_ADD_MODE_SEQ) { + dag = nextnode; + *p_nextnode = nextnode; + nextnode = NULL; + } + } + + if(mode != PRS_ADD_MODE_SEQ) + dag = nextnode; + *pdag = dag; + r = 0; +done: + return r; +} + +/* add a json parser config object. Note that this object may contain + * multiple parser instances. Additionally, moves the pdag object to + * the next node, which is either newly created or previously existed. + */ +static int +ln_pdagAddParserInternal(ln_ctx ctx, struct ln_pdag **pdag, + const int mode, json_object *const prscnf, struct ln_pdag **nextnode) +{ + int r = LN_BADCONFIG; + struct ln_pdag *dag = *pdag; + + LN_DBGPRINTF(ctx, "ln_pdagAddParserInternal: %s", json_object_to_json_string(prscnf)); + if(json_object_get_type(prscnf) == json_type_object) { + /* check for special types we need to handle here */ + struct json_object *json; + json_object_object_get_ex(prscnf, "type", &json); + const char *const ftype = json_object_get_string(json); + if(!strcmp(ftype, "alternative")) { + json_object_object_get_ex(prscnf, "parser", &json); + if(json_object_get_type(json) != json_type_array) { + ln_errprintf(ctx, 0, "alternative type needs array of parsers. " + "Object: '%s', type is %s", + json_object_to_json_string(prscnf), + json_type_to_name(json_object_get_type(json))); + goto done; + } + CHKR(ln_pdagAddParsers(ctx, json, PRS_ADD_MODE_ALTERNATIVE, &dag, nextnode)); + } else { + CHKR(ln_pdagAddParserInstance(ctx, prscnf, dag, nextnode)); + if(mode == PRS_ADD_MODE_SEQ) + dag = *nextnode; + } + } else if(json_object_get_type(prscnf) == json_type_array) { + CHKR(ln_pdagAddParsers(ctx, prscnf, PRS_ADD_MODE_SEQ, &dag, nextnode)); + } else { + ln_errprintf(ctx, 0, "bug: prscnf object of wrong type. Object: '%s'", + json_object_to_json_string(prscnf)); + goto done; + } + *pdag = dag; + +done: + return r; +} + +/* add a json parser config object. Note that this object may contain + * multiple parser instances. Additionally, moves the pdag object to + * the next node, which is either newly created or previously existed. + */ +int +ln_pdagAddParser(ln_ctx ctx, struct ln_pdag **pdag, json_object *const prscnf) +{ + struct ln_pdag *nextnode = NULL; + int r = ln_pdagAddParserInternal(ctx, pdag, PRS_ADD_MODE_SEQ, prscnf, &nextnode); + json_object_put(prscnf); + return r; +} + + +void +ln_displayPDAGComponent(struct ln_pdag *dag, int level) +{ + char indent[2048]; + + if(level > 1023) + level = 1023; + memset(indent, ' ', level * 2); + indent[level * 2] = '\0'; + + LN_DBGPRINTF(dag->ctx, "%ssubDAG%s %p (children: %d parsers, ref %d) [called %u, backtracked %u]", + indent, dag->flags.isTerminal ? 
" [TERM]" : "", dag, dag->nparsers, dag->refcnt, + dag->stats.called, dag->stats.backtracked); + +for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *const prs = dag->parsers+i; + LN_DBGPRINTF(dag->ctx, "%sfield type '%s', name '%s': '%s': called %u", indent, + parserName(prs->prsid), + dag->parsers[i].name, + (prs->prsid == PRS_LITERAL) ? ln_DataForDisplayLiteral(dag->ctx, prs->parser_data) : "UNKNOWN", + dag->parsers[i].node->stats.called); +} + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *const prs = dag->parsers+i; + LN_DBGPRINTF(dag->ctx, "%sfield type '%s', name '%s': '%s':", indent, + parserName(prs->prsid), + dag->parsers[i].name, + (prs->prsid == PRS_LITERAL) ? ln_DataForDisplayLiteral(dag->ctx, prs->parser_data) : + "UNKNOWN"); + if(prs->prsid == PRS_REPEAT) { + struct data_Repeat *const data = (struct data_Repeat*) prs->parser_data; + LN_DBGPRINTF(dag->ctx, "%sparser:", indent); + ln_displayPDAGComponent(data->parser, level + 1); + LN_DBGPRINTF(dag->ctx, "%swhile:", indent); + ln_displayPDAGComponent(data->while_cond, level + 1); + LN_DBGPRINTF(dag->ctx, "%send repeat def", indent); + } + ln_displayPDAGComponent(dag->parsers[i].node, level + 1); + } +} + + +void ln_displayPDAGComponentAlternative(struct ln_pdag *dag, int level) +{ + char indent[2048]; + + if(level > 1023) + level = 1023; + memset(indent, ' ', level * 2); + indent[level * 2] = '\0'; + + LN_DBGPRINTF(dag->ctx, "%s%p[ref %d]: %s", indent, dag, dag->refcnt, dag->rb_id); + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_displayPDAGComponentAlternative(dag->parsers[i].node, level + 1); + } +} + + +/* developer debug aid, to be used for example as follows: + * LN_DBGPRINTF(dag->ctx, "---------------------------------------"); + * ln_displayPDAG(dag); + * LN_DBGPRINTF(dag->ctx, "======================================="); + */ +void +ln_displayPDAG(ln_ctx ctx) +{ + ln_pdagClearVisited(ctx); + for(int i = 0 ; i < ctx->nTypes ; ++i) { + LN_DBGPRINTF(ctx, "COMPONENT: %s", ctx->type_pdags[i].name); + ln_displayPDAGComponent(ctx->type_pdags[i].pdag, 0); + } + + LN_DBGPRINTF(ctx, "MAIN COMPONENT:"); + ln_displayPDAGComponent(ctx->pdag, 0); + + LN_DBGPRINTF(ctx, "MAIN COMPONENT (alternative):"); + ln_displayPDAGComponentAlternative(ctx->pdag, 0); +} + + +/* the following is a quick hack, which should be moved to the + * string class. + */ +static inline void dotAddPtr(es_str_t **str, void *p) +{ + char buf[64]; + int i; + i = snprintf(buf, sizeof(buf), "l%p", p); + es_addBuf(str, buf, i); +} +struct data_Literal { const char *lit; }; // TODO remove when this hack is no longe needed +/** + * recursive handler for DOT graph generator. 
+ */ +static void +ln_genDotPDAGGraphRec(struct ln_pdag *dag, es_str_t **str) +{ + char s_refcnt[16]; + LN_DBGPRINTF(dag->ctx, "in dot: %p, visited %d", dag, (int) dag->flags.visited); + if(dag->flags.visited) + return; /* already processed this subpart */ + dag->flags.visited = 1; + dotAddPtr(str, dag); + snprintf(s_refcnt, sizeof(s_refcnt), "%d", dag->refcnt); + s_refcnt[sizeof(s_refcnt)-1] = '\0'; + es_addBufConstcstr(str, " [ label=\""); + es_addBuf(str, s_refcnt, strlen(s_refcnt)); + es_addBufConstcstr(str, "\""); + + if(isLeaf(dag)) { + es_addBufConstcstr(str, " style=\"bold\""); + } + es_addBufConstcstr(str, "]\n"); + + /* display field subdags */ + + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *const prs = dag->parsers+i; + dotAddPtr(str, dag); + es_addBufConstcstr(str, " -> "); + dotAddPtr(str, prs->node); + es_addBufConstcstr(str, " [label=\""); + es_addBuf(str, parserName(prs->prsid), strlen(parserName(prs->prsid))); + es_addBufConstcstr(str, ":"); + //es_addStr(str, node->name); + if(prs->prsid == PRS_LITERAL) { + for(const char *p = ((struct data_Literal*)prs->parser_data)->lit ; *p ; ++p) { + // TODO: handle! if(*p == '\\') + //es_addChar(str, '\\'); + if(*p != '\\' && *p != '"') + es_addChar(str, *p); + } + } + es_addBufConstcstr(str, "\""); + es_addBufConstcstr(str, " style=\"dotted\"]\n"); + ln_genDotPDAGGraphRec(prs->node, str); + } +} + + +void +ln_genDotPDAGGraph(struct ln_pdag *dag, es_str_t **str) +{ + ln_pdagClearVisited(dag->ctx); + es_addBufConstcstr(str, "digraph pdag {\n"); + ln_genDotPDAGGraphRec(dag, str); + es_addBufConstcstr(str, "}\n"); +} + +/** + * recursive handler for statistics DOT graph generator. + */ +static void +ln_genStatsDotPDAGGraphRec(struct ln_pdag *dag, FILE *const __restrict__ fp) +{ + if(dag->flags.visited) + return; /* already processed this subpart */ + dag->flags.visited = 1; + fprintf(fp, "l%p [ label=\"%u:%u\"", dag, + dag->stats.called, dag->stats.backtracked); + + if(isLeaf(dag)) { + fprintf(fp, " style=\"bold\""); + } + fprintf(fp, "]\n"); + + /* display field subdags */ + + for(int i = 0 ; i < dag->nparsers ; ++i) { + ln_parser_t *const prs = dag->parsers+i; + if(prs->node->stats.called == 0) + continue; + fprintf(fp, "l%p -> l%p [label=\"", dag, prs->node); + if(prs->prsid == PRS_LITERAL) { + for(const char *p = ((struct data_Literal*)prs->parser_data)->lit ; *p ; ++p) { + if(*p != '\\' && *p != '"') + fputc(*p, fp); + } + } else { + fprintf(fp, "%s", parserName(prs->prsid)); + } + fprintf(fp, "\" style=\"dotted\"]\n"); + ln_genStatsDotPDAGGraphRec(prs->node, fp); + } +} + + +static void +ln_genStatsDotPDAGGraph(struct ln_pdag *dag, FILE *const fp) +{ + ln_pdagClearVisited(dag->ctx); + fprintf(fp, "digraph pdag {\n"); + ln_genStatsDotPDAGGraphRec(dag, fp); + fprintf(fp, "}\n"); +} + +void +ln_fullPDagStatsDOT(ln_ctx ctx, FILE *const fp) +{ + ln_genStatsDotPDAGGraph(ctx->pdag, fp); +} + + +static inline int +addOriginalMsg(const char *str, const size_t strLen, struct json_object *const json) +{ + int r = 1; + struct json_object *value; + + value = json_object_new_string_len(str, strLen); + if (value == NULL) { + goto done; + } + json_object_object_add(json, ORIGINAL_MSG_KEY, value); + r = 0; +done: + return r; +} + +static char * +strrev(char *const __restrict__ str) +{ + char ch; + size_t i = strlen(str)-1,j=0; + while(i>j) + { + ch = str[i]; + str[i]= str[j]; + str[j] = ch; + i--; + j++; + } + return str; +} + +/* note: "originalmsg" is NOT added as metadata in order to keep + * backwards compatible. 
+ */ +static inline void +addRuleMetadata(npb_t *const __restrict__ npb, + struct json_object *const json, + struct ln_pdag *const __restrict__ endNode) +{ + ln_ctx ctx = npb->ctx; + struct json_object *meta = NULL; + struct json_object *meta_rule = NULL; + struct json_object *value; + + if(ctx->opts & LN_CTXOPT_ADD_RULE) { /* matching rule mockup */ + if(meta_rule == NULL) + meta_rule = json_object_new_object(); + char *cstr = strrev(es_str2cstr(npb->rule, NULL)); + json_object_object_add(meta_rule, RULE_MOCKUP_KEY, + json_object_new_string(cstr)); + free(cstr); + } + + if(ctx->opts & LN_CTXOPT_ADD_RULE_LOCATION) { + if(meta_rule == NULL) + meta_rule = json_object_new_object(); + struct json_object *const location = json_object_new_object(); + value = json_object_new_string(endNode->rb_file); + json_object_object_add(location, "file", value); + value = json_object_new_int((int)endNode->rb_lineno); + json_object_object_add(location, "line", value); + json_object_object_add(meta_rule, RULE_LOCATION_KEY, location); + } + + if(meta_rule != NULL) { + if(meta == NULL) + meta = json_object_new_object(); + json_object_object_add(meta, META_RULE_KEY, meta_rule); + } + +#ifdef ADVANCED_STATS + /* complete execution path */ + if(ctx->opts & LN_CTXOPT_ADD_EXEC_PATH) { + if(meta == NULL) + meta = json_object_new_object(); + char hdr[128]; + const size_t lenhdr + = snprintf(hdr, sizeof(hdr), "[PATHLEN:%d, PARSER CALLS gen:%d, literal:%d]", + npb->astats.pathlen, npb->astats.parser_calls, + npb->astats.lit_parser_calls); + es_addBuf(&npb->astats.exec_path, hdr, lenhdr); + char * cstr = es_str2cstr(npb->astats.exec_path, NULL); + value = json_object_new_string(cstr); + if (value != NULL) { + json_object_object_add(meta, EXEC_PATH_KEY, value); + } + free(cstr); + } +#endif + + if(meta != NULL) + json_object_object_add(json, META_KEY, meta); +} + + +/** + * add unparsed string to event. + */ +static inline int +addUnparsedField(const char *str, const size_t strLen, const size_t offs, struct json_object *json) +{ + int r = 1; + struct json_object *value; + + CHKR(addOriginalMsg(str, strLen, json)); + + value = json_object_new_string(str + offs); + if (value == NULL) { + goto done; + } + json_object_object_add(json, UNPARSED_DATA_KEY, value); + + r = 0; +done: + return r; +} + + +/* Do some fixup to the json that we cannot do on a lower layer */ +static int +fixJSON(struct ln_pdag *dag, + struct json_object **value, + struct json_object *json, + const ln_parser_t *const prs) + +{ + int r = LN_WRONGPARSER; + + if(prs->name == NULL) { + if (*value != NULL) { + /* Free the unneeded value */ + json_object_put(*value); + } + } else if(prs->name[0] == '.' 
&& prs->name[1] == '\0') { + if(json_object_get_type(*value) == json_type_object) { + struct json_object_iterator it = json_object_iter_begin(*value); + struct json_object_iterator itEnd = json_object_iter_end(*value); + while (!json_object_iter_equal(&it, &itEnd)) { + struct json_object *const val = json_object_iter_peek_value(&it); + json_object_get(val); + json_object_object_add(json, json_object_iter_peek_name(&it), val); + json_object_iter_next(&it); + } + json_object_put(*value); + } else { + LN_DBGPRINTF(dag->ctx, "field name is '.', but json type is %s", + json_type_to_name(json_object_get_type(*value))); + json_object_object_add_ex(json, prs->name, *value, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + } else { + int isDotDot = 0; + struct json_object *valDotDot = NULL; + if(json_object_get_type(*value) == json_type_object) { + /* TODO: this needs to be speeded up by just checking the first + * member and ensuring there is only one member. This requires + * extensions to libfastjson. + */ + int nSubobj = 0; + struct json_object_iterator it = json_object_iter_begin(*value); + struct json_object_iterator itEnd = json_object_iter_end(*value); + while (!json_object_iter_equal(&it, &itEnd)) { + ++nSubobj; + const char *key = json_object_iter_peek_name(&it); + if(key[0] == '.' && key[1] == '.' && key[2] == '\0') { + isDotDot = 1; + valDotDot = json_object_iter_peek_value(&it); + } else { + isDotDot = 0; + } + json_object_iter_next(&it); + } + if(nSubobj != 1) + isDotDot = 0; + } + if(isDotDot) { + LN_DBGPRINTF(dag->ctx, "subordinate field name is '..', combining"); + json_object_get(valDotDot); + json_object_put(*value); + json_object_object_add_ex(json, prs->name, valDotDot, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } else { + json_object_object_add_ex(json, prs->name, *value, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + } + r = 0; + return r; +} + +// TODO: streamline prototype when done with changes + +static int +tryParser(npb_t *const __restrict__ npb, + struct ln_pdag *dag, + size_t *offs, + size_t *const __restrict__ pParsed, + struct json_object **value, + const ln_parser_t *const prs + ) +{ + int r; + struct ln_pdag *endNode = NULL; + size_t parsedTo = npb->parsedTo; +# ifdef ADVANCED_STATS + char hdr[16]; + const size_t lenhdr + = snprintf(hdr, sizeof(hdr), "%d:", npb->astats.recursion_level); + es_addBuf(&npb->astats.exec_path, hdr, lenhdr); + if(prs->prsid == PRS_LITERAL) { + es_addChar(&npb->astats.exec_path, '\''); + es_addBuf(&npb->astats.exec_path, + ln_DataForDisplayLiteral(dag->ctx, + prs->parser_data), + strlen(ln_DataForDisplayLiteral(dag->ctx, + prs->parser_data)) + ); + es_addChar(&npb->astats.exec_path, '\''); + } else if(parser_lookup_table[prs->prsid].parser + == ln_v2_parseCharTo) { + es_addBuf(&npb->astats.exec_path, + ln_DataForDisplayCharTo(dag->ctx, + prs->parser_data), + strlen(ln_DataForDisplayCharTo(dag->ctx, + prs->parser_data)) + ); + } else { + es_addBuf(&npb->astats.exec_path, + parserName(prs->prsid), + strlen(parserName(prs->prsid)) ); + } + es_addChar(&npb->astats.exec_path, ','); +# endif + + if(prs->prsid == PRS_CUSTOM_TYPE) { + if(*value == NULL) + *value = json_object_new_object(); + LN_DBGPRINTF(dag->ctx, "calling custom parser '%s'", prs->custType->name); + r = ln_normalizeRec(npb, prs->custType->pdag, *offs, 1, *value, &endNode); + LN_DBGPRINTF(dag->ctx, "called CUSTOM PARSER '%s', result %d, " + "offs %zd, *pParsed %zd", prs->custType->name, r, *offs, *pParsed); + 
*pParsed = npb->parsedTo - *offs;
+ #ifdef ADVANCED_STATS
+ es_addBuf(&npb->astats.exec_path, hdr, lenhdr);
+ es_addBuf(&npb->astats.exec_path, "[R:USR],", 8);
+ #endif
+ } else {
+ r = parser_lookup_table[prs->prsid].parser(npb,
+ offs, prs->parser_data, pParsed, (prs->name == NULL) ? NULL : value);
+ }
+ LN_DBGPRINTF(npb->ctx, "parser lookup returns %d, pParsed %zu", r, *pParsed);
+ npb->parsedTo = parsedTo;
+
+#ifdef ADVANCED_STATS
+ ++advstats_parsers_called;
+ ++npb->astats.parser_calls;
+ if(prs->prsid == PRS_LITERAL)
+ ++npb->astats.lit_parser_calls;
+ if(r == 0)
+ ++advstats_parsers_success;
+ if(prs->prsid != PRS_CUSTOM_TYPE) {
+ ++parser_lookup_table[prs->prsid].called;
+ if(r == 0)
+ ++parser_lookup_table[prs->prsid].success;
+ }
+#endif
+ return r;
+}
+
+
+static void
+add_str_reversed(npb_t *const __restrict__ npb,
+ const char *const __restrict__ str,
+ const size_t len)
+{
+ ssize_t i;
+ for(i = len - 1 ; i >= 0 ; --i) {
+ es_addChar(&npb->rule, str[i]);
+ }
+}
+
+
+/* Add the current parser to the mockup rule.
+ * Note: we add reversed strings, because we can call this
+ * function effectively only when walking up the tree.
+ * This means deepest entries come first. We solve this somewhat
+ * elegantly by reversing strings, and then reversing the string
+ * once more when we emit it, so that we get the right order.
+ */
+static inline void
+add_rule_to_mockup(npb_t *const __restrict__ npb,
+ const ln_parser_t *const __restrict__ prs)
+{
+ if(prs->prsid == PRS_LITERAL) {
+ const char *const val =
+ ln_DataForDisplayLiteral(npb->ctx,
+ prs->parser_data);
+ add_str_reversed(npb, val, strlen(val));
+ } else {
+ /* note: name/value order must also be reversed! */
+ es_addChar(&npb->rule, '%');
+ add_str_reversed(npb,
+ parserName(prs->prsid),
+ strlen(parserName(prs->prsid)) );
+ es_addChar(&npb->rule, ':');
+ if(prs->name == NULL) {
+ es_addChar(&npb->rule, '-');
+ } else {
+ add_str_reversed(npb, prs->name, strlen(prs->name));
+ }
+ es_addChar(&npb->rule, '%');
+ }
+}
+
+/**
+ * Recursive step of the normalizer. It walks the parse dag and calls itself
+ * recursively when this is appropriate. It also implements backtracking in
+ * those (hopefully rare) cases where it is required.
+ *
+ * @param[in] dag current tree to process
+ * @param[in] string string to be matched against (the to-be-normalized data)
+ * @param[in] strLen length of the to-be-matched string
+ * @param[in] offs start position in input data
+ * @param[out] pParsedTo ptr to position up to which the parsing succeeded, at most
+ * @param[in/out] json ... that is being created during normalization
+ * @param[out] endNode if a match was found, this is the matching node (undefined otherwise)
+ *
+ * @return regular liblognorm error code (0->OK, something else->error)
+ * TODO: can we use parameter block to prevent pushing params to the stack?
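+ *
+ * Rough flow sketch (illustrative only, simplified from the code below):
+ *   for each parser attached to the current dag node:
+ *       if the parser matches at offs, recurse into its target node;
+ *       on success, attach the parsed value via fixJSON() and stop;
+ *       otherwise drop the value and backtrack to try the next parser.
+ *   finally, if this node is terminal and the input is consumed (or a
+ *   partial match is permitted), report success via *endNode.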
+ */ +int +ln_normalizeRec(npb_t *const __restrict__ npb, + struct ln_pdag *dag, + const size_t offs, + const int bPartialMatch, + struct json_object *json, + struct ln_pdag **endNode + ) +{ + int r = LN_WRONGPARSER; + int localR; + size_t i; + size_t iprs; + size_t parsedTo = npb->parsedTo; + size_t parsed = 0; + struct json_object *value; + +LN_DBGPRINTF(dag->ctx, "%zu: enter parser, dag node %p, json %p", offs, dag, json); + + ++dag->stats.called; +#ifdef ADVANCED_STATS + ++npb->astats.pathlen; + ++npb->astats.recursion_level; +#endif + + /* now try the parsers */ + for(iprs = 0 ; iprs < dag->nparsers && r != 0 ; ++iprs) { + const ln_parser_t *const prs = dag->parsers + iprs; + if(dag->ctx->debug) { + LN_DBGPRINTF(dag->ctx, "%zu/%d:trying '%s' parser for field '%s', " + "data '%s'", + offs, bPartialMatch, parserName(prs->prsid), prs->name, + (prs->prsid == PRS_LITERAL) + ? ln_DataForDisplayLiteral(dag->ctx, prs->parser_data) + : "UNKNOWN"); + } + i = offs; + value = NULL; + localR = tryParser(npb, dag, &i, &parsed, &value, prs); + if(localR == 0) { + parsedTo = i + parsed; + /* potential hit, need to verify */ + LN_DBGPRINTF(dag->ctx, "%zu: potential hit, trying subtree %p", + offs, prs->node); + r = ln_normalizeRec(npb, prs->node, parsedTo, + bPartialMatch, json, endNode); + LN_DBGPRINTF(dag->ctx, "%zu: subtree returns %d, parsedTo %zu", offs, r, parsedTo); + if(r == 0) { + LN_DBGPRINTF(dag->ctx, "%zu: parser matches at %zu", offs, i); + CHKR(fixJSON(dag, &value, json, prs)); + if(npb->ctx->opts & LN_CTXOPT_ADD_RULE) { + add_rule_to_mockup(npb, prs); + } + } else { + ++dag->stats.backtracked; + #ifdef ADVANCED_STATS + ++npb->astats.backtracked; + es_addBuf(&npb->astats.exec_path, "[B]", 3); + #endif + LN_DBGPRINTF(dag->ctx, "%zu nonmatch, backtracking required, parsed to=%zu", + offs, parsedTo); + if (value != NULL) { /* Free the value if it was created */ + json_object_put(value); + } + } + } + /* did we have a longer parser --> then update */ + if(parsedTo > npb->parsedTo) + npb->parsedTo = parsedTo; + LN_DBGPRINTF(dag->ctx, "parsedTo %zu, *pParsedTo %zu", parsedTo, npb->parsedTo); + } + +LN_DBGPRINTF(dag->ctx, "offs %zu, strLen %zu, isTerm %d", offs, npb->strLen, dag->flags.isTerminal); + if(dag->flags.isTerminal && (offs == npb->strLen || bPartialMatch)) { + *endNode = dag; + r = 0; + goto done; + } + +done: + LN_DBGPRINTF(dag->ctx, "%zu returns %d, pParsedTo %zu, parsedTo %zu", + offs, r, npb->parsedTo, parsedTo); +# ifdef ADVANCED_STATS + --npb->astats.recursion_level; +# endif + return r; +} + +int +ln_normalize(ln_ctx ctx, const char *str, const size_t strLen, struct json_object **json_p) +{ + int r; + struct ln_pdag *endNode = NULL; + /* old cruft */ + if(ctx->version == 1) { + r = ln_v1_normalize(ctx, str, strLen, json_p); + goto done; + } + /* end old cruft */ + + npb_t npb; + memset(&npb, 0, sizeof(npb)); + npb.ctx = ctx; + npb.str = str; + npb.strLen = strLen; + if(ctx->opts & LN_CTXOPT_ADD_RULE) { + npb.rule = es_newStr(1024); + } +# ifdef ADVANCED_STATS + npb.astats.exec_path = es_newStr(1024); +# endif + + if(*json_p == NULL) { + CHKN(*json_p = json_object_new_object()); + } + + r = ln_normalizeRec(&npb, ctx->pdag, 0, 0, *json_p, &endNode); + + if(ctx->debug) { + if(r == 0) { + LN_DBGPRINTF(ctx, "final result for normalizer: parsedTo %zu, endNode %p, " + "isTerminal %d, tagbucket %p", + npb.parsedTo, endNode, endNode->flags.isTerminal, endNode->tags); + } else { + LN_DBGPRINTF(ctx, "final result for normalizer: parsedTo %zu, endNode %p", + npb.parsedTo, endNode); + } + 
} + LN_DBGPRINTF(ctx, "DONE, final return is %d", r); + if(r == 0 && endNode->flags.isTerminal) { + /* success, finalize event */ + if(endNode->tags != NULL) { + /* add tags to an event */ + json_object_get(endNode->tags); + json_object_object_add(*json_p, "event.tags", endNode->tags); + CHKR(ln_annotate(ctx, *json_p, endNode->tags)); + } + if(ctx->opts & LN_CTXOPT_ADD_ORIGINALMSG) { + /* originalmsg must be kept outside of metadata for + * backward compatibility reasons. + */ + json_object_object_add(*json_p, ORIGINAL_MSG_KEY, + json_object_new_string_len(str, strLen)); + } + addRuleMetadata(&npb, *json_p, endNode); + r = 0; + } else { + addUnparsedField(str, strLen, npb.parsedTo, *json_p); + } + + if(ctx->opts & LN_CTXOPT_ADD_RULE) { + es_deleteStr(npb.rule); + } + +#ifdef ADVANCED_STATS + if(r != 0) + es_addBuf(&npb.astats.exec_path, "[FAILED]", 8); + else if(!endNode->flags.isTerminal) + es_addBuf(&npb.astats.exec_path, "[FAILED:NON-TERMINAL]", 21); + if(npb.astats.pathlen < ADVSTATS_MAX_ENTITIES) + advstats_pathlens[npb.astats.pathlen]++; + if(npb.astats.pathlen > advstats_max_pathlen) { + advstats_max_pathlen = npb.astats.pathlen; + } + if(npb.astats.backtracked < ADVSTATS_MAX_ENTITIES) + advstats_backtracks[npb.astats.backtracked]++; + if(npb.astats.backtracked > advstats_max_backtracked) { + advstats_max_backtracked = npb.astats.backtracked; + } + + /* parser calls */ + if(npb.astats.parser_calls < ADVSTATS_MAX_ENTITIES) + advstats_parser_calls[npb.astats.parser_calls]++; + if(npb.astats.parser_calls > advstats_max_parser_calls) { + advstats_max_parser_calls = npb.astats.parser_calls; + } + if(npb.astats.lit_parser_calls < ADVSTATS_MAX_ENTITIES) + advstats_lit_parser_calls[npb.astats.lit_parser_calls]++; + if(npb.astats.lit_parser_calls > advstats_max_lit_parser_calls) { + advstats_max_lit_parser_calls = npb.astats.lit_parser_calls; + } + + es_deleteStr(npb.astats.exec_path); +#endif +done: return r; +} diff --git a/src/pdag.h b/src/pdag.h new file mode 100644 index 0000000..a9a1a3d --- /dev/null +++ b/src/pdag.h @@ -0,0 +1,268 @@ +/** + * @file pdag.h + * @brief The parse DAG object. + * @class ln_pdag pdag.h + *//* + * Copyright 2015 by Rainer Gerhards and Adiscon GmbH. + * + * Released under ASL 2.0. + */ +#ifndef LIBLOGNORM_PDAG_H_INCLUDED +#define LIBLOGNORM_PDAG_H_INCLUDED +#include +#include +#include + +#define META_KEY "metadata" +#define ORIGINAL_MSG_KEY "originalmsg" +#define UNPARSED_DATA_KEY "unparsed-data" +#define EXEC_PATH_KEY "exec-path" +#define META_RULE_KEY "rule" +#define RULE_MOCKUP_KEY "mockup" +#define RULE_LOCATION_KEY "location" + +typedef struct ln_pdag ln_pdag; /**< the parse DAG object */ +typedef struct ln_parser_s ln_parser_t; +typedef struct npb npb_t; +typedef uint8_t prsid_t; + +struct ln_type_pdag; + +/** + * parser IDs. + * + * These identfy a parser. VERY IMPORTANT: they must start at zero + * and continously increment. They must exactly match the index + * of the respective parser inside the parser lookup table. 
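+ *
+ * For illustration: PRS_LITERAL is 0 (see the defines below), so
+ * parser_lookup_table[0] must describe the literal parser; PRS_REPEAT
+ * is 1, so index 1 must describe the "repeat" parser; and so on.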
+ */ +#define PRS_LITERAL 0 +#define PRS_REPEAT 1 +#if 0 +#define PRS_DATE_RFC3164 1 +#define PRS_DATE_RFC5424 2 +#define PRS_NUMBER 3 +#define PRS_FLOAT 4 +#define PRS_HEXNUMBER 5 +#define PRS_KERNEL_TIMESTAMP 6 +#define PRS_WHITESPACE 7 +#define PRS_IPV4 8 +#define PRS_IPV6 9 +#define PRS_WORD 10 +#define PRS_ALPHA 11 +#define PRS_REST 12 +#define PRS_OP_QUOTED_STRING 13 +#define PRS_QUOTED_STRING 14 +#define PRS_DATE_ISO 15 +#define PRS_TIME_24HR 16 +#define PRS_TIME_12HR 17 +#define PRS_DURATION 18 +#define PRS_CISCO_INTERFACE_SPEC 19 +#define PRS_NAME_VALUE_LIST 20 +#define PRS_JSON 21 +#define PRS_CEE_SYSLOG 22 +#define PRS_MAC48 23 +#define PRS_CEF 24 +#define PRS_CHECKPOINT_LEA 25 +#define PRS_v2_IPTABLES 26 +#define PRS_STRING_TO 27 +#define PRS_CHAR_TO 28 +#define PRS_CHAR_SEP 29 +#endif + +#define PRS_CUSTOM_TYPE 254 +#define PRS_INVALID 255 +/* NOTE: current max limit on parser ID is 255, because we use uint8_t + * for the prsid_t type (which gains cache performance). If more parsers + * come up, the type must be modified. + */ +/** + * object describing a specific parser instance. + */ +struct ln_parser_s { + prsid_t prsid; /**< parser ID (for lookup table) */ + ln_pdag *node; /**< node to branch to if parser succeeded */ + void *parser_data; /**< opaque data that the field-parser understands */ + struct ln_type_pdag *custType; /**< points to custom type, if such is used */ + int prio; /**< priority (combination of user- and parser-specific parts) */ + const char *name; /**< field name */ + const char *conf; /**< configuration as printable json for comparison reasons */ +}; + +struct ln_parser_info { + const char *name; /**< parser name as used in rule base */ + int prio; /**< parser specific prio in range 0..255 */ + int (*construct)(ln_ctx ctx, json_object *const json, void **); + int (*parser)(npb_t *npb, size_t*, void *const, + size_t*, struct json_object **); /**< parser to use */ + void (*destruct)(ln_ctx, void *const); /* note: destructor is only needed if parser data exists */ +#ifdef ADVANCED_STATS + uint64_t called; + uint64_t success; +#endif +}; + + +/* parse DAG object + */ +struct ln_pdag { + ln_ctx ctx; /**< our context */ // TODO: why do we need it? 
+ ln_parser_t *parsers; /* array of parsers to try */ + prsid_t nparsers; /**< current table size (prsid_t slighly abused) */ + struct { + unsigned isTerminal:1; /**< designates this node a terminal sequence */ + unsigned visited:1; /**< work var for recursive procedures */ + } flags; + struct json_object *tags; /**< tags to assign to events of this type */ + int refcnt; /**< reference count for deleting tracking */ + struct { + unsigned called; + unsigned backtracked; /**< incremented when backtracking was initiated */ + unsigned terminated; + } stats; /**< usage statistics */ + const char *rb_id; /**< human-readable rulebase identifier, for stats etc */ + + // experimental, move outside later + const char *rb_file; + unsigned int rb_lineno; +}; + +#ifdef ADVANCED_STATS +struct advstats { + int pathlen; + int parser_calls; /**< parser calls in general during path */ + int lit_parser_calls; /**< same just for the literal parser */ + int backtracked; + int recursion_level; + es_str_t *exec_path; +}; +#define ADVSTATS_MAX_ENTITIES 100 +extern int advstats_max_pathlen; +extern int advstats_pathlens[ADVSTATS_MAX_ENTITIES]; +extern int advstats_max_backtracked; +extern int advstats_backtracks[ADVSTATS_MAX_ENTITIES]; +#endif + +/** the "normalization paramater block" (npb) + * This structure is passed to all normalization routines including + * parsers. It contains data that commonly needs to be passed, + * like the to be parsed string and its length, as well as read/write + * data which is used to track information over the general + * normalization process (like the execution path, if requested). + * The main purpose is to save stack writes by eliminating the + * need for using multiple function parameters. Note that it + * must be carefully considered which items to add to the + * npb - those that change from recursion level to recursion + * level are NOT to be placed here. + */ +struct npb { + ln_ctx ctx; + const char *str; /**< to-be-normalized message */ + size_t strLen; /**< length of it */ + size_t parsedTo; /**< up to which byte could this be parsed? */ + es_str_t *rule; /**< a mock-up of the rule used to parse */ + es_str_t *exec_path; +#ifdef ADVANCED_STATS + int pathlen; + int backtracked; + int recursion_level; + struct advstats astats; +#endif +}; + +/* Methods */ + +/** + * Allocates and initializes a new parse DAG node. + * @memberof ln_pdag + * + * @param[in] ctx current library context. This MUST match the + * context of the parent. + * @param[in] parent pointer to the new node inside the parent + * + * @return pointer to new node or NULL on error + */ +struct ln_pdag* ln_newPDAG(ln_ctx ctx); + + +/** + * Free a parse DAG and destruct all members. + * @memberof ln_pdag + * + * @param[in] DAG pointer to pdag to free + */ +void ln_pdagDelete(struct ln_pdag *DAG); + + +/** + * Add parser to dag node. + * Works on unoptimzed dag. + * + * @param[in] pdag pointer to pdag to modify + * @param[in] parser parser definition + * @returns 0 on success, something else otherwise + */ +int ln_pdagAddParser(ln_ctx ctx, struct ln_pdag **pdag, json_object *); + + +/** + * Display the content of a pdag (debug function). + * This is a debug aid that spits out a textual representation + * of the provided pdag via multiple calls of the debug callback. + * + * @param DAG pdag to display + */ +void ln_displayPDAG(ln_ctx ctx); + + +/** + * Generate a DOT graph. + * Well, actually it does not generate the graph itself, but a + * control file that is suitable for the GNU DOT tool. 
Such a file + * can be very useful to understand complex sample databases + * (not to mention that it is probably fun for those creating + * samples). + * The dot commands are appended to the provided string. + * + * @param[in] DAG pdag to display + * @param[out] str string which receives the DOT commands. + */ +void ln_genDotPDAGGraph(struct ln_pdag *DAG, es_str_t **str); + + +/** + * Build a pdag based on the provided string, but only if necessary. + * The passed-in DAG is searched and traversed for str. If a node exactly + * matching str is found, that node is returned. If no exact match is found, + * a new node is added. Existing nodes may be split, if a so-far common + * prefix needs to be split in order to add the new node. + * + * @param[in] DAG root of the current DAG + * @param[in] str string to be added + * @param[in] offs offset into str where match needs to start + * (this is required for recursive calls to handle + * common prefixes) + * @return NULL on error, otherwise the pdag leaf that + * corresponds to the parameters passed. + */ +struct ln_pdag * ln_buildPDAG(struct ln_pdag *DAG, es_str_t *str, size_t offs); + + +prsid_t ln_parserName2ID(const char *const __restrict__ name); +int ln_pdagOptimize(ln_ctx ctx); +void ln_fullPdagStats(ln_ctx ctx, FILE *const fp, const int); +ln_parser_t * ln_newLiteralParser(ln_ctx ctx, char lit); +ln_parser_t* ln_newParser(ln_ctx ctx, json_object *const prscnf); +struct ln_type_pdag * ln_pdagFindType(ln_ctx ctx, const char *const __restrict__ name, const int bAdd); +void ln_fullPDagStatsDOT(ln_ctx ctx, FILE *const fp); + +/* friends */ +int +ln_normalizeRec(npb_t *const __restrict__ npb, + struct ln_pdag *dag, + const size_t offs, + const int bPartialMatch, + struct json_object *json, + struct ln_pdag **endNode +); + +#endif /* #ifndef LOGNORM_PDAG_H_INCLUDED */ diff --git a/src/samp.c b/src/samp.c new file mode 100644 index 0000000..fc00baf --- /dev/null +++ b/src/samp.c @@ -0,0 +1,1191 @@ +/* samp.c -- code for ln_samp objects. + * This code handles rulebase processing. Rulebases have been called + * "sample bases" in the early days of liblognorm, thus the name. + * + * Copyright 2010-2018 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. 
+ */ +#include "config.h" +#include +#include +#include +#include +#include +#include +#include + +#include "liblognorm.h" +#include "lognorm.h" +#include "samp.h" +#include "internal.h" +#include "parser.h" +#include "pdag.h" +#include "v1_liblognorm.h" +#include "v1_ptree.h" + +void +ln_sampFree(ln_ctx __attribute__((unused)) ctx, struct ln_samp *samp) +{ + free(samp); +} + +static int +ln_parseLegacyFieldDescr(ln_ctx ctx, + const char *const buf, + const size_t lenBuf, + size_t *bufOffs, + es_str_t **str, + json_object **prscnf) +{ + int r = 0; + char *cstr; /* for debug mode strings */ + char *ftype = NULL; + char name[MAX_FIELDNAME_LEN]; + size_t iDst; + struct json_object *json = NULL; + char *ed = NULL; + es_size_t i = *bufOffs; + es_str_t *edata = NULL; + + for( iDst = 0 + ; iDst < (MAX_FIELDNAME_LEN - 1) && i < lenBuf && buf[i] != ':' + ; ++iDst) { + name[iDst] = buf[i++]; + } + name[iDst] = '\0'; + if(iDst == (MAX_FIELDNAME_LEN - 1)) { + ln_errprintf(ctx, 0, "field name too long in: %s", buf+(*bufOffs)); + FAIL(LN_INVLDFDESCR); + } + if(i == lenBuf) { + ln_errprintf(ctx, 0, "field definition wrong in: %s", buf+(*bufOffs)); + FAIL(LN_INVLDFDESCR); + } + + if(iDst == 0) { + FAIL(LN_INVLDFDESCR); + } + + if(ctx->debug) { + ln_dbgprintf(ctx, "parsed field: '%s'", name); + } + + if(buf[i] != ':') { + ln_errprintf(ctx, 0, "missing colon in: %s", buf+(*bufOffs)); + FAIL(LN_INVLDFDESCR); + } + ++i; /* skip ':' */ + + /* parse and process type (trailing whitespace must be trimmed) */ + es_emptyStr(*str); + size_t j = i; + /* scan for terminator */ + while(j < lenBuf && buf[j] != ':' && buf[j] != '{' && buf[j] != '%') + ++j; + /* now trim trailing space backwards */ + size_t next = j; + --j; + while(j >= i && isspace(buf[j])) + --j; + /* now copy */ + while(i <= j) { + CHKR(es_addChar(str, buf[i++])); + } + /* finally move i to consumed position */ + i = next; + + if(i == lenBuf) { + ln_errprintf(ctx, 0, "premature end (missing %%?) 
in: %s", buf+(*bufOffs)); + FAIL(LN_INVLDFDESCR); + } + + ftype = es_str2cstr(*str, NULL); + ln_dbgprintf(ctx, "field type '%s', i %d", ftype, i); + + if(buf[i] == '{') { + struct json_tokener *tokener = json_tokener_new(); + json = json_tokener_parse_ex(tokener, buf+i, (int) (lenBuf - i)); + if(json == NULL) { + ln_errprintf(ctx, 0, "invalid json in '%s'", buf+i); + } + i += tokener->char_offset; + json_tokener_free(tokener); + } + + if(buf[i] == '%') { + i++; + } else { + /* parse extra data */ + CHKN(edata = es_newStr(8)); + i++; + while(i < lenBuf) { + if(buf[i] == '%') { + ++i; + break; /* end of field */ + } + CHKR(es_addChar(&edata, buf[i++])); + } + es_unescapeStr(edata); + if(ctx->debug) { + cstr = es_str2cstr(edata, NULL); + ln_dbgprintf(ctx, "parsed extra data: '%s'", cstr); + free(cstr); + } + } + + struct json_object *val; + *prscnf = json_object_new_object(); + CHKN(val = json_object_new_string(name)); + json_object_object_add(*prscnf, "name", val); + CHKN(val = json_object_new_string(ftype)); + json_object_object_add(*prscnf, "type", val); + if(edata != NULL) { + ed = es_str2cstr(edata, " "); + CHKN(val = json_object_new_string(ed)); + json_object_object_add(*prscnf, "extradata", val); + } + if(json != NULL) { + /* now we need to merge the json params into the main object */ + struct json_object_iterator it = json_object_iter_begin(json); + struct json_object_iterator itEnd = json_object_iter_end(json); + while (!json_object_iter_equal(&it, &itEnd)) { + struct json_object *const v = json_object_iter_peek_value(&it); + json_object_get(v); + json_object_object_add(*prscnf, json_object_iter_peek_name(&it), v); + json_object_iter_next(&it); + } + } + + *bufOffs = i; +done: + free(ed); + if(edata != NULL) + es_deleteStr(edata); + free(ftype); + if(json != NULL) + json_object_put(json); + return r; +} + +/** + * Extract a field description from a sample. + * The field description is added to the tail of the current + * subtree's field list. The parse buffer must be position on the + * leading '%' that starts a field definition. It is a program error + * if this condition is not met. + * + * Note that we break up the object model and access ptree members + * directly. Let's consider us a friend of ptree. This is necessary + * to optimize the structure for a high-speed parsing process. + * + * @param[in] str a temporary work string. 
This is passed in to save the + * creation overhead + * @returns 0 on success, something else otherwise + */ +static int +addFieldDescr(ln_ctx ctx, struct ln_pdag **pdag, es_str_t *rule, + size_t *bufOffs, es_str_t **str) +{ + int r = 0; + es_size_t i = *bufOffs; + char *ftype = NULL; + const char *buf; + es_size_t lenBuf; + struct json_object *prs_config = NULL; + + buf = (const char*)es_getBufAddr(rule); + lenBuf = es_strlen(rule); + assert(buf[i] == '%'); + ++i; /* "eat" ':' */ + + /* skip leading whitespace in field name */ + while(i < lenBuf && isspace(buf[i])) + ++i; + /* check if we have new-style json config */ + if(buf[i] == '{' || buf[i] == '[') { + struct json_tokener *tokener = json_tokener_new(); + prs_config = json_tokener_parse_ex(tokener, buf+i, (int) (lenBuf - i)); + i += tokener->char_offset; + json_tokener_free(tokener); + if(prs_config == NULL || i == lenBuf || buf[i] != '%') { + ln_errprintf(ctx, 0, "invalid json in '%s'", buf+i); + r = -1; + goto done; + } + *bufOffs = i+1; /* eat '%' - if above ensures it is present */ + } else { + *bufOffs = i; + CHKR(ln_parseLegacyFieldDescr(ctx, buf, lenBuf, bufOffs, str, &prs_config)); + } + + CHKR(ln_pdagAddParser(ctx, pdag, prs_config)); + +done: + free(ftype); + return r; +} + + +/** + * Construct a literal parser json definition. + */ +static json_object * +newLiteralParserJSONConf(char lit) +{ + char buf[] = "x"; + buf[0] = lit; + struct json_object *val; + struct json_object *prscnf = json_object_new_object(); + + val = json_object_new_string("literal"); + json_object_object_add(prscnf, "type", val); + + val = json_object_new_string(buf); + json_object_object_add(prscnf, "text", val); + + return prscnf; +} + +/** + * Parse a Literal string out of the template and add it to the tree. + * This function is used to create the unoptimized tree. So we do + * one node for each character. These will be compacted by the optimizer + * in a later stage. The advantage is that we do not need to care about + * splitting the tree. As such the processing is fairly simple: + * + * for each character in literal (left-to-right): + * create literal parser object o + * add new DAG node o, advance to it + * + * @param[in] ctx the context + * @param[in/out] subtree on entry, current subtree, on exist newest + * deepest subtree + * @param[in] rule string with current rule + * @param[in/out] bufOffs parse pointer, up to which offset is parsed + * (is updated so that it points to first char after consumed + * string on exit). 
+ * @param str a work buffer, provided to prevent creation of a new object + * @return 0 on success, something else otherwise + */ +static int +parseLiteral(ln_ctx ctx, struct ln_pdag **pdag, es_str_t *rule, + size_t *const __restrict__ bufOffs, es_str_t **str) +{ + int r = 0; + size_t i = *bufOffs; + unsigned char *buf = es_getBufAddr(rule); + const size_t lenBuf = es_strlen(rule); + const char *cstr = NULL; + + es_emptyStr(*str); + while(i < lenBuf) { + if(buf[i] == '%') { + if(i+1 < lenBuf && buf[i+1] != '%') { + break; /* field start is end of literal */ + } + if (++i == lenBuf) break; + } + CHKR(es_addChar(str, buf[i])); + ++i; + } + + es_unescapeStr(*str); + cstr = es_str2cstr(*str, NULL); + if(ctx->debug) { + ln_dbgprintf(ctx, "parsed literal: '%s'", cstr); + } + + *bufOffs = i; + + /* we now add the string to the tree */ + for(i = 0 ; cstr[i] != '\0' ; ++i) { + struct json_object *const prscnf = + newLiteralParserJSONConf(cstr[i]); + CHKN(prscnf); + CHKR(ln_pdagAddParser(ctx, pdag, prscnf)); + } + + r = 0; + +done: + free((void*)cstr); + return r; +} + + +/* Implementation note: + * We read in the sample, and split it into chunks of literal text and + * fields. Each literal text is added as whole to the tree, as is each + * field individually. To do so, we keep track of our current subtree + * root, which changes whenever a new part of the tree is build. It is + * set to the then-lowest part of the tree, where the next step sample + * data is to be added. + * + * This function processes the whole string or returns an error. + * + * format: literal1%field:type:extra-data%literal2 + * + * @returns the new dag root (or NULL in case of error) + */ +static int +addSampToTree(ln_ctx ctx, + es_str_t *rule, + ln_pdag *dag, + struct json_object *tagBucket) +{ + int r = -1; + es_str_t *str = NULL; + size_t i; + + CHKN(str = es_newStr(256)); + i = 0; + while(i < es_strlen(rule)) { + LN_DBGPRINTF(ctx, "addSampToTree %zu of %d", i, es_strlen(rule)); + CHKR(parseLiteral(ctx, &dag, rule, &i, &str)); + /* After the literal there can be field only*/ + if (i < es_strlen(rule)) { + CHKR(addFieldDescr(ctx, &dag, rule, &i, &str)); + if (i == es_strlen(rule)) { + /* finish the tree with empty literal to avoid false merging*/ + CHKR(parseLiteral(ctx, &dag, rule, &i, &str)); + } + } + } + + LN_DBGPRINTF(ctx, "end addSampToTree %zu of %d", i, es_strlen(rule)); + /* we are at the end of rule processing, so this node is a terminal */ + dag->flags.isTerminal = 1; + dag->tags = tagBucket; + dag->rb_file = strdup(ctx->conf_file); + dag->rb_lineno = ctx->conf_ln_nbr; + +done: + if(str != NULL) + es_deleteStr(str); + return r; +} + + + +/** + * get the initial word of a rule line that tells us the type of the + * line. + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[out] offs offset after "=" + * @param[out] str string with "linetype-word" (newly created) + * @returns 0 on success, something else otherwise + */ +static int +getLineType(const char *buf, es_size_t lenBuf, size_t *offs, es_str_t **str) +{ + int r = -1; + size_t i; + + *str = es_newStr(16); + for(i = 0 ; i < lenBuf && buf[i] != '=' ; ++i) { + CHKR(es_addChar(str, buf[i])); + } + + if(i < lenBuf) + ++i; /* skip over '=' */ + *offs = i; + +done: return r; +} + + +/** + * Get a new common prefix from the config file. That is actually everything from + * the current offset to the end of line. 
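+ * For illustration (a made-up rulebase line), given
+ *    prefix=%date:date-rfc3164% %host:word%
+ * everything after "prefix=" becomes the common prefix that is prepended
+ * to every following rule line.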
+ * + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset after "=" + * @param[in/out] str string to store common offset. If NULL, it is created, + * otherwise it is emptied. + * @returns 0 on success, something else otherwise + */ +static int +getPrefix(const char *buf, es_size_t lenBuf, es_size_t offs, es_str_t **str) +{ + int r; + + if(*str == NULL) { + CHKN(*str = es_newStr(lenBuf - offs)); + } else { + es_emptyStr(*str); + } + + r = es_addBuf(str, (char*)buf + offs, lenBuf - offs); +done: return r; +} + +/** + * Extend the common prefix. This means that the line is concatenated + * to the prefix. This is useful if the same rulebase is to be used with + * different prefixes (well, not strictly necessary, but probably useful). + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset to-be-added text starts + * @returns 0 on success, something else otherwise + */ +static int +extendPrefix(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t offs) +{ + return es_addBuf(&ctx->rulePrefix, (char*)buf+offs, lenBuf - offs); +} + + +/** + * Add a tag to the tag bucket. Helper to processTags. + * @param[in] ctx current context + * @param[in] tagname string with tag name + * @param[out] tagBucket tagbucket to which new tags shall be added + * the tagbucket is created if it is NULL + * @returns 0 on success, something else otherwise + */ +static int +addTagStrToBucket(ln_ctx ctx, es_str_t *tagname, struct json_object **tagBucket) +{ + int r = -1; + char *cstr; + struct json_object *tag; + + if(*tagBucket == NULL) { + CHKN(*tagBucket = json_object_new_array()); + } + cstr = es_str2cstr(tagname, NULL); + ln_dbgprintf(ctx, "tag found: '%s'", cstr); + CHKN(tag = json_object_new_string(cstr)); + json_object_array_add(*tagBucket, tag); + free(cstr); + r = 0; + +done: return r; +} + + +/** + * Extract the tags and create a tag bucket out of them + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in,out] poffs offset where tags start, on exit and success + * offset after tag part (excluding ':') + * @param[out] tagBucket tagbucket to which new tags shall be added + * the tagbucket is created if it is NULL + * @returns 0 on success, something else otherwise + */ +static int +processTags(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t *poffs, struct json_object **tagBucket) +{ + int r = -1; + es_str_t *str = NULL; + es_size_t i; + + assert(poffs != NULL); + i = *poffs; + while(i < lenBuf && buf[i] != ':') { + if(buf[i] == ',') { + /* end of this tag */ + CHKR(addTagStrToBucket(ctx, str, tagBucket)); + es_deleteStr(str); + str = NULL; + } else { + if(str == NULL) { + CHKN(str = es_newStr(32)); + } + CHKR(es_addChar(&str, buf[i])); + } + ++i; + } + + if(buf[i] != ':') + goto done; + ++i; /* skip ':' */ + + if(str != NULL) { + CHKR(addTagStrToBucket(ctx, str, tagBucket)); + es_deleteStr(str); + } + + *poffs = i; + r = 0; + +done: return r; +} + + + +/** + * Process a new rule and add it to pdag. 
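+ * For illustration (hypothetical input): after getLineType() has consumed
+ * "rule=", a buffer such as
+ *    cisco,fw:denied connection from %src:ipv4%
+ * is handled here; "cisco" and "fw" become tags, and the text after the
+ * ':' is the sample that addSampToTree() turns into dag nodes.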
+ * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset where rule starts + * @returns 0 on success, something else otherwise + */ +static int +processRule(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t offs) +{ + int r = -1; + es_str_t *str; + struct json_object *tagBucket = NULL; + + ln_dbgprintf(ctx, "rule line to add: '%s'", buf+offs); + CHKR(processTags(ctx, buf, lenBuf, &offs, &tagBucket)); + + if(offs == lenBuf) { + ln_errprintf(ctx, 0, "error: actual message sample part is missing"); + goto done; + } + if(ctx->rulePrefix == NULL) { + CHKN(str = es_newStr(lenBuf)); + } else { + CHKN(str = es_strdup(ctx->rulePrefix)); + } + CHKR(es_addBuf(&str, (char*)buf + offs, lenBuf - offs)); + addSampToTree(ctx, str, ctx->pdag, tagBucket); + es_deleteStr(str); + r = 0; +done: return r; +} + + +static int +getTypeName(ln_ctx ctx, + const char *const __restrict__ buf, + const size_t lenBuf, + size_t *const __restrict__ offs, + char *const __restrict__ dstbuf) +{ + int r = -1; + size_t iDst; + size_t i = *offs; + + if(buf[i] != '@') { + ln_errprintf(ctx, 0, "user-defined type name must " + "start with '@'"); + goto done; + } + for( iDst = 0 + ; i < lenBuf && buf[i] != ':' && iDst < MAX_TYPENAME_LEN - 1 + ; ++i, ++iDst) { + if(isspace(buf[i])) { + ln_errprintf(ctx, 0, "user-defined type name must " + "not contain whitespace"); + goto done; + } + dstbuf[iDst] = buf[i]; + } + dstbuf[iDst] = '\0'; + + if(i < lenBuf && buf[i] == ':') { + r = 0, + *offs = i+1; /* skip ":" */ + } +done: + return r; +} + +/** + * Process a type definition and add it to the PDAG + * disconnected components. + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset where rule starts + * @returns 0 on success, something else otherwise + */ +static int +processType(ln_ctx ctx, + const char *const __restrict__ buf, + const size_t lenBuf, + size_t offs) +{ + int r = -1; + es_str_t *str; + char typename[MAX_TYPENAME_LEN]; + + ln_dbgprintf(ctx, "type line to add: '%s'", buf+offs); + CHKR(getTypeName(ctx, buf, lenBuf, &offs, typename)); + ln_dbgprintf(ctx, "type name is '%s'", typename); + + ln_dbgprintf(ctx, "type line to add: '%s'", buf+offs); + if(offs == lenBuf) { + ln_errprintf(ctx, 0, "error: actual message sample part is missing in type def"); + goto done; + } + // TODO: optimize + CHKN(str = es_newStr(lenBuf)); + CHKR(es_addBuf(&str, (char*)buf + offs, lenBuf - offs)); + struct ln_type_pdag *const td = ln_pdagFindType(ctx, typename, 1); + CHKN(td); + addSampToTree(ctx, str, td->pdag, NULL); + es_deleteStr(str); + r = 0; +done: return r; +} + + +/** + * Obtain a field name from a rule base line. + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in/out] offs on entry: offset where tag starts, + * on exit: updated offset AFTER TAG and (':') + * @param [out] strTag obtained tag, if successful + * @returns 0 on success, something else otherwise + */ +static int +getFieldName(ln_ctx __attribute__((unused)) ctx, const char *buf, es_size_t lenBuf, es_size_t *offs, +es_str_t **strTag) +{ + int r = -1; + es_size_t i; + + i = *offs; + while(i < lenBuf && + (isalnum(buf[i]) || buf[i] == '_' || buf[i] == '.')) { + if(*strTag == NULL) { + CHKN(*strTag = es_newStr(32)); + } + CHKR(es_addChar(strTag, buf[i])); + ++i; + } + *offs = i; + r = 0; +done: return r; +} + + +/** + * Skip over whitespace. 
+ * Skips any whitespace present at the offset. + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in/out] offs on entry: offset first unprocessed position + */ +static void +skipWhitespace(ln_ctx __attribute__((unused)) ctx, const char *buf, es_size_t lenBuf, es_size_t *offs) +{ + while(*offs < lenBuf && isspace(buf[*offs])) { + (*offs)++; + } +} + + +/** + * Obtain an annotation (field) operation. + * This usually is a plus or minus sign followed by a field name + * followed (if plus) by an equal sign and the field value. On entry, + * offs must be positioned on the first unprocessed field (after ':' for + * the initial field!). Extra whitespace is detected and, if present, + * skipped. The obtained operation is added to the annotation set provided. + * Note that extracted string objects are passed to the annotation; thus it + * is vital NOT to free them (most importantly, this is *not* a memory leak). + * + * @param[in] ctx current context + * @param[in] annot active annotation set to which the operation is to be added + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in/out] offs on entry: offset where tag starts, + * on exit: updated offset AFTER TAG and (':') + * @param [out] strTag obtained tag, if successful + * @returns 0 on success, something else otherwise + */ +static int +getAnnotationOp(ln_ctx ctx, ln_annot *annot, const char *buf, es_size_t lenBuf, es_size_t *offs) +{ + int r = -1; + es_size_t i; + es_str_t *fieldName = NULL; + es_str_t *fieldVal = NULL; + ln_annot_opcode opc; + + i = *offs; + skipWhitespace(ctx, buf, lenBuf, &i); + if(i == lenBuf) { + r = 0; + goto done; /* nothing left to process (no error!) */ + } + + switch(buf[i]) { + case '+': + opc = ln_annot_ADD; + break; + case '#': + ln_dbgprintf(ctx, "inline comment in 'annotate' line: %s", buf); + *offs = lenBuf; + r = 0; + goto done; + case '-': + ln_dbgprintf(ctx, "annotate op '-' not yet implemented - failing"); + /*FALLTHROUGH*/ + default:ln_errprintf(ctx, 0, "invalid annotate operation '%c': %s", buf[i], buf+i); + goto fail; + } + i++; + + if(i == lenBuf) goto fail; /* nothing left to process */ + + CHKR(getFieldName(ctx, buf, lenBuf, &i, &fieldName)); + if(i == lenBuf) goto fail; /* nothing left to process */ + if(buf[i] != '=') goto fail; /* format error */ + i++; + + skipWhitespace(ctx, buf, lenBuf, &i); + if(buf[i] != '"') goto fail; /* format error */ + ++i; + + while(i < lenBuf && buf[i] != '"') { + if(fieldVal == NULL) { + CHKN(fieldVal = es_newStr(32)); + } + CHKR(es_addChar(&fieldVal, buf[i])); + ++i; + } + *offs = (i == lenBuf) ? i : i+1; + CHKR(ln_addAnnotOp(annot, opc, fieldName, fieldVal)); + r = 0; +done: return r; +fail: return -1; +} + + +/** + * Process a new annotation and add it to the annotation set. + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset where annotation starts + * @returns 0 on success, something else otherwise + */ +static int +processAnnotate(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t offs) +{ + int r; + es_str_t *tag = NULL; + ln_annot *annot; + + ln_dbgprintf(ctx, "sample annotation to add: '%s'", buf+offs); + CHKR(getFieldName(ctx, buf, lenBuf, &offs, &tag)); + skipWhitespace(ctx, buf, lenBuf, &offs); + if(buf[offs] != ':' || tag == NULL) { + ln_dbgprintf(ctx, "invalid tag field in annotation, line is '%s'", buf); + r=-1; + goto done; + } + ++offs; + + /* we got an annotation! 
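+ * For illustration (made-up rulebase line):
+ *    annotate=cisco:+origin="firewall"
+ * yields the tag "cisco" and one ADD operation that sets field "origin"
+ * to "firewall" on events carrying that tag.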
*/ + CHKN(annot = ln_newAnnot(tag)); + + while(offs < lenBuf) { + CHKR(getAnnotationOp(ctx, annot, buf, lenBuf, &offs)); + } + + r = ln_addAnnotToSet(ctx->pas, annot); + +done: return r; +} + +/** + * Process include directive. This permits to add unlimited layers + * of include files. + * + * @param[in] ctx current context + * @param[in] buf line buffer, a C-string + * @param[in] offs offset where annotation starts + * @returns 0 on success, something else otherwise + */ +static int +processInclude(ln_ctx ctx, const char *buf, const size_t offs) +{ + int r; + const char *const conf_file_save = ctx->conf_file; + char *const fname = strdup(buf+offs); + size_t lenfname = strlen(fname); + const unsigned conf_ln_nbr_save = ctx->conf_ln_nbr; + + /* trim string - not optimized but also no need to */ + for(size_t i = lenfname - 1 ; i > 0 ; --i) { + if(isspace(fname[i])) { + fname[i] = '\0'; + --lenfname; + } + } + + CHKR(ln_loadSamples(ctx, fname)); + +done: + free(fname); + ctx->conf_file = conf_file_save; + ctx->conf_ln_nbr = conf_ln_nbr_save; + + return r; +} + +/** + * Reads a rule (sample) stored in buffer buf and creates a new ln_samp object + * out of it, which it adds to the pdag (if required). + * + * @param[ctx] ctx current library context + * @param[buf] cstr buffer containing the string contents of the sample + * @param[lenBuf] length of the sample contained within buf + * @return standard error code + */ +static int +ln_processSamp(ln_ctx ctx, const char *buf, const size_t lenBuf) +{ + int r = 0; + es_str_t *typeStr = NULL; + size_t offs; + + if(getLineType(buf, lenBuf, &offs, &typeStr) != 0) + goto done; + + if(!es_strconstcmp(typeStr, "prefix")) { + if(getPrefix(buf, lenBuf, offs, &ctx->rulePrefix) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "extendprefix")) { + if(extendPrefix(ctx, buf, lenBuf, offs) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "rule")) { + if(processRule(ctx, buf, lenBuf, offs) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "type")) { + if(processType(ctx, buf, lenBuf, offs) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "annotate")) { + if(processAnnotate(ctx, buf, lenBuf, offs) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "include")) { + CHKR(processInclude(ctx, buf, offs)); + } else { + char *str; + str = es_str2cstr(typeStr, NULL); + ln_errprintf(ctx, 0, "invalid record type detected: '%s'", str); + free(str); + goto done; + } + +done: + if(typeStr != NULL) + es_deleteStr(typeStr); + return r; +} + + +/** + * Read a character from our sample source. + */ +static int +ln_sampReadChar(const ln_ctx ctx, FILE *const __restrict__ repo, const char **inpbuf) +{ + int c; + assert((repo != NULL && inpbuf == NULL) || (repo == NULL && inpbuf != NULL)); + if(repo == NULL) { + c = (**inpbuf == '\0') ? EOF : *(*inpbuf)++; + } else { + c = fgetc(repo); + } + return c; +} + +/* note: comments are only supported at beginning of line! */ +/* skip to end of line */ +void +ln_sampSkipCommentLine(ln_ctx ctx, FILE * const __restrict__ repo, const char **inpbuf) +{ + int c; + do { + c = ln_sampReadChar(ctx, repo, inpbuf); + } while(c != EOF && c != '\n'); + ++ctx->conf_ln_nbr; +} + + +/* this checks if in a multi-line rule, the next line seems to be a new + * rule, which would meand we have some unmatched percent signs inside + * our rule (what we call a "runaway rule"). This can easily happen and + * is otherwise hard to debug, so let's see if it is the case... 
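+ *
+ * Illustrative example (made up, not from any shipped rulebase) of a
+ * runaway rule:
+ *    rule=:login from %ip:ipv4     <-- closing '%' missing
+ *    rule=:logout from %ip:ipv4%
+ * Without this check the second "rule=" line would silently be pulled
+ * into the still-open field of the first one.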
+ * @return 1 if this is a runaway rule, 0 if not + */ +int +ln_sampChkRunawayRule(ln_ctx ctx, FILE *const __restrict__ repo, const char **inpbuf) +{ + int r = 1; + fpos_t fpos; + char buf[6]; + int cont = 1; + int read; + + fgetpos(repo, &fpos); + while(cont) { + fpos_t inner_fpos; + fgetpos(repo, &inner_fpos); + if((read = fread(buf, sizeof(char), sizeof(buf)-1, repo)) == 0) { + r = 0; + goto done; + } + if(buf[0] == '\n') { + fsetpos(repo, &inner_fpos); + if(fread(buf, sizeof(char), 1, repo)) {}; /* skip '\n' */ + continue; + } else if(buf[0] == '#') { + fsetpos(repo, &inner_fpos); + const unsigned conf_ln_nbr_save = ctx->conf_ln_nbr; + ln_sampSkipCommentLine(ctx, repo, inpbuf); + ctx->conf_ln_nbr = conf_ln_nbr_save; + continue; + } + if(read != 5) + goto done; /* cannot be a rule= line! */ + cont = 0; /* no comment, so we can decide */ + buf[5] = '\0'; + if(!strncmp(buf, "rule=", 5)) { + ln_errprintf(ctx, 0, "line has 'rule=' at begin of line, which " + "does look like a typo in the previous lines (unmatched " + "%% character) and is forbidden. If valid, please re-format " + "the rule to start with other characters. Rule ignored."); + goto done; + } + } + + r = 0; +done: + fsetpos(repo, &fpos); + return r; +} + +/** + * Read a rule (sample) from repository (sequentially). + * + * Reads a sample starting with the current file position and + * creates a new ln_samp object out of it, which it adds to the + * pdag. + * + * @param[in] ctx current library context + * @param[in] repo repository descriptor if file input is desired + * @param[in/out] ptr to ptr of input buffer; this is used if a string is + * provided instead of a file. If so, this pointer is advanced + * as data is consumed. + * @param[out] isEof must be set to 0 on entry and is switched to 1 if EOF occured. + * @return standard error code + */ +static int +ln_sampRead(ln_ctx ctx, FILE *const __restrict__ repo, const char **inpbuf, + int *const __restrict__ isEof) +{ + int r = 0; + char buf[64*1024]; /**< max size of rule - TODO: make configurable */ + + size_t i = 0; + int inParser = 0; + int done = 0; + while(!done) { + const int c = ln_sampReadChar(ctx, repo, inpbuf); + if(c == EOF) { + *isEof = 1; + if(i == 0) + goto done; + else + done = 1; /* last line missing LF, still process it! */ + } else if(c == '\n') { + ++ctx->conf_ln_nbr; + if(inParser) { + if(ln_sampChkRunawayRule(ctx, repo, inpbuf)) { + /* ignore previous rule */ + inParser = 0; + i = 0; + } + } + if(!inParser && i != 0) + done = 1; + } else if(c == '#' && i == 0) { + ln_sampSkipCommentLine(ctx, repo, inpbuf); + i = 0; /* back to beginning */ + } else { + if(c == '%') + inParser = (inParser) ? 0 : 1; + buf[i++] = c; + if(i >= sizeof(buf)) { + ln_errprintf(ctx, 0, "line is too long"); + goto done; + } + } + } + buf[i] = '\0'; + + ln_dbgprintf(ctx, "read rulebase line[~%d]: '%s'", ctx->conf_ln_nbr, buf); + CHKR(ln_processSamp(ctx, buf, i)); + +done: + return r; +} + +/* check rulebase format version. Returns 2 if this is v2 rulebase, + * 1 for any pre-v2 and -1 if there was a problem reading the file. + */ +static int +checkVersion(FILE *const fp) +{ + char buf[64]; + + if(fgets(buf, sizeof(buf), fp) == NULL) + return -1; + if(!strcmp(buf, "version=2\n")) { + return 2; + } else { + return 1; + } +} + +/* we have a v1 rulebase, so let's do all stuff that we need + * to make that ole piece of ... work. 
+ */ +static int +doOldCruft(ln_ctx ctx, const char *file) +{ + int r = -1; + if((ctx->ptree = ln_newPTree(ctx, NULL)) == NULL) { + free(ctx); + r = -1; + goto done; + } + r = ln_v1_loadSamples(ctx, file); +done: + return r; +} + +/* try to open a rulebase file. This also tries to see if we need to + * load it from some pre-configured alternative location. + * @returns open file pointer or NULL in case of error + */ +static FILE * +tryOpenRBFile(ln_ctx ctx, const char *const file) +{ + FILE *repo = NULL; + + if((repo = fopen(file, "r")) != NULL) + goto done; + const int eno1 = errno; + + const char *const rb_lib = getenv("LIBLOGNORM_RULEBASES"); + if(rb_lib == NULL || *file == '/') { + ln_errprintf(ctx, eno1, "cannot open rulebase '%s'", file); + goto done; + } + + char *fname = NULL; + int len; + len = asprintf(&fname, (rb_lib[strlen(rb_lib)-1] == '/') ? "%s%s" : "%s/%s", rb_lib, file); + if(len == -1) { + ln_errprintf(ctx, errno, "alloc error: cannot open rulebase '%s'", file); + goto done; + } + if((repo = fopen(fname, "r")) == NULL) { + const int eno2 = errno; + ln_errprintf(ctx, eno1, "cannot open rulebase '%s'", file); + ln_errprintf(ctx, eno2, "also tried to locate %s via " + "rulebase directory without success. Expanded " + "name was '%s'", file, fname); + } + free(fname); + +done: + return repo; +} + +/* @return 0 if all is ok, 1 if an error occured */ +int +ln_sampLoad(ln_ctx ctx, const char *file) +{ + int r = 1; + FILE *repo; + int isEof = 0; + + ln_dbgprintf(ctx, "loading rulebase file '%s'", file); + if(file == NULL) goto done; + if((repo = tryOpenRBFile(ctx, file)) == NULL) + goto done; + const int version = checkVersion(repo); + ln_dbgprintf(ctx, "rulebase version is %d\n", version); + if(version == -1) { + ln_errprintf(ctx, errno, "error determing version of %s", file); + goto done; + } + if(ctx->version != 0 && version != ctx->version) { + ln_errprintf(ctx, errno, "rulebase '%s' must be version %d, but is version %d " + " - can not be processed", file, ctx->version, version); + goto done; + } + ctx->version = version; + if(ctx->version == 1) { + fclose(repo); + r = doOldCruft(ctx, file); + goto done; + } + + /* now we are in our native code */ + ++ctx->conf_ln_nbr; /* "version=2" is line 1! */ + while(!isEof) { + CHKR(ln_sampRead(ctx, repo, NULL, &isEof)); + } + fclose(repo); + r = 0; + + if(ctx->include_level == 1) + ln_pdagOptimize(ctx); +done: + return r; +} + +/* @return 0 if all is ok, 1 if an error occured */ +int +ln_sampLoadFromString(ln_ctx ctx, const char *string) +{ + int r = 1; + int isEof = 0; + + if(string == NULL) + goto done; + + ln_dbgprintf(ctx, "loading v2 rulebase from string '%s'", string); + ctx->version = 2; + while(!isEof) { + CHKR(ln_sampRead(ctx, NULL, &string, &isEof)); + } + r = 0; + + if(ctx->include_level == 1) + ln_pdagOptimize(ctx); +done: + return r; +} diff --git a/src/samp.h b/src/samp.h new file mode 100644 index 0000000..695c32a --- /dev/null +++ b/src/samp.h @@ -0,0 +1,52 @@ +/** + * @file samples.h + * @brief Object to process log samples. + * @author Rainer Gerhards + * + * This object handles log samples, and in actual log sample files. + * It co-operates with the ptree object to build the actual parser tree. + *//* + * + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2015 by Rainer Gerhards and Adiscon GmbH. + * + * This file is part of liblognorm. 
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this library; if not, write to the Free Software
+ * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ *
+ * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution.
+ */
+#ifndef LIBLOGNORM_SAMPLES_H_INCLUDED
+#define LIBLOGNORM_SAMPLES_H_INCLUDED
+#include /* we need es_size_t */
+#include
+
+
+/**
+ * A single log sample.
+ */
+struct ln_samp {
+ es_str_t *msg;
+};
+
+void ln_sampFree(ln_ctx ctx, struct ln_samp *samp);
+int ln_sampLoad(ln_ctx ctx, const char *file);
+int ln_sampLoadFromString(ln_ctx ctx, const char *string);
+
+/* dual-use functions for v1 engine */
+void ln_sampSkipCommentLine(ln_ctx ctx, FILE * const __restrict__ repo, const char **inpbuf);
+int ln_sampChkRunawayRule(ln_ctx ctx, FILE *const __restrict__ repo, const char **inpbuf);
+
+#endif /* #ifndef LIBLOGNORM_SAMPLES_H_INCLUDED */
diff --git a/src/v1_liblognorm.c b/src/v1_liblognorm.c
new file mode 100644
index 0000000..1c7bb7d
--- /dev/null
+++ b/src/v1_liblognorm.c
@@ -0,0 +1,106 @@
+/* This file implements the liblognorm API.
+ * See header file for descriptions.
+ *
+ * liblognorm - a fast samples-based log normalization library
+ * Copyright 2013-2015 by Rainer Gerhards and Adiscon GmbH.
+ *
+ * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013
+ *
+ * This file is part of liblognorm.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this library; if not, write to the Free Software
+ * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ *
+ * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution.
+ */
+#include "config.h"
+#include
+#include
+
+#include "liblognorm.h"
+#include "v1_liblognorm.h"
+#include "v1_ptree.h"
+#include "lognorm.h"
+#include "annot.h"
+#include "v1_samp.h"
+
+#define ERR_ABORT {r = 1; goto done; }
+
+#define CHECK_CTX \
+ if(ctx->objID != LN_ObjID_CTX) { \
+ r = -1; \
+ goto done; \
+ }
+
+
+ln_ctx
+ln_v1_inherittedCtx(ln_ctx parent)
+{
+ ln_ctx child = ln_initCtx();
+ if (child != NULL) {
+ child->opts = parent->opts;
+ child->dbgCB = parent->dbgCB;
+ child->dbgCookie = parent->dbgCookie;
+ child->version = parent->version;
+ child->ptree = ln_newPTree(child, NULL);
+ }
+
+ return child;
+}
+
+
+int
+ln_v1_loadSample(ln_ctx ctx, const char *buf)
+{
+ // Something bad happened - no new sample
+ if (ln_v1_processSamp(ctx, buf, strlen(buf)) == NULL) {
+ return 1;
+ }
+ return 0;
+}
+
+
+int
+ln_v1_loadSamples(ln_ctx ctx, const char *file)
+{
+ int r = 0;
+ FILE *repo;
+ struct ln_v1_samp *samp;
+ int isEof = 0;
+
+ char *fn_to_free = NULL;
+ CHECK_CTX;
+
+ ctx->conf_file = fn_to_free = strdup(file);
+ ctx->conf_ln_nbr = 0;
+
+ if(file == NULL) ERR_ABORT;
+ if((repo = fopen(file, "r")) == NULL) {
+ ln_errprintf(ctx, errno, "cannot open file %s", file);
+ ERR_ABORT;
+ }
+ while(!isEof) {
+ if((samp = ln_v1_sampRead(ctx, repo, &isEof)) == NULL) {
+ /* TODO: what exactly to do? */
+ }
+ }
+ fclose(repo);
+
+ ctx->conf_file = NULL;
+
+done:
+ free((void*)fn_to_free);
+ return r;
+}
+
diff --git a/src/v1_liblognorm.h b/src/v1_liblognorm.h
new file mode 100644
index 0000000..81cb998
--- /dev/null
+++ b/src/v1_liblognorm.h
@@ -0,0 +1,149 @@
+/**
+ * @file liblognorm.h
+ * @brief The public liblognorm API.
+ *
+ * Functions other than those defined here MUST not be called by
+ * a liblognorm "user" application.
+ *
+ * This file is meant to be included by applications using liblognorm.
+ * For lognorm library files themselves, include "lognorm.h".
+ *//**
+ * @mainpage
+ * Liblognorm is an easy to use and fast samples-based log normalization
+ * library.
+ *
+ * It can be passed a stream of arbitrary log messages, one at a time, and for
+ * each message it will output well-defined name-value pairs and a set of
+ * tags describing the message.
+ *
+ * For further details, see its initial announcement available at
+ * http://blog.gerhards.net/2010/10/introducing-liblognorm.html
+ *
+ * The public interface of this library is described in liblognorm.h.
+ *
+ * Liblognorm fully supports Unicode. Like most Linux tools, it operates
+ * on UTF-8 natively, called "passive mode". This was decided because it
+ * lets us keep the size of data structures small while still supporting
+ * all of the world's languages (actually more than when we used UCS-2).
+ *
+ * At the technical level, we can handle UTF-8 multibyte sequences transparently.
+ * Liblognorm needs to look at a few US-ASCII characters to do the
+ * sample base parsing (things to indicate fields), so this is no
+ * issue. Inside the parse tree, a multibyte sequence can simply be processed
+ * as if it were a sequence of different characters that each make up their
+ * own symbol. In fact, this even allows for somewhat greater parsing
+ * speed.
+ *//*
+ *
+ * liblognorm - a fast samples-based log normalization library
+ * Copyright 2010-2013 by Rainer Gerhards and Adiscon GmbH.
+ *
+ * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013
+ *
+ * This file is part of liblognorm.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this library; if not, write to the Free Software
+ * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ *
+ * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution.
+ */
+#ifndef V1_LIBLOGNORM_H_INCLUDED
+#define V1_LIBLOGNORM_H_INCLUDED
+#include "liblognorm.h"
+
+/**
+ * Inherit control attributes from a library context.
+ *
+ * This does not copy the parse-tree, but does copy
+ * behaviour-controlling attributes such as enableRegex.
+ *
+ * Just as with ln_initCtx, ln_exitCtx() must be called on a library
+ * context that is no longer needed.
+ *
+ * @return new library context or NULL if an error occurred
+ */
+ln_ctx ln_v1_inherittedCtx(ln_ctx parent);
+
+
+/**
+ * Reads a sample stored in buffer buf and creates a new ln_samp object
+ * out of it.
+ *
+ * @note
+ * It is the caller's responsibility to delete the newly
+ * created ln_samp object if it is no longer needed.
+ *
+ * @param[in] ctx current library context
+ * @param[in] buf NULL-terminated cstr containing the contents of the sample
+ * @return Returns zero on success, something else otherwise.
+ */
+int
+ln_v1_loadSample(ln_ctx ctx, const char *buf);
+
+/**
+ * Load a (log) sample file.
+ *
+ * The file must contain log samples in syntactically correct format. Samples are added
+ * to the set already loaded in the current context. If there is a sample with duplicate
+ * semantics, this sample will be ignored. Most importantly, this can \b not be used
+ * to change tag assignments for a given sample.
+ *
+ * @param[in] ctx The library context to add the samples to.
+ * @param[in] file Name of file to be loaded.
+ *
+ * @return Returns zero on success, something else otherwise.
+ */
+int ln_v1_loadSamples(ln_ctx ctx, const char *file);
+
+/**
+ * Normalize a message.
+ *
+ * This is the main library entry point. It is called with a message
+ * to normalize and will return a normalized in-memory representation
+ * of it.
+ *
+ * If an error occurs, the function returns -1. In that case, an
+ * in-memory event representation has been generated if event is
+ * non-NULL. If so, the event contains further error details in
+ * normalized form.
+ *
+ * @note
+ * This function works on byte-counted strings and as such is able to
+ * process NUL bytes if they occur inside the message. On the other hand,
+ * this means that the correct message size, \b excluding the NUL byte,
+ * must be provided.
+ *
+ * @param[in] ctx The library context to use.
+ * @param[in] str The message string (see note above).
+ * @param[in] strLen The length of the message in bytes.
+ * @param[out] json_p A new event record or NULL if an error occurred. Must be
+ * destructed if no longer needed.
+ *
+ * @return Returns zero on success, something else otherwise.
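+ *
+ * Minimal usage sketch (illustrative only; context setup and error
+ * handling omitted):
+ *    struct json_object *json = NULL;
+ *    if(ln_v1_normalize(ctx, msg, strlen(msg), &json) == 0) {
+ *        ... work with the normalized event ...
+ *        json_object_put(json);
+ *    }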
+ */ +int ln_v1_normalize(ln_ctx ctx, const char *str, size_t strLen, struct json_object **json_p); + + +/** + * create a single sample. + */ +struct ln_v1_samp* ln_v1_sampCreate(ln_ctx __attribute__((unused)) ctx); + +/* here we add some stuff from the compatibility layer. A separate include + * would be cleaner, but would potentially require changes all over the + * place. So doing it here is better. The respective replacement + * functions should usually be found under ./compat -- rgerhards, 2015-05-20 + */ + +#endif /* #ifndef LOGNORM_H_INCLUDED */ diff --git a/src/v1_parser.c b/src/v1_parser.c new file mode 100644 index 0000000..323ada0 --- /dev/null +++ b/src/v1_parser.c @@ -0,0 +1,3192 @@ +/* + * liblognorm - a fast samples-based log normalization library + * Copyright 2010-2018 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include +#include +#include +#include +#include + +#include "v1_liblognorm.h" +#include "internal.h" +#include "lognorm.h" +#include "v1_parser.h" +#include "v1_samp.h" + +#ifdef FEATURE_REGEXP +#include +#include +#endif + + +/* some helpers */ +static inline int +hParseInt(const unsigned char **buf, size_t *lenBuf) +{ + const unsigned char *p = *buf; + size_t len = *lenBuf; + int i = 0; + + while(len > 0 && isdigit(*p)) { + i = i * 10 + *p - '0'; + ++p; + --len; + } + + *buf = p; + *lenBuf = len; + return i; +} + +/* parsers for the primitive types + * + * All parsers receive + * + * @param[in] str the to-be-parsed string + * @param[in] strLen length of the to-be-parsed string + * @param[in] offs an offset into the string + * @param[in] node fieldlist with additional data; for simple + * parsers, this sets variable "ed", which just is + * string data. + * @param[out] parsed bytes + * @param[out] value ptr to json object containing parsed data + * (can be unused, but if used *value MUST be NULL on entry) + * + * They will try to parse out "their" object from the string. If they + * succeed, they: + * + * return 0 on success and LN_WRONGPARSER if this parser could + * not successfully parse (but all went well otherwise) and something + * else in case of an error. 
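+ *
+ * For orientation, a parser written with the PARSER()/done pattern below
+ * expands to roughly this shape (simplified sketch with a placeholder name;
+ * const qualifiers and unused-attributes omitted):
+ *
+ *   int ln_parseSomeType(const char *str, size_t strLen, size_t *offs,
+ *                        const ln_fieldList_t *node, size_t *parsed,
+ *                        struct json_object **value)
+ *   {
+ *       int r = LN_WRONGPARSER;
+ *       *parsed = 0;
+ *       ... try to parse; on a match set *parsed and r = 0 ...
+ *   done:
+ *       return r;
+ *   }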
+ */ +#define PARSER(ParserName) \ +int ln_parse##ParserName(const char *const str, const size_t strLen, \ + size_t *const offs, \ + __attribute__((unused)) const ln_fieldList_t *node, \ + size_t *parsed, \ + __attribute__((unused)) struct json_object **value) \ +{ \ + int r = LN_WRONGPARSER; \ + __attribute__((unused)) es_str_t *ed = node->data; \ + *parsed = 0; + +#define FAILParser \ + goto parserdone; /* suppress warnings */ \ +parserdone: \ + r = 0; \ + goto done; /* suppress warnings */ \ +done: + +#define ENDFailParser \ + return r; \ +} + + +/** + * Utilities to allow constructors of complex parser's to + * easily process field-declaration arguments. + */ +#define FIELD_ARG_SEPERATOR ":" +#define MAX_FIELD_ARGS 10 + +struct pcons_args_s { + int argc; + char *argv[MAX_FIELD_ARGS]; +}; + +typedef struct pcons_args_s pcons_args_t; + +static void free_pcons_args(pcons_args_t** dat_p) { + pcons_args_t *dat = *dat_p; + *dat_p = NULL; + if (! dat) { + return; + } + while((--(dat->argc)) >= 0) { + if (dat->argv[dat->argc] != NULL) free(dat->argv[dat->argc]); + } + free(dat); +} + +static pcons_args_t* pcons_args(es_str_t *args, int expected_argc) { + pcons_args_t *dat = NULL; + char* orig_str = NULL; + if ((dat = malloc(sizeof(pcons_args_t))) == NULL) goto fail; + dat->argc = 0; + if (args != NULL) { + orig_str = es_str2cstr(args, NULL); + char *str = orig_str; + while (dat->argc < MAX_FIELD_ARGS) { + int i = dat->argc++; + char *next = (dat->argc == expected_argc) ? NULL : strstr(str, FIELD_ARG_SEPERATOR); + if (next == NULL) { + if ((dat->argv[i] = strdup(str)) == NULL) goto fail; + break; + } else { + if ((dat->argv[i] = strndup(str, next - str)) == NULL) goto fail; + next++; + } + str = next; + } + } + goto done; +fail: + if (dat != NULL) free_pcons_args(&dat); +done: + if (orig_str != NULL) free(orig_str); + return dat; +} + +static const char* pcons_arg(pcons_args_t *dat, int i, const char* dflt_val) { + if (i >= dat->argc) return dflt_val; + return dat->argv[i]; +} + +static char* pcons_arg_copy(pcons_args_t *dat, int i, const char* dflt_val) { + const char *str = pcons_arg(dat, i, dflt_val); + return (str == NULL) ? NULL : strdup(str); +} + +static void pcons_unescape_arg(pcons_args_t *dat, int i) { + char *arg = (char*) pcons_arg(dat, i, NULL); + es_str_t *str = NULL; + if (arg != NULL) { + str = es_newStrFromCStr(arg, strlen(arg)); + if (str != NULL) { + es_unescapeStr(str); + free(arg); + dat->argv[i] = es_str2cstr(str, NULL); + es_deleteStr(str); + } + } +} + +/** + * Parse a TIMESTAMP as specified in RFC5424 (subset of RFC3339). + */ +PARSER(RFC5424Date) + const unsigned char *pszTS; + /* variables to temporarily hold time information while we parse */ + __attribute__((unused)) int year; + int month; + int day; + int hour; /* 24 hour clock */ + int minute; + int second; + __attribute__((unused)) int secfrac; /* fractional seconds (must be 32 bit!) */ + __attribute__((unused)) int secfracPrecision; + int OffsetHour; /* UTC offset in hours */ + int OffsetMinute; /* UTC offset in minutes */ + size_t len; + size_t orglen; + /* end variables to temporarily hold time information while we parse */ + + pszTS = (unsigned char*) str + *offs; + len = orglen = strLen - *offs; + + year = hParseInt(&pszTS, &len); + + /* We take the liberty to accept slightly malformed timestamps e.g. in + * the format of 2003-9-1T1:0:0. 
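+	 * Fully-formed input as per RFC 5424 would look like, for example,
+	 *   2003-09-01T01:00:00.123+02:00   or   2003-09-01T01:00:00Z
+	 * both of which this parser accepts as well.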
*/ + if(len == 0 || *pszTS++ != '-') goto done; + --len; + month = hParseInt(&pszTS, &len); + if(month < 1 || month > 12) goto done; + + if(len == 0 || *pszTS++ != '-') + goto done; + --len; + day = hParseInt(&pszTS, &len); + if(day < 1 || day > 31) goto done; + + if(len == 0 || *pszTS++ != 'T') goto done; + --len; + + hour = hParseInt(&pszTS, &len); + if(hour < 0 || hour > 23) goto done; + + if(len == 0 || *pszTS++ != ':') + goto done; + --len; + minute = hParseInt(&pszTS, &len); + if(minute < 0 || minute > 59) goto done; + + if(len == 0 || *pszTS++ != ':') goto done; + --len; + second = hParseInt(&pszTS, &len); + if(second < 0 || second > 60) goto done; + + /* Now let's see if we have secfrac */ + if(len > 0 && *pszTS == '.') { + --len; + const unsigned char *pszStart = ++pszTS; + secfrac = hParseInt(&pszTS, &len); + secfracPrecision = (int) (pszTS - pszStart); + } else { + secfracPrecision = 0; + secfrac = 0; + } + + /* check the timezone */ + if(len == 0) goto done; + + if(*pszTS == 'Z') { + --len; + pszTS++; /* eat Z */ + } else if((*pszTS == '+') || (*pszTS == '-')) { + --len; + pszTS++; + + OffsetHour = hParseInt(&pszTS, &len); + if(OffsetHour < 0 || OffsetHour > 23) + goto done; + + if(len == 0 || *pszTS++ != ':') + goto done; + --len; + OffsetMinute = hParseInt(&pszTS, &len); + if(OffsetMinute < 0 || OffsetMinute > 59) + goto done; + } else { + /* there MUST be TZ information */ + goto done; + } + + if(len > 0) { + if(*pszTS != ' ') /* if it is not a space, it can not be a "good" time */ + goto done; + } + + /* we had success, so update parse pointer */ + *parsed = orglen - len; + + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse a RFC3164 Date. + */ +PARSER(RFC3164Date) + const unsigned char *p; + size_t len, orglen; + /* variables to temporarily hold time information while we parse */ + __attribute__((unused)) int month; + int day; +#if 0 /* TODO: why does this still exist? */ + int year = 0; /* 0 means no year provided */ +#endif + int hour; /* 24 hour clock */ + int minute; + int second; + + p = (unsigned char*) str + *offs; + orglen = len = strLen - *offs; + /* If we look at the month (Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec), + * we may see the following character sequences occur: + * + * J(an/u(n/l)), Feb, Ma(r/y), A(pr/ug), Sep, Oct, Nov, Dec + * + * We will use this for parsing, as it probably is the + * fastest way to parse it. 
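+	 * A typical timestamp handled here is therefore of the form
+	 *   Oct 11 22:14:15
+	 * with a Cisco-style variant carrying a year, e.g. Oct 11 2018 22:14:15,
+	 * also being accepted (see the hour/year handling further below).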
+ */ + if(len < 3) + goto done; + + switch(*p++) + { + case 'j': + case 'J': + if(*p == 'a' || *p == 'A') { + ++p; + if(*p == 'n' || *p == 'N') { + ++p; + month = 1; + } else + goto done; + } else if(*p == 'u' || *p == 'U') { + ++p; + if(*p == 'n' || *p == 'N') { + ++p; + month = 6; + } else if(*p == 'l' || *p == 'L') { + ++p; + month = 7; + } else + goto done; + } else + goto done; + break; + case 'f': + case 'F': + if(*p == 'e' || *p == 'E') { + ++p; + if(*p == 'b' || *p == 'B') { + ++p; + month = 2; + } else + goto done; + } else + goto done; + break; + case 'm': + case 'M': + if(*p == 'a' || *p == 'A') { + ++p; + if(*p == 'r' || *p == 'R') { + ++p; + month = 3; + } else if(*p == 'y' || *p == 'Y') { + ++p; + month = 5; + } else + goto done; + } else + goto done; + break; + case 'a': + case 'A': + if(*p == 'p' || *p == 'P') { + ++p; + if(*p == 'r' || *p == 'R') { + ++p; + month = 4; + } else + goto done; + } else if(*p == 'u' || *p == 'U') { + ++p; + if(*p == 'g' || *p == 'G') { + ++p; + month = 8; + } else + goto done; + } else + goto done; + break; + case 's': + case 'S': + if(*p == 'e' || *p == 'E') { + ++p; + if(*p == 'p' || *p == 'P') { + ++p; + month = 9; + } else + goto done; + } else + goto done; + break; + case 'o': + case 'O': + if(*p == 'c' || *p == 'C') { + ++p; + if(*p == 't' || *p == 'T') { + ++p; + month = 10; + } else + goto done; + } else + goto done; + break; + case 'n': + case 'N': + if(*p == 'o' || *p == 'O') { + + ++p; + if(*p == 'v' || *p == 'V') { + ++p; + month = 11; + } else + goto done; + } else + goto done; + break; + case 'd': + case 'D': + if(*p == 'e' || *p == 'E') { + ++p; + if(*p == 'c' || *p == 'C') { + ++p; + month = 12; + } else + goto done; + } else + goto done; + break; + default: + goto done; + } + + len -= 3; + + /* done month */ + + if(len == 0 || *p++ != ' ') + goto done; + --len; + + /* we accept a slightly malformed timestamp with one-digit days. */ + if(*p == ' ') { + --len; + ++p; + } + + day = hParseInt(&p, &len); + if(day < 1 || day > 31) + goto done; + + if(len == 0 || *p++ != ' ') + goto done; + --len; + + /* time part */ + hour = hParseInt(&p, &len); + if(hour > 1970 && hour < 2100) { + /* if so, we assume this actually is a year. This is a format found + * e.g. in Cisco devices. + * + year = hour; + */ + + /* re-query the hour, this time it must be valid */ + if(len == 0 || *p++ != ' ') + goto done; + --len; + hour = hParseInt(&p, &len); + } + + if(hour < 0 || hour > 23) + goto done; + + if(len == 0 || *p++ != ':') + goto done; + --len; + minute = hParseInt(&p, &len); + if(minute < 0 || minute > 59) + goto done; + + if(len == 0 || *p++ != ':') + goto done; + --len; + second = hParseInt(&p, &len); + if(second < 0 || second > 60) + goto done; + + /* we provide support for an extra ":" after the date. While this is an + * invalid format, it occurs frequently enough (e.g. with Cisco devices) + * to permit it as a valid case. -- rgerhards, 2008-09-12 + */ + if(len > 0 && *p == ':') { + ++p; /* just skip past it */ + --len; + } + + /* we had success, so update parse pointer */ + *parsed = orglen - len; + + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse a Number. + * Note that a number is an abstracted concept. We always represent it + * as 64 bits (but may later change our mind if performance dictates so). 
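+ *
+ * Illustration (values chosen for the example; node must be a valid field
+ * descriptor, value may stay NULL since this parser produces no value):
+ *
+ *   size_t offs = 0, parsed = 0;
+ *   if(ln_parseNumber("4711 rest", 9, &offs, node, &parsed, NULL) == 0)
+ *       ... parsed is now 4, covering the digits "4711" ...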
+ */ +PARSER(Number) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + + for (i = *offs; i < strLen && isdigit(c[i]); i++); + if (i == *offs) + goto done; + + /* success, persist */ + *parsed = i - *offs; + + r = 0; /* success */ +done: + return r; +} + +/** + * Parse a Real-number in floating-pt form. + */ +PARSER(Float) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + + int seen_point = 0; + + i = *offs; + + if (c[i] == '-') i++; + + for (; i < strLen; i++) { + if (c[i] == '.') { + if (seen_point != 0) break; + seen_point = 1; + } else if (! isdigit(c[i])) { + break; + } + } + if (i == *offs) + goto done; + + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse a hex Number. + * A hex number begins with 0x and contains only hex digits until the terminating + * whitespace. Note that if a non-hex character is deteced inside the number string, + * this is NOT considered to be a number. + */ +PARSER(HexNumber) + const char *c; + size_t i = *offs; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + + if(c[i] != '0' || c[i+1] != 'x') + goto done; + + for (i += 2 ; i < strLen && isxdigit(c[i]); i++); + if (i == *offs || !isspace(c[i])) + goto done; + + /* success, persist */ + *parsed = i - *offs; + + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse a kernel timestamp. + * This is a fixed format, see + * https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux.git/tree/kernel/printk/printk.c?id=refs/tags/v4.0#n1011 + * This is the code that generates it: + * sprintf(buf, "[%5lu.%06lu] ", (unsigned long)ts, rem_nsec / 1000); + * We accept up to 12 digits for ts, everything above that for sure is + * no timestamp. + */ +#define LEN_KERNEL_TIMESTAMP 14 +PARSER(KernelTimestamp) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + + i = *offs; + if(c[i] != '[' || i+LEN_KERNEL_TIMESTAMP > strLen + || !isdigit(c[i+1]) + || !isdigit(c[i+2]) + || !isdigit(c[i+3]) + || !isdigit(c[i+4]) + || !isdigit(c[i+5]) + ) + goto done; + i += 6; + for(int j = 0 ; j < 7 && i < strLen && isdigit(c[i]) ; ) + ++i, ++j; /* just scan */ + + if(i >= strLen || c[i] != '.') + goto done; + + ++i; /* skip over '.' */ + + if( i+7 > strLen + || !isdigit(c[i+0]) + || !isdigit(c[i+1]) + || !isdigit(c[i+2]) + || !isdigit(c[i+3]) + || !isdigit(c[i+4]) + || !isdigit(c[i+5]) + || c[i+6] != ']' + ) + goto done; + i += 7; + + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ +done: + return r; +} + +/** + * Parse whitespace. + * This parses all whitespace until the first non-whitespace character + * is found. This is primarily a tool to skip to the next "word" if + * the exact number of whitspace characters (and type of whitespace) + * is not known. The current parsing position MUST be on a whitspace, + * else the parser does not match. + * This parser is also a forward-compatibility tool for the upcoming + * slsa (simple log structure analyser) tool. + */ +PARSER(Whitespace) + const char *c; + size_t i = *offs; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + + if(!isspace(c[i])) + goto done; + + for (i++ ; i < strLen && isspace(c[i]); i++); + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse a word. 
+ * A word is a SP-delimited entity. The parser always works, except if + * the offset is position on a space upon entry. + */ +PARSER(Word) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + /* search end of word */ + while(i < strLen && c[i] != ' ') + i++; + + if(i == *offs) + goto done; + + /* success, persist */ + *parsed = i - *offs; + + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse everything up to a specific string. + * swisskid, 2015-01-21 + */ +PARSER(StringTo) + const char *c; + char *toFind = NULL; + size_t i, j, k, m; + int chkstr; + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + assert(ed != NULL); + k = es_strlen(ed) - 1; + toFind = es_str2cstr(ed, NULL); + c = str; + i = *offs; + chkstr = 0; + + /* Total hunt for letter */ + while(chkstr == 0 && i < strLen ) { + i++; + if(c[i] == toFind[0]) { + /* Found the first letter, now find the rest of the string */ + j = 0; + m = i; + while(m < strLen && j < k ) { + m++; + j++; + if(c[m] != toFind[j]) + break; + if (j == k) + chkstr = 1; + } + } + } + if(i == *offs || i == strLen || c[i] != toFind[0]) + goto done; + + /* success, persist */ + *parsed = i - *offs; + + r = 0; /* success */ +done: + if(toFind != NULL) free(toFind); + return r; +} + +/** + * Parse a alphabetic word. + * A alpha word is composed of characters for which isalpha returns true. + * The parser dones if there is no alpha character at all. + */ +PARSER(Alpha) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + /* search end of word */ + while(i < strLen && isalpha(c[i])) + i++; + + if(i == *offs) { + goto done; + } + + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse everything up to a specific character. + * The character must be the only char inside extra data passed to the parser. + * It is a program error if strlen(ed) != 1. It is considered a format error if + * a) the to-be-parsed buffer is already positioned on the terminator character + * b) there is no terminator until the end of the buffer + * In those cases, the parsers declares itself as not being successful, in all + * other cases a string is extracted. + */ +PARSER(CharTo) + const char *c; + unsigned char cTerm; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + assert(es_strlen(ed) == 1); + cTerm = *(es_getBufAddr(ed)); + c = str; + i = *offs; + + /* search end of word */ + while(i < strLen && c[i] != cTerm) + i++; + + if(i == *offs || i == strLen || c[i] != cTerm) + goto done; + + /* success, persist */ + *parsed = i - *offs; + + r = 0; /* success */ +done: + return r; +} + + +/** + * Parse everything up to a specific character, or up to the end of string. + * The character must be the only char inside extra data passed to the parser. + * It is a program error if strlen(ed) != 1. + * This parser always returns success. + * By nature of the parser, it is required that end of string or the separator + * follows this field in rule. 
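+ *
+ * In v1 rulebase syntax this is typically written as, for example,
+ *   %field:char-sep:,%
+ * which consumes everything up to (but not including) the next ',' or,
+ * failing that, the rest of the line.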
+ */ +PARSER(CharSeparated) + const char *c; + unsigned char cTerm; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + assert(es_strlen(ed) == 1); + cTerm = *(es_getBufAddr(ed)); + c = str; + i = *offs; + + /* search end of word */ + while(i < strLen && c[i] != cTerm) + i++; + + /* success, persist */ + *parsed = i - *offs; + + r = 0; /* success */ + return r; +} + +/** + * Parse yet-to-be-matched portion of string by re-applying + * top-level rules again. + */ +#define DEFAULT_REMAINING_FIELD_NAME "tail" + +struct recursive_parser_data_s { + ln_ctx ctx; + char* remaining_field; + int free_ctx; +}; + +PARSER(Recursive) + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + + struct recursive_parser_data_s* pData = (struct recursive_parser_data_s*) node->parser_data; + + if (pData != NULL) { + int remaining_len = strLen - *offs; + const char *remaining_str = str + *offs; + json_object *unparsed = NULL; + CHKN(*value = json_object_new_object()); + + ln_normalize(pData->ctx, remaining_str, remaining_len, value); + + if (json_object_object_get_ex(*value, UNPARSED_DATA_KEY, &unparsed)) { + json_object_put(*value); + *value = NULL; + *parsed = 0; + } else if (pData->remaining_field != NULL + && json_object_object_get_ex(*value, pData->remaining_field, &unparsed)) { + *parsed = strLen - *offs - json_object_get_string_len(unparsed); + json_object_object_del(*value, pData->remaining_field); + } else { + *parsed = strLen - *offs; + } + } + r = 0; /* success */ +done: + return r; +} + +typedef ln_ctx (ctx_constructor)(ln_ctx, pcons_args_t*, const char*); + +static void* +_recursive_parser_data_constructor(ln_fieldList_t *node, + ln_ctx ctx, + int no_of_args, + int remaining_field_arg_idx, + int free_ctx, ctx_constructor *fn) +{ + int r = LN_BADCONFIG; + char* name = NULL; + struct recursive_parser_data_s *pData = NULL; + pcons_args_t *args = NULL; + CHKN(name = es_str2cstr(node->name, NULL)); + CHKN(pData = calloc(1, sizeof(struct recursive_parser_data_s))); + pData->free_ctx = free_ctx; + pData->remaining_field = NULL; + CHKN(args = pcons_args(node->raw_data, no_of_args)); + CHKN(pData->ctx = fn(ctx, args, name)); + CHKN(pData->remaining_field = pcons_arg_copy(args, remaining_field_arg_idx, DEFAULT_REMAINING_FIELD_NAME)); + r = 0; +done: + if (r != 0) { + if (name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for recursive/descent field name"); + else if (pData == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for parser-data for field: %s", name); + else if (args == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for argument-parsing for field: %s", name); + else if (pData->ctx == NULL) + ln_dbgprintf(ctx, "recursive/descent normalizer context creation " + "doneed for field: %s", name); + else if (pData->remaining_field == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for remaining-field name for " + "recursive/descent field: %s", name); + + recursive_parser_data_destructor((void**) &pData); + } + free(name); + free_pcons_args(&args); + return pData; +} + +static ln_ctx identity_recursive_parse_ctx_constructor(ln_ctx parent_ctx, + __attribute__((unused)) pcons_args_t* args, + __attribute__((unused)) const char* field_name) { + return parent_ctx; +} + +void* recursive_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx) { + return _recursive_parser_data_constructor(node, ctx, 1, 0, 0, identity_recursive_parse_ctx_constructor); +} + +static ln_ctx child_recursive_parse_ctx_constructor(ln_ctx parent_ctx, 
pcons_args_t* args, const char* field_name) { + int r = LN_BADCONFIG; + const char* rb = NULL; + ln_ctx ctx = NULL; + pcons_unescape_arg(args, 0); + CHKN(rb = pcons_arg(args, 0, NULL)); + CHKN(ctx = ln_v1_inherittedCtx(parent_ctx)); + CHKR(ln_v1_loadSamples(ctx, rb)); +done: + if (r != 0) { + if (rb == NULL) + ln_dbgprintf(parent_ctx, "file-name for descent rulebase not provided for field: %s", + field_name); + else if (ctx == NULL) + ln_dbgprintf(parent_ctx, "couldn't allocate memory to create descent-field normalizer " + "context for field: %s", field_name); + else + ln_dbgprintf(parent_ctx, "couldn't load samples into descent context for field: %s", + field_name); + if (ctx != NULL) ln_exitCtx(ctx); + ctx = NULL; + } + return ctx; +} + +void* descent_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx) { + return _recursive_parser_data_constructor(node, ctx, 2, 1, 1, child_recursive_parse_ctx_constructor); +} + +void recursive_parser_data_destructor(void** dataPtr) { + if (*dataPtr != NULL) { + struct recursive_parser_data_s *pData = (struct recursive_parser_data_s*) *dataPtr; + if (pData->free_ctx && pData->ctx != NULL) { + ln_exitCtx(pData->ctx); + pData->ctx = NULL; + } + if (pData->remaining_field != NULL) free(pData->remaining_field); + free(pData); + *dataPtr = NULL; + } +}; + +/** + * Parse string tokenized by given char-sequence + * The sequence may appear 0 or more times, but zero times means 1 token. + * NOTE: its not 0 tokens, but 1 token. + * + * The token found is parsed according to the field-type provided after + * tokenizer char-seq. + */ +#define DEFAULT_MATCHED_FIELD_NAME "default" + +struct tokenized_parser_data_s { + es_str_t *tok_str; + ln_ctx ctx; + char *remaining_field; + int use_default_field; + int free_ctx; +}; + +typedef struct tokenized_parser_data_s tokenized_parser_data_t; + +PARSER(Tokenized) + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + + tokenized_parser_data_t *pData = (tokenized_parser_data_t*) node->parser_data; + + if (pData != NULL ) { + json_object *json_p = NULL; + if (pData->use_default_field) CHKN(json_p = json_object_new_object()); + json_object *matches = NULL; + CHKN(matches = json_object_new_array()); + + int remaining_len = strLen - *offs; + const char *remaining_str = str + *offs; + json_object *remaining = NULL; + json_object *match = NULL; + + while (remaining_len > 0) { + if (! pData->use_default_field) { + json_object_put(json_p); + json_p = json_object_new_object(); + } /*TODO: handle null condition gracefully*/ + + ln_normalize(pData->ctx, remaining_str, remaining_len, &json_p); + + if (remaining) json_object_put(remaining); + + if (pData->use_default_field + && json_object_object_get_ex(json_p, DEFAULT_MATCHED_FIELD_NAME, &match)) { + json_object_array_add(matches, json_object_get(match)); + } else if (! 
(pData->use_default_field + || json_object_object_get_ex(json_p, UNPARSED_DATA_KEY, &match))) { + json_object_array_add(matches, json_object_get(json_p)); + } else { + if (json_object_array_length(matches) > 0) { + remaining_len += es_strlen(pData->tok_str); + break; + } else { + json_object_put(json_p); + json_object_put(matches); + FAIL(LN_WRONGPARSER); + } + } + + if (json_object_object_get_ex(json_p, pData->remaining_field, &remaining)) { + remaining_len = json_object_get_string_len(remaining); + if (remaining_len > 0) { + remaining_str = json_object_get_string(json_object_get(remaining)); + json_object_object_del(json_p, pData->remaining_field); + if (es_strbufcmp(pData->tok_str, (const unsigned char *)remaining_str, + es_strlen(pData->tok_str))) { + json_object_put(remaining); + break; + } else { + remaining_str += es_strlen(pData->tok_str); + remaining_len -= es_strlen(pData->tok_str); + } + } + } else { + remaining_len = 0; + break; + } + + if (pData->use_default_field) json_object_object_del(json_p, DEFAULT_MATCHED_FIELD_NAME); + } + json_object_put(json_p); + + /* success, persist */ + *parsed = (strLen - *offs) - remaining_len; + *value = matches; + } else { + FAIL(LN_BADPARSERSTATE); + } + + r = 0; /* success */ +done: + return r; +} + +void tokenized_parser_data_destructor(void** dataPtr) { + tokenized_parser_data_t *data = (tokenized_parser_data_t*) *dataPtr; + if (data->tok_str != NULL) es_deleteStr(data->tok_str); + if (data->free_ctx && (data->ctx != NULL)) ln_exitCtx(data->ctx); + if (data->remaining_field != NULL) free(data->remaining_field); + free(data); + *dataPtr = NULL; +} + +static void load_generated_parser_samples(ln_ctx ctx, + const char* const field_descr, const int field_descr_len, + const char* const suffix, const int length) { + static const char* const RULE_PREFIX = "rule=:%"DEFAULT_MATCHED_FIELD_NAME":";/*TODO: extract nice constants*/ + static const int RULE_PREFIX_LEN = 15; + + char *sample_str = NULL; + es_str_t *field_decl = es_newStrFromCStr(RULE_PREFIX, RULE_PREFIX_LEN); + if (! field_decl) goto free; + + if (es_addBuf(&field_decl, field_descr, field_descr_len) + || es_addBuf(&field_decl, "%", 1) + || es_addBuf(&field_decl, suffix, length)) { + ln_dbgprintf(ctx, "couldn't prepare field for tokenized field-picking: '%s'", field_descr); + goto free; + } + sample_str = es_str2cstr(field_decl, NULL); + if (! 
sample_str) { + ln_dbgprintf(ctx, "couldn't prepare sample-string for: '%s'", field_descr); + goto free; + } + ln_v1_loadSample(ctx, sample_str); +free: + if (sample_str) free(sample_str); + if (field_decl) es_deleteStr(field_decl); +} + +static ln_ctx generate_context_with_field_as_prefix(ln_ctx parent, const char* field_descr, int field_descr_len) { + int r = LN_BADCONFIG; + const char* remaining_field = "%"DEFAULT_REMAINING_FIELD_NAME":rest%"; + ln_ctx ctx = NULL; + CHKN(ctx = ln_v1_inherittedCtx(parent)); + load_generated_parser_samples(ctx, field_descr, field_descr_len, remaining_field, strlen(remaining_field)); + load_generated_parser_samples(ctx, field_descr, field_descr_len, "", 0); + r = 0; +done: + if (r != 0) { + ln_exitCtx(ctx); + ctx = NULL; + } + return ctx; +} + +static ln_fieldList_t* parse_tokenized_content_field(ln_ctx ctx, const char* field_descr, size_t field_descr_len) { + es_str_t* tmp = NULL; + es_str_t* descr = NULL; + ln_fieldList_t *node = NULL; + int r = 0; + CHKN(tmp = es_newStr(80)); + CHKN(descr = es_newStr(80)); + const char* field_prefix = "%" DEFAULT_MATCHED_FIELD_NAME ":"; + CHKR(es_addBuf(&descr, field_prefix, strlen(field_prefix))); + CHKR(es_addBuf(&descr, field_descr, field_descr_len)); + CHKR(es_addChar(&descr, '%')); + es_size_t offset = 0; + CHKN(node = ln_v1_parseFieldDescr(ctx, descr, &offset, &tmp, &r)); + if (offset != es_strlen(descr)) FAIL(LN_BADPARSERSTATE); +done: + if (r != 0) { + if (node != NULL) ln_deletePTreeNode(node); + node = NULL; + } + if (descr != NULL) es_deleteStr(descr); + if (tmp != NULL) es_deleteStr(tmp); + return node; +} + +void* tokenized_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx) { + int r = LN_BADCONFIG; + char* name = es_str2cstr(node->name, NULL); + pcons_args_t *args = NULL; + tokenized_parser_data_t *pData = NULL; + const char *field_descr = NULL; + ln_fieldList_t* field = NULL; + const char *tok = NULL; + + CHKN(args = pcons_args(node->raw_data, 2)); + + CHKN(pData = calloc(1, sizeof(tokenized_parser_data_t))); + pcons_unescape_arg(args, 0); + CHKN(tok = pcons_arg(args, 0, NULL)); + CHKN(pData->tok_str = es_newStrFromCStr(tok, strlen(tok))); + es_unescapeStr(pData->tok_str); + CHKN(field_descr = pcons_arg(args, 1, NULL)); + const int field_descr_len = strlen(field_descr); + pData->free_ctx = 1; + CHKN(field = parse_tokenized_content_field(ctx, field_descr, field_descr_len)); + if (field->parser == ln_parseRecursive) { + pData->use_default_field = 0; + struct recursive_parser_data_s *dat = (struct recursive_parser_data_s*) field->parser_data; + if (dat != NULL) { + CHKN(pData->remaining_field = strdup(dat->remaining_field)); + pData->free_ctx = dat->free_ctx; + pData->ctx = dat->ctx; + dat->free_ctx = 0; + } + } else { + pData->use_default_field = 1; + CHKN(pData->ctx = generate_context_with_field_as_prefix(ctx, field_descr, field_descr_len)); + } + if (pData->remaining_field == NULL) CHKN(pData->remaining_field = strdup(DEFAULT_REMAINING_FIELD_NAME)); + r = 0; +done: + if (r != 0) { + if (name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for tokenized-field name"); + else if (args == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for argument-parsing for field: %s", name); + else if (pData == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for parser-data for field: %s", name); + else if (tok == NULL) + ln_dbgprintf(ctx, "token-separator not provided for field: %s", name); + else if (pData->tok_str == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for token-separator " + "for 
field: %s", name); + else if (field_descr == NULL) + ln_dbgprintf(ctx, "field-type not provided for field: %s", name); + else if (field == NULL) + ln_dbgprintf(ctx, "couldn't resolve single-token field-type for tokenized field: %s", name); + else if (pData->ctx == NULL) + ln_dbgprintf(ctx, "couldn't initialize normalizer-context for field: %s", name); + else if (pData->remaining_field == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for " + "remaining-field-name for field: %s", name); + if (pData) tokenized_parser_data_destructor((void**) &pData); + } + if (name != NULL) free(name); + if (field != NULL) ln_deletePTreeNode(field); + if (args) free_pcons_args(&args); + return pData; +} + +#ifdef FEATURE_REGEXP + +/** + * Parse string matched by provided posix extended regex. + * + * Please note that using regex field in most cases will be + * significantly slower than other field-types. + */ +struct regex_parser_data_s { + pcre *re; + int consume_group; + int return_group; + int max_groups; +}; + +PARSER(Regex) + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + unsigned int* ovector = NULL; + + struct regex_parser_data_s *pData = (struct regex_parser_data_s*) node->parser_data; + if (pData != NULL) { + ovector = calloc(pData->max_groups, sizeof(unsigned int) * 3); + if (ovector == NULL) FAIL(LN_NOMEM); + + int result = pcre_exec(pData->re, NULL, str, strLen, *offs, 0, (int*) ovector, pData->max_groups * 3); + if (result == 0) result = pData->max_groups; + if (result > pData->consume_group) { + /*please check 'man 3 pcreapi' for cryptic '2 * n' and '2 * n + 1' magic*/ + if (ovector[2 * pData->consume_group] == *offs) { + *parsed = ovector[2 * pData->consume_group + 1] - ovector[2 * pData->consume_group]; + if (pData->consume_group != pData->return_group) { + char* val = NULL; + if((val = strndup(str + ovector[2 * pData->return_group], + ovector[2 * pData->return_group + 1] - + ovector[2 * pData->return_group])) == NULL) { + free(ovector); + FAIL(LN_NOMEM); + } + *value = json_object_new_string(val); + free(val); + if (*value == NULL) { + free(ovector); + FAIL(LN_NOMEM); + } + } + } + } + free(ovector); + } + r = 0; /* success */ +done: + return r; +} + +static const char* regex_parser_configure_consume_and_return_group(pcons_args_t* args, +struct regex_parser_data_s *pData) { + const char* consume_group_parse_error = "couldn't parse consume-group number"; + const char* return_group_parse_error = "couldn't parse return-group number"; + + char* tmp = NULL; + + const char* consume_grp_str = NULL; + const char* return_grp_str = NULL; + + if ((consume_grp_str = pcons_arg(args, 1, "0")) == NULL || + strlen(consume_grp_str) == 0) return consume_group_parse_error; + if ((return_grp_str = pcons_arg(args, 2, consume_grp_str)) == NULL || + strlen(return_grp_str) == 0) return return_group_parse_error; + + errno = 0; + pData->consume_group = strtol(consume_grp_str, &tmp, 10); + if (errno != 0 || strlen(tmp) != 0) return consume_group_parse_error; + + pData->return_group = strtol(return_grp_str, &tmp, 10); + if (errno != 0 || strlen(tmp) != 0) return return_group_parse_error; + + return NULL; +} + +void* regex_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx) { + int r = LN_BADCONFIG; + char* exp = NULL; + const char* grp_parse_err = NULL; + pcons_args_t* args = NULL; + char* name = NULL; + struct regex_parser_data_s *pData = NULL; + const char *unescaped_exp = NULL; + const char *error = NULL; + int erroffset = 0; + + + CHKN(name = es_str2cstr(node->name, NULL)); + + 
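+	/* note: '!' binds more tightly than '&', so the test below evaluates
+	 * (!ctx->opts) & LN_CTXOPT_ALLOW_REGEX rather than checking the flag
+	 * directly; the apparent intent is to fail when regex support has not
+	 * been enabled in the context options. */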
if (! ctx->opts & LN_CTXOPT_ALLOW_REGEX) FAIL(LN_BADCONFIG); + CHKN(pData = malloc(sizeof(struct regex_parser_data_s))); + pData->re = NULL; + + CHKN(args = pcons_args(node->raw_data, 3)); + pData->consume_group = pData->return_group = 0; + CHKN(unescaped_exp = pcons_arg(args, 0, NULL)); + pcons_unescape_arg(args, 0); + CHKN(exp = pcons_arg_copy(args, 0, NULL)); + + if ((grp_parse_err = regex_parser_configure_consume_and_return_group(args, pData)) != NULL) + FAIL(LN_BADCONFIG); + + CHKN(pData->re = pcre_compile(exp, 0, &error, &erroffset, NULL)); + + pData->max_groups = ((pData->consume_group > pData->return_group) ? pData->consume_group : + pData->return_group) + 1; + r = 0; +done: + if (r != 0) { + if (name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory regex-field name"); + else if (! ctx->opts & LN_CTXOPT_ALLOW_REGEX) + ln_dbgprintf(ctx, "regex support is not enabled for: '%s' " + "(please check lognorm context initialization)", name); + else if (pData == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for parser-data for field: %s", name); + else if (args == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for argument-parsing for field: %s", name); + else if (unescaped_exp == NULL) + ln_dbgprintf(ctx, "regular-expression missing for field: '%s'", name); + else if (exp == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for regex-string for field: '%s'", name); + else if (grp_parse_err != NULL) + ln_dbgprintf(ctx, "%s for: '%s'", grp_parse_err, name); + else if (pData->re == NULL) + ln_dbgprintf(ctx, "couldn't compile regex(encountered error '%s' at char '%d' in pattern) " + "for regex-matched field: '%s'", error, erroffset, name); + regex_parser_data_destructor((void**)&pData); + } + if (exp != NULL) free(exp); + if (args != NULL) free_pcons_args(&args); + if (name != NULL) free(name); + return pData; +} + +void regex_parser_data_destructor(void** dataPtr) { + if ((*dataPtr) != NULL) { + struct regex_parser_data_s *pData = (struct regex_parser_data_s*) *dataPtr; + if (pData->re != NULL) pcre_free(pData->re); + free(pData); + *dataPtr = NULL; + } +} + +#endif + +/** + * Parse yet-to-be-matched portion of string by re-applying + * top-level rules again. + */ +typedef enum interpret_type { + /* If you change this, be sure to update json_type_to_name() too */ + it_b10int, + it_b16int, + it_floating_pt, + it_boolean +} interpret_type; + +struct interpret_parser_data_s { + ln_ctx ctx; + enum interpret_type intrprt; +}; + +static json_object* interpret_as_int(json_object *value, int base) { + if (json_object_is_type(value, json_type_string)) { + return json_object_new_int64(strtol(json_object_get_string(value), NULL, base)); + } else if (json_object_is_type(value, json_type_int)) { + return value; + } else { + return NULL; + } +} + +static json_object* interpret_as_double(json_object *value) { + double val = json_object_get_double(value); + return json_object_new_double(val); +} + +static json_object* interpret_as_boolean(json_object *value) { + json_bool val; + if (json_object_is_type(value, json_type_string)) { + const char* str = json_object_get_string(value); + val = (strcasecmp(str, "false") == 0 || strcasecmp(str, "no") == 0) ? 
0 : 1; + } else { + val = json_object_get_boolean(value); + } + return json_object_new_boolean(val); +} + +static int reinterpret_value(json_object **value, enum interpret_type to_type) { + switch(to_type) { + case it_b10int: + *value = interpret_as_int(*value, 10); + break; + case it_b16int: + *value = interpret_as_int(*value, 16); + break; + case it_floating_pt: + *value = interpret_as_double(*value); + break; + case it_boolean: + *value = interpret_as_boolean(*value); + break; + default: + return 0; + } + return 1; +} + +PARSER(Interpret) + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + json_object *unparsed = NULL; + json_object *parsed_raw = NULL; + + struct interpret_parser_data_s* pData = (struct interpret_parser_data_s*) node->parser_data; + + if (pData != NULL) { + int remaining_len = strLen - *offs; + const char *remaining_str = str + *offs; + + CHKN(parsed_raw = json_object_new_object()); + + ln_normalize(pData->ctx, remaining_str, remaining_len, &parsed_raw); + + if (json_object_object_get_ex(parsed_raw, UNPARSED_DATA_KEY, NULL)) { + *parsed = 0; + } else { + json_object_object_get_ex(parsed_raw, DEFAULT_MATCHED_FIELD_NAME, value); + json_object_object_get_ex(parsed_raw, DEFAULT_REMAINING_FIELD_NAME, &unparsed); + if (reinterpret_value(value, pData->intrprt)) { + *parsed = strLen - *offs - json_object_get_string_len(unparsed); + } + } + json_object_put(parsed_raw); + } + r = 0; /* success */ +done: + return r; +} + +void* interpret_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx) { + int r = LN_BADCONFIG; + char* name = NULL; + struct interpret_parser_data_s *pData = NULL; + pcons_args_t *args = NULL; + int bad_interpret = 0; + const char* type_str = NULL; + const char *field_type = NULL; + CHKN(name = es_str2cstr(node->name, NULL)); + CHKN(pData = calloc(1, sizeof(struct interpret_parser_data_s))); + CHKN(args = pcons_args(node->raw_data, 2)); + CHKN(type_str = pcons_arg(args, 0, NULL)); + if (strcmp(type_str, "int") == 0 || strcmp(type_str, "base10int") == 0) { + pData->intrprt = it_b10int; + } else if (strcmp(type_str, "base16int") == 0) { + pData->intrprt = it_b16int; + } else if (strcmp(type_str, "float") == 0) { + pData->intrprt = it_floating_pt; + } else if (strcmp(type_str, "bool") == 0) { + pData->intrprt = it_boolean; + } else { + bad_interpret = 1; + FAIL(LN_BADCONFIG); + } + + CHKN(field_type = pcons_arg(args, 1, NULL)); + CHKN(pData->ctx = generate_context_with_field_as_prefix(ctx, field_type, strlen(field_type))); + r = 0; +done: + if (r != 0) { + if (name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for interpret-field name"); + else if (pData == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for parser-data for field: %s", name); + else if (args == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for argument-parsing for field: %s", name); + else if (type_str == NULL) + ln_dbgprintf(ctx, "no type provided for interpretation of field: %s", name); + else if (bad_interpret != 0) + ln_dbgprintf(ctx, "interpretation to unknown type '%s' requested for field: %s", + type_str, name); + else if (field_type == NULL) + ln_dbgprintf(ctx, "field-type to actually match the content not provided for " + "field: %s", name); + else if (pData->ctx == NULL) + ln_dbgprintf(ctx, "couldn't instantiate the normalizer context for matching " + "field: %s", name); + + interpret_parser_data_destructor((void**) &pData); + } + free(name); + free_pcons_args(&args); + return pData; +} + +void interpret_parser_data_destructor(void** dataPtr) 
{ + if (*dataPtr != NULL) { + struct interpret_parser_data_s *pData = (struct interpret_parser_data_s*) *dataPtr; + if (pData->ctx != NULL) ln_exitCtx(pData->ctx); + free(pData); + *dataPtr = NULL; + } +}; + +/** + * Parse suffixed char-sequence, where suffix is one of many possible suffixes. + */ +struct suffixed_parser_data_s { + int nsuffix; + int *suffix_offsets; + int *suffix_lengths; + char* suffixes_str; + ln_ctx ctx; + char* value_field_name; + char* suffix_field_name; +}; + +PARSER(Suffixed) { + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + json_object *unparsed = NULL; + json_object *parsed_raw = NULL; + json_object *parsed_value = NULL; + json_object *result = NULL; + json_object *suffix = NULL; + + struct suffixed_parser_data_s *pData = (struct suffixed_parser_data_s*) node->parser_data; + + if (pData != NULL) { + int remaining_len = strLen - *offs; + const char *remaining_str = str + *offs; + int i; + + CHKN(parsed_raw = json_object_new_object()); + + ln_normalize(pData->ctx, remaining_str, remaining_len, &parsed_raw); + + if (json_object_object_get_ex(parsed_raw, UNPARSED_DATA_KEY, NULL)) { + *parsed = 0; + } else { + json_object_object_get_ex(parsed_raw, DEFAULT_MATCHED_FIELD_NAME, &parsed_value); + json_object_object_get_ex(parsed_raw, DEFAULT_REMAINING_FIELD_NAME, &unparsed); + const char* unparsed_frag = json_object_get_string(unparsed); + for(i = 0; i < pData->nsuffix; i++) { + const char* possible_suffix = pData->suffixes_str + pData->suffix_offsets[i]; + int len = pData->suffix_lengths[i]; + if (strncmp(possible_suffix, unparsed_frag, len) == 0) { + CHKN(result = json_object_new_object()); + CHKN(suffix = json_object_new_string(possible_suffix)); + json_object_get(parsed_value); + json_object_object_add(result, pData->value_field_name, parsed_value); + json_object_object_add(result, pData->suffix_field_name, suffix); + *parsed = strLen - *offs - json_object_get_string_len(unparsed) + len; + break; + } + } + if (result != NULL) { + *value = result; + } + } + } +FAILParser + if (r != 0) { + if (result != NULL) json_object_put(result); + } + if (parsed_raw != NULL) json_object_put(parsed_raw); +} ENDFailParser + +static struct suffixed_parser_data_s* _suffixed_parser_data_constructor(ln_fieldList_t *node, + ln_ctx ctx, + es_str_t* raw_args, + const char* value_field, + const char* suffix_field) { + int r = LN_BADCONFIG; + pcons_args_t* args = NULL; + char* name = NULL; + struct suffixed_parser_data_s *pData = NULL; + const char *escaped_tokenizer = NULL; + const char *uncopied_suffixes_str = NULL; + const char *tokenizer = NULL; + char *suffixes_str = NULL; + const char *field_type = NULL; + + char *tok_saveptr = NULL; + char *tok_input = NULL; + int i = 0; + char *tok = NULL; + + CHKN(name = es_str2cstr(node->name, NULL)); + + CHKN(pData = calloc(1, sizeof(struct suffixed_parser_data_s))); + + if (value_field == NULL) value_field = "value"; + if (suffix_field == NULL) suffix_field = "suffix"; + pData->value_field_name = strdup(value_field); + pData->suffix_field_name = strdup(suffix_field); + + CHKN(args = pcons_args(raw_args, 3)); + CHKN(escaped_tokenizer = pcons_arg(args, 0, NULL)); + pcons_unescape_arg(args, 0); + CHKN(tokenizer = pcons_arg(args, 0, NULL)); + + CHKN(uncopied_suffixes_str = pcons_arg(args, 1, NULL)); + pcons_unescape_arg(args, 1); + CHKN(suffixes_str = pcons_arg_copy(args, 1, NULL)); + + tok_input = suffixes_str; + while (strtok_r(tok_input, tokenizer, &tok_saveptr) != NULL) { + tok_input = NULL; + pData->nsuffix++; + } + + 
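+	/* the pass above only counted the suffixes (strtok_r modified suffixes_str
+	 * in place); below we allocate the offset/length tables and tokenize a
+	 * fresh copy of the argument to record where each suffix starts. */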
if (pData->nsuffix == 0) { + FAIL(LN_INVLDFDESCR); + } + CHKN(pData->suffix_offsets = calloc(pData->nsuffix, sizeof(int))); + CHKN(pData->suffix_lengths = calloc(pData->nsuffix, sizeof(int))); + CHKN(pData->suffixes_str = pcons_arg_copy(args, 1, NULL)); + + tok_input = pData->suffixes_str; + while ((tok = strtok_r(tok_input, tokenizer, &tok_saveptr)) != NULL) { + tok_input = NULL; + pData->suffix_offsets[i] = tok - pData->suffixes_str; + pData->suffix_lengths[i++] = strlen(tok); + } + + CHKN(field_type = pcons_arg(args, 2, NULL)); + CHKN(pData->ctx = generate_context_with_field_as_prefix(ctx, field_type, strlen(field_type))); + + r = 0; +done: + if (r != 0) { + if (name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory suffixed-field name"); + else if (pData == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for parser-data for field: %s", name); + else if (pData->value_field_name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for value-field's name for field: %s", name); + else if (pData->suffix_field_name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for suffix-field's name for field: %s", name); + else if (args == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for argument-parsing for field: %s", name); + else if (escaped_tokenizer == NULL) + ln_dbgprintf(ctx, "suffix token-string missing for field: '%s'", name); + else if (tokenizer == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for unescaping token-string for field: '%s'", + name); + else if (uncopied_suffixes_str == NULL) + ln_dbgprintf(ctx, "suffix-list missing for field: '%s'", name); + else if (suffixes_str == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for suffix-list for field: '%s'", name); + else if (pData->nsuffix == 0) + ln_dbgprintf(ctx, "could't read suffix-value(s) for field: '%s'", name); + else if (pData->suffix_offsets == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for suffix-list element references for field: " + "'%s'", name); + else if (pData->suffix_lengths == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for suffix-list element lengths for field: '%s'", + name); + else if (pData->suffixes_str == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for suffix-list for field: '%s'", name); + else if (field_type == NULL) + ln_dbgprintf(ctx, "field-type declaration missing for field: '%s'", name); + else if (pData->ctx == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for normalizer-context for field: '%s'", name); + suffixed_parser_data_destructor((void**)&pData); + } + free_pcons_args(&args); + if (suffixes_str != NULL) free(suffixes_str); + if (name != NULL) free(name); + return pData; +} + +void* suffixed_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx) { + return _suffixed_parser_data_constructor(node, ctx, node->raw_data, NULL, NULL); +} + +void* named_suffixed_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx) { + int r = LN_BADCONFIG; + pcons_args_t* args = NULL; + char* name = NULL; + const char* value_field_name = NULL; + const char* suffix_field_name = NULL; + const char* remaining_args = NULL; + es_str_t* unnamed_suffix_args = NULL; + struct suffixed_parser_data_s* pData = NULL; + CHKN(name = es_str2cstr(node->name, NULL)); + CHKN(args = pcons_args(node->raw_data, 3)); + CHKN(value_field_name = pcons_arg(args, 0, NULL)); + CHKN(suffix_field_name = pcons_arg(args, 1, NULL)); + CHKN(remaining_args = pcons_arg(args, 2, NULL)); + CHKN(unnamed_suffix_args = es_newStrFromCStr(remaining_args, strlen(remaining_args))); + + CHKN(pData 
= _suffixed_parser_data_constructor(node, ctx, unnamed_suffix_args, value_field_name, + suffix_field_name)); + r = 0; +done: + if (r != 0) { + if (name == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory named_suffixed-field name"); + else if (args == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for argument-parsing for field: %s", name); + else if (value_field_name == NULL) + ln_dbgprintf(ctx, "key-name for value not provided for field: %s", name); + else if (suffix_field_name == NULL) + ln_dbgprintf(ctx, "key-name for suffix not provided for field: %s", name); + else if (unnamed_suffix_args == NULL) + ln_dbgprintf(ctx, "couldn't allocate memory for unnamed-suffix-field args for field: %s", + name); + else if (pData == NULL) + ln_dbgprintf(ctx, "couldn't create parser-data for field: %s", name); + suffixed_parser_data_destructor((void**)&pData); + } + if (unnamed_suffix_args != NULL) free(unnamed_suffix_args); + if (args != NULL) free_pcons_args(&args); + if (name != NULL) free(name); + return pData; +} + + +void suffixed_parser_data_destructor(void** dataPtr) { + if ((*dataPtr) != NULL) { + struct suffixed_parser_data_s *pData = (struct suffixed_parser_data_s*) *dataPtr; + if (pData->suffixes_str != NULL) free(pData->suffixes_str); + if (pData->suffix_offsets != NULL) free(pData->suffix_offsets); + if (pData->suffix_lengths != NULL) free(pData->suffix_lengths); + if (pData->value_field_name != NULL) free(pData->value_field_name); + if (pData->suffix_field_name != NULL) free(pData->suffix_field_name); + if (pData->ctx != NULL) ln_exitCtx(pData->ctx); + free(pData); + *dataPtr = NULL; + } +} + +/** + * Just get everything till the end of string. + */ +PARSER(Rest) + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + + /* silence the warning about unused variable */ + (void)str; + /* success, persist */ + *parsed = strLen - *offs; + r = 0; + return r; +} + +/** + * Parse a possibly quoted string. In this initial implementation, escaping of the quote + * char is not supported. A quoted string is one start starts with a double quote, + * has some text (not containing double quotes) and ends with the first double + * quote character seen. The extracted string does NOT include the quote characters. + * swisskid, 2015-01-21 + */ +PARSER(OpQuotedString) + const char *c; + size_t i; + char *cstr = NULL; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + if(c[i] != '"') { + while(i < strLen && c[i] != ' ') + i++; + + if(i == *offs) + goto done; + + /* success, persist */ + *parsed = i - *offs; + /* create JSON value to save quoted string contents */ + CHKN(cstr = strndup((char*)c + *offs, *parsed)); + } else { + ++i; + + /* search end of string */ + while(i < strLen && c[i] != '"') + i++; + + if(i == strLen || c[i] != '"') + goto done; + /* success, persist */ + *parsed = i + 1 - *offs; /* "eat" terminal double quote */ + /* create JSON value to save quoted string contents */ + CHKN(cstr = strndup((char*)c + *offs + 1, *parsed - 2)); + } + CHKN(*value = json_object_new_string(cstr)); + + r = 0; /* success */ +done: + free(cstr); + return r; +} + +/** + * Parse a quoted string. In this initial implementation, escaping of the quote + * char is not supported. A quoted string is one start starts with a double quote, + * has some text (not containing double quotes) and ends with the first double + * quote character seen. The extracted string does NOT include the quote characters. 
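+ * Example: for the input
+ *   "glance" and more
+ * the resulting value is the six characters glance, while *parsed is 8,
+ * because the surrounding quote characters are consumed as well.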
+ * rgerhards, 2011-01-14 + */ +PARSER(QuotedString) + const char *c; + size_t i; + char *cstr = NULL; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + if(i + 2 > strLen) + goto done; /* needs at least 2 characters */ + + if(c[i] != '"') + goto done; + ++i; + + /* search end of string */ + while(i < strLen && c[i] != '"') + i++; + + if(i == strLen || c[i] != '"') + goto done; + + /* success, persist */ + *parsed = i + 1 - *offs; /* "eat" terminal double quote */ + /* create JSON value to save quoted string contents */ + CHKN(cstr = strndup((char*)c + *offs + 1, *parsed - 2)); + CHKN(*value = json_object_new_string(cstr)); + r = 0; /* success */ +done: + free(cstr); + return r; +} + + +/** + * Parse an ISO date, that is YYYY-MM-DD (exactly this format). + * Note: we do manual loop unrolling -- this is fast AND efficient. + * rgerhards, 2011-01-14 + */ +PARSER(ISODate) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + if(*offs+10 > strLen) + goto done; /* if it is not 10 chars, it can't be an ISO date */ + + /* year */ + if(!isdigit(c[i])) goto done; + if(!isdigit(c[i+1])) goto done; + if(!isdigit(c[i+2])) goto done; + if(!isdigit(c[i+3])) goto done; + if(c[i+4] != '-') goto done; + /* month */ + if(c[i+5] == '0') { + if(c[i+6] < '1' || c[i+6] > '9') goto done; + } else if(c[i+5] == '1') { + if(c[i+6] < '0' || c[i+6] > '2') goto done; + } else { + goto done; + } + if(c[i+7] != '-') goto done; + /* day */ + if(c[i+8] == '0') { + if(c[i+9] < '1' || c[i+9] > '9') goto done; + } else if(c[i+8] == '1' || c[i+8] == '2') { + if(!isdigit(c[i+9])) goto done; + } else if(c[i+8] == '3') { + if(c[i+9] != '0' && c[i+9] != '1') goto done; + } else { + goto done; + } + + /* success, persist */ + *parsed = 10; + + r = 0; /* success */ +done: + return r; +} + +/** + * Parse a Cisco interface spec. Sample for such a spec are: + * outside:192.168.52.102/50349 + * inside:192.168.1.15/56543 (192.168.1.112/54543) + * outside:192.168.1.13/50179 (192.168.1.13/50179)(LOCAL\some.user) + * outside:192.168.1.25/41850(LOCAL\RG-867G8-DEL88D879BBFFC8) + * inside:192.168.1.25/53 (192.168.1.25/53) (some.user) + * 192.168.1.15/0(LOCAL\RG-867G8-DEL88D879BBFFC8) + * From this, we conclude the format is: + * [interface:]ip/port [SP (ip2/port2)] [[SP](username)] + * In order to match, this syntax must start on a non-whitespace char + * other than colon. + */ +PARSER(CiscoInterfaceSpec) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + if(c[i] == ':' || isspace(c[i])) goto done; + + /* first, check if we have an interface. We do this by trying + * to detect if we have an IP. If we have, obviously no interface + * is present. Otherwise, we check if we have a valid interface. 
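+	 * For example, "outside:192.168.52.102/50349" carries the interface name
+	 * "outside", while "192.168.1.15/0(LOCAL\RG-867G8-DEL88D879BBFFC8)" starts
+	 * directly with the IP address and has no interface part.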
+ */ + int bHaveInterface = 0; + size_t idxInterface = 0; + size_t lenInterface = 0; + int bHaveIP = 0; + size_t lenIP; + size_t idxIP = i; + if(ln_parseIPv4(str, strLen, &i, node, &lenIP, NULL) == 0) { + bHaveIP = 1; + i += lenIP - 1; /* position on delimiter */ + } else { + idxInterface = i; + while(i < strLen) { + if(isspace(c[i])) goto done; + if(c[i] == ':') + break; + ++i; + } + lenInterface = i - idxInterface; + bHaveInterface = 1; + } + if(i == strLen) goto done; + ++i; /* skip over colon */ + + /* we now utilize our other parser helpers */ + if(!bHaveIP) { + idxIP = i; + if(ln_parseIPv4(str, strLen, &i, node, &lenIP, NULL) != 0) goto done; + i += lenIP; + } + if(i == strLen || c[i] != '/') goto done; + ++i; /* skip slash */ + const size_t idxPort = i; + size_t lenPort; + if(ln_parseNumber(str, strLen, &i, node, &lenPort, NULL) != 0) goto done; + i += lenPort; + if(i == strLen) goto success; + + /* check if optional second ip/port is present + * We assume we must at least have 5 chars [" (::1)"] + */ + int bHaveIP2 = 0; + size_t idxIP2 = 0, lenIP2 = 0; + size_t idxPort2 = 0, lenPort2 = 0; + if(i+5 < strLen && c[i] == ' ' && c[i+1] == '(') { + size_t iTmp = i+2; /* skip over " (" */ + idxIP2 = iTmp; + if(ln_parseIPv4(str, strLen, &iTmp, node, &lenIP2, NULL) == 0) { + iTmp += lenIP2; + if(i < strLen || c[iTmp] == '/') { + ++iTmp; /* skip slash */ + idxPort2 = iTmp; + if(ln_parseNumber(str, strLen, &iTmp, node, &lenPort2, NULL) == 0) { + iTmp += lenPort2; + if(iTmp < strLen && c[iTmp] == ')') { + i = iTmp + 1; /* match, so use new index */ + bHaveIP2 = 1; + } + } + } + } + } + + /* check if optional username is present + * We assume we must at least have 3 chars ["(n)"] + */ + int bHaveUser = 0; + size_t idxUser = 0; + size_t lenUser = 0; + if( (i+2 < strLen && c[i] == '(' && !isspace(c[i+1]) ) + || (i+3 < strLen && c[i] == ' ' && c[i+1] == '(' && !isspace(c[i+2])) ) { + idxUser = i + ((c[i] == ' ') ? 
2 : 1); /* skip [SP]'(' */ + size_t iTmp = idxUser; + while(iTmp < strLen && !isspace(c[iTmp]) && c[iTmp] != ')') + ++iTmp; /* just scan */ + if(iTmp < strLen && c[iTmp] == ')') { + i = iTmp + 1; /* we have a match, so use new index */ + bHaveUser = 1; + lenUser = iTmp - idxUser; + } + } + + /* all done, save data */ + if(value == NULL) + goto success; + + CHKN(*value = json_object_new_object()); + json_object *json; + if(bHaveInterface) { + CHKN(json = json_object_new_string_len(c+idxInterface, lenInterface)); + json_object_object_add_ex(*value, "interface", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + CHKN(json = json_object_new_string_len(c+idxIP, lenIP)); + json_object_object_add_ex(*value, "ip", json, JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + CHKN(json = json_object_new_string_len(c+idxPort, lenPort)); + json_object_object_add_ex(*value, "port", json, JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + if(bHaveIP2) { + CHKN(json = json_object_new_string_len(c+idxIP2, lenIP2)); + json_object_object_add_ex(*value, "ip2", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + CHKN(json = json_object_new_string_len(c+idxPort2, lenPort2)); + json_object_object_add_ex(*value, "port2", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + if(bHaveUser) { + CHKN(json = json_object_new_string_len(c+idxUser, lenUser)); + json_object_object_add_ex(*value, "user", json, + JSON_C_OBJECT_ADD_KEY_IS_NEW|JSON_C_OBJECT_KEY_IS_CONSTANT); + } + +success: /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ +done: + if(r != 0 && value != NULL && *value != NULL) { + json_object_put(*value); + *value = NULL; /* to be on the save side */ + } + return r; +} + +/** + * Parse a duration. A duration is similar to a timestamp, except that + * it tells about time elapsed. As such, hours can be larger than 23 + * and hours may also be specified by a single digit (this, for example, + * is commonly done in Cisco software). + * Note: we do manual loop unrolling -- this is fast AND efficient. + */ +PARSER(Duration) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + /* hour is a bit tricky */ + if(!isdigit(c[i])) goto done; + ++i; + if(isdigit(c[i])) + ++i; + if(c[i] == ':') + ++i; + else + goto done; + + if(i+5 > strLen) + goto done;/* if it is not 5 chars from here, it can't be us */ + + if(c[i] < '0' || c[i] > '5') goto done; + if(!isdigit(c[i+1])) goto done; + if(c[i+2] != ':') goto done; + if(c[i+3] < '0' || c[i+3] > '5') goto done; + if(!isdigit(c[i+4])) goto done; + + /* success, persist */ + *parsed = (i + 5) - *offs; + + r = 0; /* success */ +done: + return r; +} + +/** + * Parse a timestamp in 24hr format (exactly HH:MM:SS). + * Note: we do manual loop unrolling -- this is fast AND efficient. 
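+ * Illustrative examples: "00:47:09" and "23:01:58" match, while "24:00:00"
+ * does not, because the hour must be within 00..23.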
+ * rgerhards, 2011-01-14 + */ +PARSER(Time24hr) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + if(*offs+8 > strLen) + goto done; /* if it is not 8 chars, it can't be us */ + + /* hour */ + if(c[i] == '0' || c[i] == '1') { + if(!isdigit(c[i+1])) goto done; + } else if(c[i] == '2') { + if(c[i+1] < '0' || c[i+1] > '3') goto done; + } else { + goto done; + } + /* TODO: the code below is a duplicate of 24hr parser - create common function */ + if(c[i+2] != ':') goto done; + if(c[i+3] < '0' || c[i+3] > '5') goto done; + if(!isdigit(c[i+4])) goto done; + if(c[i+5] != ':') goto done; + if(c[i+6] < '0' || c[i+6] > '5') goto done; + if(!isdigit(c[i+7])) goto done; + + /* success, persist */ + *parsed = 8; + + r = 0; /* success */ +done: + return r; +} + +/** + * Parse a timestamp in 12hr format (exactly HH:MM:SS). + * Note: we do manual loop unrolling -- this is fast AND efficient. + * TODO: the code below is a duplicate of 24hr parser - create common function? + * rgerhards, 2011-01-14 + */ +PARSER(Time12hr) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + c = str; + i = *offs; + + if(*offs+8 > strLen) + goto done; /* if it is not 8 chars, it can't be us */ + + /* hour */ + if(c[i] == '0') { + if(!isdigit(c[i+1])) goto done; + } else if(c[i] == '1') { + if(c[i+1] < '0' || c[i+1] > '2') goto done; + } else { + goto done; + } + if(c[i+2] != ':') goto done; + if(c[i+3] < '0' || c[i+3] > '5') goto done; + if(!isdigit(c[i+4])) goto done; + if(c[i+5] != ':') goto done; + if(c[i+6] < '0' || c[i+6] > '5') goto done; + if(!isdigit(c[i+7])) goto done; + + /* success, persist */ + *parsed = 8; + + r = 0; /* success */ +done: + return r; +} + + +/* helper to IPv4 address parser, checks the next set of numbers. + * Syntax 1 to 3 digits, value together not larger than 255. + * @param[in] str parse buffer + * @param[in/out] offs offset into buffer, updated if successful + * @return 0 if OK, 1 otherwise + */ +static int +chkIPv4AddrByte(const char *str, size_t strLen, size_t *offs) +{ + int val = 0; + int r = 1; /* default: done -- simplifies things */ + const char *c; + size_t i = *offs; + + c = str; + if(i == strLen || !isdigit(c[i])) + goto done; + val = c[i++] - '0'; + if(i < strLen && isdigit(c[i])) { + val = val * 10 + c[i++] - '0'; + if(i < strLen && isdigit(c[i])) + val = val * 10 + c[i++] - '0'; + } + if(val > 255) /* cannot be a valid IP address byte! */ + goto done; + + *offs = i; + r = 0; +done: + return r; +} + +/** + * Parser for IPv4 addresses. + */ +PARSER(IPv4) + const char *c; + size_t i; + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + i = *offs; + if(i + 7 > strLen) { + /* IPv4 addr requires at least 7 characters */ + goto done; + } + c = str; + + /* byte 1*/ + if(chkIPv4AddrByte(str, strLen, &i) != 0) goto done; + if(i == strLen || c[i++] != '.') goto done; + /* byte 2*/ + if(chkIPv4AddrByte(str, strLen, &i) != 0) goto done; + if(i == strLen || c[i++] != '.') goto done; + /* byte 3*/ + if(chkIPv4AddrByte(str, strLen, &i) != 0) goto done; + if(i == strLen || c[i++] != '.') goto done; + /* byte 4 - we do NOT need any char behind it! */ + if(chkIPv4AddrByte(str, strLen, &i) != 0) goto done; + + /* if we reach this point, we found a valid IP address */ + *parsed = i - *offs; + + r = 0; /* success */ +done: + return r; +} + + +/* skip past the IPv6 address block, parse pointer is set to + * first char after the block. 
Returns an error if already at end + * of string. + * @param[in] str parse buffer + * @param[in/out] offs offset into buffer, updated if successful + * @return 0 if OK, 1 otherwise + */ +static int +skipIPv6AddrBlock(const char *const __restrict__ str, + const size_t strLen, + size_t *const __restrict__ offs) +{ + int j; + if(*offs == strLen) + return 1; + + for(j = 0 ; j < 4 && *offs+j < strLen && isxdigit(str[*offs+j]) ; ++j) + /*just skip*/ ; + + *offs += j; + return 0; +} + +/** + * Parser for IPv6 addresses. + * Bases on RFC4291 Section 2.2. The address must be followed + * by whitespace or end-of-string, else it is not considered + * a valid address. This prevents false positives. + */ +PARSER(IPv6) + const char *c; + size_t i; + size_t beginBlock; /* last block begin in case we need IPv4 parsing */ + int hasIPv4 = 0; + int nBlocks = 0; /* how many blocks did we already have? */ + int bHad0Abbrev = 0; /* :: already used? */ + + assert(str != NULL); + assert(offs != NULL); + assert(parsed != NULL); + i = *offs; + if(i + 2 > strLen) { + /* IPv6 addr requires at least 2 characters ("::") */ + goto done; + } + c = str; + + /* check that first block is non-empty */ + if(! ( isxdigit(c[i]) || (c[i] == ':' && c[i+1] == ':') ) ) + goto done; + + /* try for all potential blocks plus one more (so we see errors!) */ + for(int j = 0 ; j < 9 ; ++j) { + beginBlock = i; + if(skipIPv6AddrBlock(str, strLen, &i) != 0) goto done; + nBlocks++; + if(i == strLen) goto chk_ok; + if(isspace(c[i])) goto chk_ok; + if(c[i] == '.'){ /* IPv4 processing! */ + hasIPv4 = 1; + break; + } + if(c[i] != ':') goto done; + i++; /* "eat" ':' */ + if(i == strLen) goto chk_ok; + /* check for :: */ + if(bHad0Abbrev) { + if(c[i] == ':') goto done; + } else { + if(c[i] == ':') { + bHad0Abbrev = 1; + ++i; + if(i == strLen) goto chk_ok; + } + } + } + + if(hasIPv4) { + size_t ipv4_parsed; + --nBlocks; + /* prevent pure IPv4 address to be recognized */ + if(beginBlock == *offs) goto done; + i = beginBlock; + if(ln_parseIPv4(str, strLen, &i, node, &ipv4_parsed, NULL) != 0) + goto done; + i += ipv4_parsed; + } + +chk_ok: /* we are finished parsing, check if things are ok */ + if(nBlocks > 8) goto done; + if(bHad0Abbrev && nBlocks >= 8) goto done; + /* now check if trailing block is missing. Note that i is already + * on next character, so we need to go two back. Two are always + * present, else we would not reach this code here. + */ + if(c[i-1] == ':' && c[i-2] != ':') goto done; + + /* if we reach this point, we found a valid IP address */ + *parsed = i - *offs; + + r = 0; /* success */ +done: + return r; +} + +/* check if a char is valid inside a name of the iptables motif. + * We try to keep the set as slim as possible, because the iptables + * parser may otherwise create a very broad match (especially the + * inclusion of simple words like "DF" cause grief here). + * Note: we have taken the permitted set from iptables log samples. + * Report bugs if we missed some additional rules. + */ +static inline int +isValidIPTablesNameChar(const char c) +{ + /* right now, upper case only is valid */ + return ('A' <= c && c <= 'Z') ? 
1 : 0;
+}
+
+/* helper to iptables parser, parses out a single name=value pair
+ */
+static int
+parseIPTablesNameValue(const char *const __restrict__ str,
+	const size_t strLen,
+	size_t *const __restrict__ offs,
+	struct json_object *const __restrict__ valroot)
+{
+	int r = LN_WRONGPARSER;
+	size_t i = *offs;
+	char *name = NULL;
+
+	const size_t iName = i;
+	while(i < strLen && isValidIPTablesNameChar(str[i]))
+		++i;
+	if(i == iName || (i < strLen && str[i] != '=' && str[i] != ' '))
+		goto done; /* no name at all! */
+
+	const ssize_t lenName = i - iName;
+
+	ssize_t iVal = -1;
+	size_t lenVal = 0;
+	if(i < strLen && str[i] != ' ') {
+		/* we have a real value (not just a flag name like "DF") */
+		++i; /* skip '=' */
+		iVal = i;
+		while(i < strLen && !isspace(str[i]))
+			++i;
+		lenVal = i - iVal;
+	}
+
+	/* parsing OK */
+	*offs = i;
+	r = 0;
+
+	if(valroot == NULL)
+		goto done;
+
+	CHKN(name = malloc(lenName+1));
+	memcpy(name, str+iName, lenName);
+	name[lenName] = '\0';
+	json_object *json;
+	if(iVal == -1) {
+		json = NULL;
+	} else {
+		CHKN(json = json_object_new_string_len(str+iVal, lenVal));
+	}
+	json_object_object_add(valroot, name, json);
+done:
+	free(name);
+	return r;
+}
+
+/**
+ * Parser for iptables logs (the structured part).
+ * This parser is named "v2-iptables" because of a traditional
+ * parser named "iptables", which we do not want to replace, at
+ * least right now (we may re-think this before the first release).
+ * For performance reasons, this works in two stages. In the first
+ * stage, we only detect if the motif is correct. The second stage is
+ * only called when we know it is. In it, we go over the message once
+ * again and actually extract the data. This is done because
+ * data extraction is relatively expensive and in most cases we will
+ * have much more frequent mismatches than matches.
+ * Note that this motif must have at least one field, otherwise it
+ * could match things that are not iptables output at all. Further limits
+ * may be imposed in the future as we see additional need.
+ * added 2015-04-30 rgerhards
+ */
+PARSER(v2IPTables)
+	size_t i = *offs;
+	int nfields = 0;
+
+	/* stage one */
+	while(i < strLen) {
+		CHKR(parseIPTablesNameValue(str, strLen, &i, NULL));
+		++nfields;
+		/* exactly one SP is permitted between fields */
+		if(i < strLen && str[i] == ' ')
+			++i;
+	}
+
+	if(nfields < 2) {
+		FAIL(LN_WRONGPARSER);
+	}
+
+	/* success, persist */
+	*parsed = i - *offs;
+	r = 0;
+
+	/* stage two */
+	if(value == NULL)
+		goto done;
+
+	i = *offs;
+	CHKN(*value = json_object_new_object());
+	while(i < strLen) {
+		CHKR(parseIPTablesNameValue(str, strLen, &i, *value));
+		while(i < strLen && isspace(str[i]))
+			++i;
+	}
+
+done:
+	if(r != 0 && value != NULL && *value != NULL) {
+		json_object_put(*value);
+		*value = NULL;
+	}
+	return r;
+}
+
+/**
+ * Parse JSON. This parser tries to find JSON data inside a message.
+ * If it finds valid JSON, it will extract it. Extra data after the
+ * JSON is permitted.
+ * Note: the json-c JSON parser treats whitespace after the actual
+ * json as part of the json. So in essence, any such whitespace is
+ * processed by this parser. We use the same semantics to keep things
+ * neatly in sync. If json-c changes for some reason or we switch to
+ * an alternate json lib, we probably need to be sure to keep that
+ * behaviour, and probably emulate it.
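+ * Illustrative example: given '{"pid": 42} login failed', the parser
+ * extracts the JSON object and leaves the remaining text for the rest
+ * of the rule to match.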
+ * added 2015-04-28 by rgerhards, v1.1.2
+ */
+PARSER(JSON)
+	const size_t i = *offs;
+	struct json_tokener *tokener = NULL;
+
+	if(str[i] != '{' && str[i] != '[') {
+		/* this can't be json, see RFC4627, Sect. 2
+		 * see this bug in json-c:
+		 * https://github.com/json-c/json-c/issues/181
+		 * In any case, it's better to do this quick check,
+		 * even if json-c did not have the bug because this
+		 * check here is much faster than calling the parser.
+		 */
+		goto done;
+	}
+
+	if((tokener = json_tokener_new()) == NULL)
+		goto done;
+
+	struct json_object *const json
+		= json_tokener_parse_ex(tokener, str+i, (int) (strLen - i));
+
+	if(json == NULL)
+		goto done;
+
+	/* success, persist */
+	*parsed = (i + tokener->char_offset) - *offs;
+	r = 0; /* success */
+
+	if(value == NULL) {
+		json_object_put(json);
+	} else {
+		*value = json;
+	}
+
+done:
+	if(tokener != NULL)
+		json_tokener_free(tokener);
+	return r;
+}
+
+
+/* check if a char is valid inside a name of a NameValue list
+ * The set of valid characters may be extended if there is good
+ * need to do so. We have selected the current set carefully, but
+ * may have overlooked some cases.
+ */
+static inline int
+isValidNameChar(const char c)
+{
+	return (isalnum(c)
+		|| c == '.'
+		|| c == '_'
+		|| c == '-'
+		) ? 1 : 0;
+}
+/* helper to NameValue parser, parses out a single name=value pair
+ *
+ * name must be alphanumeric characters, value must be non-whitespace
+ * characters, if quoted then with symmetric quotes. Supported formats
+ * - name=value
+ * - name="value"
+ * - name='value'
+ * Note "name=" is valid and means a field with empty value.
+ * TODO: so far, quote characters are not permitted WITHIN quoted values.
+ */
+static int
+parseNameValue(const char *const __restrict__ str,
+	const size_t strLen,
+	size_t *const __restrict__ offs,
+	struct json_object *const __restrict__ valroot)
+{
+	int r = LN_WRONGPARSER;
+	size_t i = *offs;
+	char *name = NULL;
+
+	const size_t iName = i;
+	while(i < strLen && isValidNameChar(str[i]))
+		++i;
+	if(i == iName || str[i] != '=')
+		goto done; /* no name at all! */
+
+	const size_t lenName = i - iName;
+	++i; /* skip '=' */
+
+	const size_t iVal = i;
+	while(i < strLen && !isspace(str[i]))
+		++i;
+	const size_t lenVal = i - iVal;
+
+	/* parsing OK */
+	*offs = i;
+	r = 0;
+
+	if(valroot == NULL)
+		goto done;
+
+	CHKN(name = malloc(lenName+1));
+	memcpy(name, str+iName, lenName);
+	name[lenName] = '\0';
+	json_object *json;
+	CHKN(json = json_object_new_string_len(str+iVal, lenVal));
+	json_object_object_add(valroot, name, json);
+done:
+	free(name);
+	return r;
+}
+
+/**
+ * Parse CEE syslog.
+ * This essentially is a JSON parser, with additional restrictions:
+ * The message must start with "@cee:" and json must immediately follow (whitespace permitted).
+ * After the JSON, there must be no other non-whitespace characters.
+ * In other words: the message must consist of a single JSON object,
+ * only.
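+ * Illustrative example of a matching message: @cee: {"event": "login"}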
+ * added 2015-04-28 by rgerhards, v1.1.2 + */ +PARSER(CEESyslog) + size_t i = *offs; + struct json_tokener *tokener = NULL; + struct json_object *json = NULL; + + if(strLen < i + 7 || /* "@cee:{}" is minimum text */ + str[i] != '@' || + str[i+1] != 'c' || + str[i+2] != 'e' || + str[i+3] != 'e' || + str[i+4] != ':') + goto done; + + /* skip whitespace */ + for(i += 5 ; i < strLen && isspace(str[i]) ; ++i) + /* just skip */; + + if(i == strLen || str[i] != '{') + goto done; + /* note: we do not permit arrays in CEE mode */ + + if((tokener = json_tokener_new()) == NULL) + goto done; + + json = json_tokener_parse_ex(tokener, str+i, (int) (strLen - i)); + + if(json == NULL) + goto done; + + if(i + tokener->char_offset != strLen) + goto done; + + /* success, persist */ + *parsed = strLen; + r = 0; /* success */ + + if(value != NULL) { + *value = json; + json = NULL; /* do NOT free below! */ + } + +done: + if(tokener != NULL) + json_tokener_free(tokener); + if(json != NULL) + json_object_put(json); + return r; +} + +/** + * Parser for name/value pairs. + * On entry must point to alnum char. All following chars must be + * name/value pairs delimited by whitespace up until the end of string. + * For performance reasons, this works in two stages. In the first + * stage, we only detect if the motif is correct. The second stage is + * only called when we know it is. In it, we go once again over the + * message again and actually extract the data. This is done because + * data extraction is relatively expensive and in most cases we will + * have much more frequent mismatches than matches. + * added 2015-04-25 rgerhards + */ +PARSER(NameValue) + size_t i = *offs; + + /* stage one */ + while(i < strLen) { + CHKR(parseNameValue(str, strLen, &i, NULL)); + while(i < strLen && isspace(str[i])) + ++i; + } + + /* success, persist */ + *parsed = i - *offs; + r = 0; /* success */ + + /* stage two */ + if(value == NULL) + goto done; + + i = *offs; + CHKN(*value = json_object_new_object()); + while(i < strLen) { + CHKR(parseNameValue(str, strLen, &i, *value)); + while(i < strLen && isspace(str[i])) + ++i; + } + + /* TODO: fix mem leak if alloc json fails */ + +done: + return r; +} + +/** + * Parse a MAC layer address. + * The standard (IEEE 802) format for printing MAC-48 addresses in + * human-friendly form is six groups of two hexadecimal digits, + * separated by hyphens (-) or colons (:), in transmission order + * (e.g. 01-23-45-67-89-ab or 01:23:45:67:89:ab ). + * This form is also commonly used for EUI-64. + * from: http://en.wikipedia.org/wiki/MAC_address + * + * This parser must start on a hex digit. 
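+ * Note that the delimiter must be used consistently: a mixed form such as
+ * 01:23-45:67:89:ab is rejected.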
+ * added 2015-05-04 by rgerhards, v1.1.2 + */ +PARSER(MAC48) + size_t i = *offs; + char delim; + + if(strLen < i + 17 || /* this motif has exactly 17 characters */ + !isxdigit(str[i]) || + !isxdigit(str[i+1]) + ) + FAIL(LN_WRONGPARSER); + + if(str[i+2] == ':') + delim = ':'; + else if(str[i+2] == '-') + delim = '-'; + else + FAIL(LN_WRONGPARSER); + + /* first byte ok */ + if(!isxdigit(str[i+3]) || + !isxdigit(str[i+4]) || + str[i+5] != delim || /* 2nd byte ok */ + !isxdigit(str[i+6]) || + !isxdigit(str[i+7]) || + str[i+8] != delim || /* 3rd byte ok */ + !isxdigit(str[i+9]) || + !isxdigit(str[i+10]) || + str[i+11] != delim || /* 4th byte ok */ + !isxdigit(str[i+12]) || + !isxdigit(str[i+13]) || + str[i+14] != delim || /* 5th byte ok */ + !isxdigit(str[i+15]) || + !isxdigit(str[i+16]) /* 6th byte ok */ + ) + FAIL(LN_WRONGPARSER); + + /* success, persist */ + *parsed = 17; + r = 0; /* success */ + + if(value != NULL) { + CHKN(*value = json_object_new_string_len(str+i, 17)); + } + +done: + return r; +} + + +/* This parses the extension value and updates the index + * to point to the end of it. + */ +static int +cefParseExtensionValue(const char *const __restrict__ str, + const size_t strLen, + size_t *__restrict__ iEndVal) +{ + int r = 0; + size_t i = *iEndVal; + size_t iLastWordBegin; + /* first find next unquoted equal sign and record begin of + * last word in front of it - this is the actual end of the + * current name/value pair and the begin of the next one. + */ + int hadSP = 0; + int inEscape = 0; + for(iLastWordBegin = 0 ; i < strLen ; ++i) { + if(inEscape) { + if(str[i] != '=' && + str[i] != '\\' && + str[i] != 'r' && + str[i] != 'n') + FAIL(LN_WRONGPARSER); + inEscape = 0; + } else { + if(str[i] == '=') { + break; + } else if(str[i] == '\\') { + inEscape = 1; + } else if(str[i] == ' ') { + hadSP = 1; + } else { + if(hadSP) { + iLastWordBegin = i; + hadSP = 0; + } + } + } + } + + /* Note: iLastWordBegin can never be at offset zero, because + * the CEF header starts there! + */ + if(i < strLen) { + *iEndVal = (iLastWordBegin == 0) ? i : iLastWordBegin - 1; + } else { + *iEndVal = i; + } +done: + return r; +} + +/* must be positioned on first char of name, returns index + * of end of name. + * Note: ArcSight violates the CEF spec ifself: they generate + * leading underscores in their extension names, which are + * definetly not alphanumeric. We still accept them... + * They also seem to use dots. + */ +static int +cefParseName(const char *const __restrict__ str, + const size_t strLen, + size_t *const __restrict__ i) +{ + int r = 0; + while(*i < strLen && str[*i] != '=') { + if(!(isalnum(str[*i]) || str[*i] == '_' || str[*i] == '.')) + FAIL(LN_WRONGPARSER); + ++(*i); + } +done: + return r; +} + +/* parse CEF extensions. They are basically name=value + * pairs with the ugly exception that values may contain + * spaces but need NOT to be quoted. Thankfully, at least + * names are specified as being alphanumeric without spaces + * in them. So we must add a lookahead parser to check if + * a word is a name (and thus the begin of a new pair) or + * not. This is done by subroutines. 
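+ * Illustrative example: in "act=blocked msg=hello world cnt=3" the value
+ * of "msg" is "hello world", because the lookahead recognizes "cnt" as the
+ * name of the next pair.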
+ */ +static int +cefParseExtensions(const char *const __restrict__ str, + const size_t strLen, + size_t *const __restrict__ offs, + json_object *const __restrict__ jroot) +{ + int r = 0; + size_t i = *offs; + size_t iName, lenName; + size_t iValue, lenValue; + char *name = NULL; + char *value = NULL; + + while(i < strLen) { + while(i < strLen && str[i] == ' ') + ++i; + iName = i; + CHKR(cefParseName(str, strLen, &i)); + if(i+1 >= strLen || str[i] != '=') + FAIL(LN_WRONGPARSER); + lenName = i - iName; + ++i; /* skip '=' */ + + iValue = i; + CHKR(cefParseExtensionValue(str, strLen, &i)); + lenValue = i - iValue; + + ++i; /* skip past value */ + + if(jroot != NULL) { + CHKN(name = malloc(sizeof(char) * (lenName + 1))); + memcpy(name, str+iName, lenName); + name[lenName] = '\0'; + CHKN(value = malloc(sizeof(char) * (lenValue + 1))); + /* copy value but escape it */ + size_t iDst = 0; + for(size_t iSrc = 0 ; iSrc < lenValue ; ++iSrc) { + if(str[iValue+iSrc] == '\\') { + ++iSrc; /* we know the next char must exist! */ + switch(str[iValue+iSrc]) { + case '=': value[iDst] = '='; + break; + case 'n': value[iDst] = '\n'; + break; + case 'r': value[iDst] = '\r'; + break; + case '\\': value[iDst] = '\\'; + break; + default: break; + } + } else { + value[iDst] = str[iValue+iSrc]; + } + ++iDst; + } + value[iDst] = '\0'; + json_object *json; + CHKN(json = json_object_new_string(value)); + json_object_object_add(jroot, name, json); + free(name); name = NULL; + free(value); value = NULL; + } + } + +done: + free(name); + free(value); + return r; +} + +/* gets a CEF header field. Must be positioned on the + * first char after the '|' in front of field. + * Note that '|' may be escaped as "\|", which also means + * we need to supprot "\\" (see CEF spec for details). + * We return the string in *val, if val is non-null. In + * that case we allocate memory that the caller must free. + * This is necessary because there are potentially escape + * sequences inside the string. + */ +static int +cefGetHdrField(const char *const __restrict__ str, + const size_t strLen, + size_t *const __restrict__ offs, + char **val) +{ + int r = 0; + size_t i = *offs; + assert(str[i] != '|'); + while(i < strLen && str[i] != '|') { + if(str[i] == '\\') { + ++i; /* skip esc char */ + if(str[i] != '\\' && str[i] != '|') + FAIL(LN_WRONGPARSER); + } + ++i; /* scan to next delimiter */ + } + + if(str[i] != '|') + FAIL(LN_WRONGPARSER); + + const size_t iBegin = *offs; + /* success, persist */ + *offs = i + 1; + + if(val == NULL) { + r = 0; + goto done; + } + + const size_t len = i - iBegin; + CHKN(*val = malloc(len + 1)); + size_t iDst = 0; + for(size_t iSrc = 0 ; iSrc < len ; ++iSrc) { + if(str[iBegin+iSrc] == '\\') + ++iSrc; /* we already checked above that this is OK! */ + (*val)[iDst++] = str[iBegin+iSrc]; + } + (*val)[iDst] = 0; + r = 0; +done: + return r; +} + +/** + * Parser for ArcSight Common Event Format (CEF) version 0. + * added 2015-05-05 by rgerhards, v1.1.2 + */ +PARSER(CEF) + size_t i = *offs; + char *vendor = NULL; + char *product = NULL; + char *version = NULL; + char *sigID = NULL; + char *name = NULL; + char *severity = NULL; + + /* minumum header: "CEF:0|x|x|x|x|x|x|" --> 17 chars */ + if(strLen < i + 17 || + str[i] != 'C' || + str[i+1] != 'E' || + str[i+2] != 'F' || + str[i+3] != ':' || + str[i+4] != '0' || + str[i+5] != '|' + ) FAIL(LN_WRONGPARSER); + + i += 6; /* position on '|' */ + + CHKR(cefGetHdrField(str, strLen, &i, (value == NULL) ? NULL : &vendor)); + CHKR(cefGetHdrField(str, strLen, &i, (value == NULL) ? 
NULL : &product));
+	CHKR(cefGetHdrField(str, strLen, &i, (value == NULL) ? NULL : &version));
+	CHKR(cefGetHdrField(str, strLen, &i, (value == NULL) ? NULL : &sigID));
+	CHKR(cefGetHdrField(str, strLen, &i, (value == NULL) ? NULL : &name));
+	CHKR(cefGetHdrField(str, strLen, &i, (value == NULL) ? NULL : &severity));
+	++i; /* skip over terminal '|' */
+
+	/* OK, we now know we have a good header. Now, we need
+	 * to process extensions.
+	 * This time, we do NOT pre-process the extensions, but rather
+	 * persist them directly to JSON. This is contrary to other
+	 * parsers, but as the CEF header is pretty unique, this time
+	 * it is extremely unlikely we will get a no-match during
+	 * extension processing. Even if so, nothing bad happens, as
+	 * the extracted data is discarded. But the regular case saves
+	 * us processing time and complexity. The only time when we
+	 * cannot directly process it is when the caller asks us not
+	 * to persist the data. So this must be handled differently.
+	 */
+	size_t iBeginExtensions = i;
+	CHKR(cefParseExtensions(str, strLen, &i, NULL));
+
+	/* success, persist */
+	*parsed = i - *offs;
+	r = 0; /* success */
+
+	if(value != NULL) {
+		CHKN(*value = json_object_new_object());
+		json_object *json;
+		CHKN(json = json_object_new_string(vendor));
+		json_object_object_add(*value, "DeviceVendor", json);
+		CHKN(json = json_object_new_string(product));
+		json_object_object_add(*value, "DeviceProduct", json);
+		CHKN(json = json_object_new_string(version));
+		json_object_object_add(*value, "DeviceVersion", json);
+		CHKN(json = json_object_new_string(sigID));
+		json_object_object_add(*value, "SignatureID", json);
+		CHKN(json = json_object_new_string(name));
+		json_object_object_add(*value, "Name", json);
+		CHKN(json = json_object_new_string(severity));
+		json_object_object_add(*value, "Severity", json);
+
+		json_object *jext;
+		CHKN(jext = json_object_new_object());
+		json_object_object_add(*value, "Extensions", jext);
+
+		i = iBeginExtensions;
+		cefParseExtensions(str, strLen, &i, jext);
+	}
+
+done:
+	if(r != 0 && value != NULL && *value != NULL) {
+		json_object_put(*value);
+		*value = NULL;
+	}
+	free(vendor);
+	free(product);
+	free(version);
+	free(sigID);
+	free(name);
+	free(severity);
+	return r;
+}
+
+/**
+ * Parser for Checkpoint LEA on-disk format.
+ * added 2015-06-18 by rgerhards, v1.1.2
+ */
+PARSER(CheckpointLEA)
+	size_t i = *offs;
+	size_t iName, lenName;
+	size_t iValue, lenValue;
+	int foundFields = 0;
+	char *name = NULL;
+	char *val = NULL;
+
+	while(i < strLen) {
+		while(i < strLen && str[i] == ' ') /* skip leading SP */
+			++i;
+		if(i == strLen) { /* OK if just trailing space */
+			if(foundFields == 0)
+				FAIL(LN_WRONGPARSER);
+			break; /* we are done with the loop, all processed */
+		} else {
+			++foundFields;
+		}
+		iName = i;
+		/* TODO: do a stricter check? ... but we don't have a spec */
+		while(i < strLen && str[i] != ':') {
+			++i;
+		}
+		if(i+1 >= strLen || str[i] != ':')
+			FAIL(LN_WRONGPARSER);
+		lenName = i - iName;
+		++i; /* skip ':' */
+
+		while(i < strLen && str[i] == ' ') /* skip leading SP */
+			++i;
+		iValue = i;
+		while(i < strLen && str[i] != ';') {
+			++i;
+		}
+		if(i+1 > strLen || str[i] != ';')
+			FAIL(LN_WRONGPARSER);
+		lenValue = i - iValue;
+		++i; /* skip ';' */
+
+		if(value != NULL) {
+			CHKN(name = malloc(sizeof(char) * (lenName + 1)));
+			memcpy(name, str+iName, lenName);
+			name[lenName] = '\0';
+			CHKN(val = malloc(sizeof(char) * (lenValue + 1)));
+			memcpy(val, str+iValue, lenValue);
+			val[lenValue] = '\0';
+			if(*value == NULL)
+				CHKN(*value = json_object_new_object());
+			json_object *json;
+			CHKN(json = json_object_new_string(val));
+			json_object_object_add(*value, name, json);
+			free(name); name = NULL;
+			free(val); val = NULL;
+		}
+	}
+
+	/* success, persist */
+	*parsed = i - *offs;
+	r = 0; /* success */
+
+done:
+	free(name);
+	free(val);
+	if(r != 0 && value != NULL && *value != NULL) {
+		json_object_put(*value);
+		*value = NULL;
+	}
+	return r;
+}
diff --git a/src/v1_parser.h b/src/v1_parser.h
new file mode 100644
index 0000000..3d61381
--- /dev/null
+++ b/src/v1_parser.h
@@ -0,0 +1,272 @@
+/*
+ * liblognorm - a fast samples-based log normalization library
+ * Copyright 2010-2015 by Rainer Gerhards and Adiscon GmbH.
+ *
+ * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013
+ *
+ * This file is part of liblognorm.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this library; if not, write to the Free Software
+ * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ *
+ * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution.
+ */
+#ifndef LIBLOGNORM_V1_PARSER_H_INCLUDED
+#define LIBLOGNORM_V1_PARSER_H_INCLUDED
+#include "v1_ptree.h"
+
+
+/**
+ * Parser interface
+ * @param[in] str the to-be-parsed string
+ * @param[in] strLen length of the to-be-parsed string
+ * @param[in] offs an offset into the string
+ * @param[in] node fieldlist with additional data; for simple
+ *            parsers, this sets variable "ed", which just is
+ *            string data.
+ * @param[out] parsed number of bytes consumed by the parser
+ * @param[out] value json object containing the parsed data (can be unused)
+ * @return 0 on success, something else otherwise
+ */
+
+/**
+ * Parser for RFC5424 date.
+ */
+int ln_parseRFC5424Date(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed,
+struct json_object **value);
+
+/**
+ * Parser for RFC3164 date.
+ */
+int ln_parseRFC3164Date(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed,
+struct json_object **value);
+
+/**
+ * Parser for numbers.
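+ * (Illustrative example: a run of decimal digits such as "4711".)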
+ */ +int ln_parseNumber(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for real-number in floating-pt representation + */ +int ln_parseFloat(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for hex numbers. + */ +int ln_parseHexNumber(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + + +/** + * Parser for kernel timestamps. + */ +int ln_parseKernelTimestamp(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for whitespace + */ +int ln_parseWhitespace(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + + +/** + * Parser for Words (SP-terminated strings). + */ +int ln_parseWord(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + + +/** + * Parse everything up to a specific string. + */ +int ln_parseStringTo(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for Alphabetic words (no numbers, punct, ctrl, space). + */ +int ln_parseAlpha(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + + +/** + * Parse everything up to a specific character. + */ +int ln_parseCharTo(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse everything up to a specific character (relaxed constraints, suitable for CSV) + */ +int ln_parseCharSeparated(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + + +/** + * Get everything till the rest of string. + */ +int ln_parseRest(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse an optionally quoted string. + */ +int ln_parseOpQuotedString(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, +size_t *parsed, struct json_object **value); + +/** + * Parse a quoted string. + */ +int ln_parseQuotedString(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse an ISO date. + */ +int ln_parseISODate(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + + +/** + * Parse a timestamp in 12hr format. + */ +int ln_parseTime12hr(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse a timestamp in 24hr format. + */ +int ln_parseTime24hr(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse a duration. + */ +int ln_parseDuration(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for IPv4 addresses. + */ +int ln_parseIPv4(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for IPv6 addresses. 
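+ * (Per the implementation, the address must be followed by whitespace or
+ * end-of-string; e.g. "2001:db8::1" at the end of a message.)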
+ */ +int ln_parseIPv6(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse JSON. + */ +int ln_parseJSON(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse cee syslog. + */ +int ln_parseCEESyslog(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parse iptables log, the new way + */ +int ln_parsev2IPTables(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser Cisco interface specifiers + */ +int ln_parseCiscoInterfaceSpec(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, +size_t *parsed, struct json_object **value); + +/** + * Parser 48 bit MAC layer addresses. + */ +int ln_parseMAC48(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for CEF version 0. + */ +int ln_parseCEF(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for Checkpoint LEA. + */ +int ln_parseCheckpointLEA(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Parser for name/value pairs. + */ +int ln_parseNameValue(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +/** + * Get all tokens separated by tokenizer-string as array. + */ +int ln_parseTokenized(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +void* tokenized_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx); +void tokenized_parser_data_destructor(void** dataPtr); + +#ifdef FEATURE_REGEXP +/** + * Get field matching regex + */ +int ln_parseRegex(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +void* regex_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx); +void regex_parser_data_destructor(void** dataPtr); +#endif + +/** + * Match using the 'current' or 'separate rulebase' all over again from current match position + */ +int ln_parseRecursive(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +void* recursive_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx); +void* descent_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx); +void recursive_parser_data_destructor(void** dataPtr); + +/** + * Get interpreted field + */ +int ln_parseInterpret(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, +size_t *parsed, struct json_object **value); + +void* interpret_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx); +void interpret_parser_data_destructor(void** dataPtr); + +/** + * Parse a suffixed field + */ +int ln_parseSuffixed(const char *str, size_t strlen, size_t *offs, const ln_fieldList_t *node, size_t *parsed, +struct json_object **value); + +void* suffixed_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx); +void* named_suffixed_parser_data_constructor(ln_fieldList_t *node, ln_ctx ctx); +void suffixed_parser_data_destructor(void** dataPtr); + +#endif /* #ifndef LIBLOGNORM_V1_PARSER_H_INCLUDED */ diff --git 
a/src/v1_ptree.c b/src/v1_ptree.c
new file mode 100644
index 0000000..30185c5
--- /dev/null
+++ b/src/v1_ptree.c
@@ -0,0 +1,910 @@
+/**
+ * @file ptree.c
+ * @brief Implementation of the parse tree object.
+ * @class ln_ptree ptree.h
+ *//*
+ * Copyright 2010 by Rainer Gerhards and Adiscon GmbH.
+ *
+ * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013
+ *
+ * This file is part of liblognorm.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this library; if not, write to the Free Software
+ * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ *
+ * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution.
+ */
+#include "config.h"
+#include <stdio.h>
+#include <stdlib.h>
+#include <stdarg.h>
+#include <string.h>
+#include <assert.h>
+#include <ctype.h>
+#include <libestr.h>
+
+#define LOGNORM_V1_SUBSYSTEM /* indicate we are old cruft */
+#include "v1_liblognorm.h"
+#include "annot.h"
+#include "internal.h"
+#include "lognorm.h"
+#include "v1_ptree.h"
+#include "v1_samp.h"
+#include "v1_parser.h"
+
+/**
+ * Get base addr of common prefix. Takes the length of the prefix into
+ * account and selects the right buffer.
+ */
+static inline unsigned char*
+prefixBase(struct ln_ptree *tree)
+{
+	return (tree->lenPrefix <= sizeof(tree->prefix))
+	       ? tree->prefix.data : tree->prefix.ptr;
+}
+
+
+struct ln_ptree*
+ln_newPTree(ln_ctx ctx, struct ln_ptree **parentptr)
+{
+	struct ln_ptree *tree;
+
+	if((tree = calloc(1, sizeof(struct ln_ptree))) == NULL)
+		goto done;
+
+	tree->parentptr = parentptr;
+	tree->ctx = ctx;
+	ctx->nNodes++;
+done:	return tree;
+}
+
+void
+ln_deletePTreeNode(ln_fieldList_t *node)
+{
+	ln_deletePTree(node->subtree);
+	es_deleteStr(node->name);
+	if(node->data != NULL)
+		es_deleteStr(node->data);
+	if(node->raw_data != NULL)
+		es_deleteStr(node->raw_data);
+	if(node->parser_data != NULL && node->parser_data_destructor != NULL)
+		node->parser_data_destructor(&(node->parser_data));
+	free(node);
+}
+
+void
+ln_deletePTree(struct ln_ptree *tree)
+{
+	ln_fieldList_t *node, *nextnode;
+	size_t i;
+
+	if(tree == NULL)
+		goto done;
+
+	if(tree->tags != NULL)
+		json_object_put(tree->tags);
+	for(node = tree->froot; node != NULL; node = nextnode) {
+		nextnode = node->next;
+		ln_deletePTreeNode(node);
+	}
+
+	/* need to free a large prefix buffer? */
+	if(tree->lenPrefix > sizeof(tree->prefix))
+		free(tree->prefix.ptr);
+
+	for(i = 0 ; i < 256 ; ++i)
+		if(tree->subtree[i] != NULL)
+			ln_deletePTree(tree->subtree[i]);
+	free(tree);
+
+done:	return;
+}
+
+
+/**
+ * Set the common prefix inside a node, taking into account the subtle
+ * issues associated with it.
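+ * In particular, prefixes short enough to fit into the fixed-size
+ * prefix.data buffer are stored inline, while longer ones are kept in a
+ * separately malloc()ed buffer (prefix.ptr).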
+ * @return 0 on success, something else otherwise + */ +static int +setPrefix(struct ln_ptree *tree, unsigned char *buf, size_t lenBuf, size_t offs) +{ + int r; +LN_DBGPRINTF(tree->ctx, "setPrefix lenBuf %zu, offs %zu", lenBuf, offs); + tree->lenPrefix = lenBuf - offs; + if(tree->lenPrefix > sizeof(tree->prefix)) { + /* too-large for standard buffer, need to alloc one */ + if((tree->prefix.ptr = malloc(tree->lenPrefix * sizeof(unsigned char))) == NULL) { + r = LN_NOMEM; + goto done; /* fail! */ + } + memcpy(tree->prefix.ptr, buf, tree->lenPrefix); + } else { + /* note: r->lenPrefix may be 0, but that is OK */ + memcpy(tree->prefix.data, buf, tree->lenPrefix); + } + r = 0; + +done: return r; +} + + +/** + * Check if the provided tree is a leaf. This means that it + * does not contain any subtrees. + * @return 1 if it is a leaf, 0 otherwise + */ +static int +isLeaf(struct ln_ptree *tree) +{ + int r = 0; + int i; + + if(tree->froot != NULL) + goto done; + + for(i = 0 ; i < 256 ; ++i) { + if(tree->subtree[i] != NULL) + goto done; + } + r = 1; + +done: return r; +} + + +/** + * Check if the provided tree is a true leaf. This means that it + * does not contain any subtrees of any kind and no prefix, + * and it is not terminal leaf. + * @return 1 if it is a leaf, 0 otherwise + */ +static inline int +isTrueLeaf(struct ln_ptree *tree) +{ + return((tree->lenPrefix == 0) && isLeaf(tree)) && !tree->flags.isTerminal; +} + + +struct ln_ptree * +ln_addPTree(struct ln_ptree *tree, es_str_t *str, size_t offs) +{ + struct ln_ptree *r; + struct ln_ptree **parentptr; /**< pointer in parent that needs to be updated */ + +LN_DBGPRINTF(tree->ctx, "addPTree: offs %zu", offs); + parentptr = &(tree->subtree[es_getBufAddr(str)[offs]]); + /* First check if tree node is totaly empty. If so, we can simply add + * the prefix to this node. This case is important, because it happens + * every time with a new field. + */ + if(isTrueLeaf(tree)) { + if(setPrefix(tree, es_getBufAddr(str), es_strlen(str), offs) != 0) { + r = NULL; + } else { + r = tree; + } + goto done; + } + + if(tree->ctx->debug) { + char *cstr = es_str2cstr(str, NULL); + LN_DBGPRINTF(tree->ctx, "addPTree: add '%s', offs %zu, tree %p", + cstr + offs, offs, tree); + free(cstr); + } + + if((r = ln_newPTree(tree->ctx, parentptr)) == NULL) + goto done; + + if(setPrefix(r, es_getBufAddr(str) + offs + 1, es_strlen(str) - offs - 1, 0) != 0) { + free(r); + r = NULL; + goto done; + } + + *parentptr = r; + +done: return r; +} + + +/** + * Split the provided tree (node) into two at the provided index into its + * common prefix. This function exists to support splitting nodes when + * a mismatch in the common prefix requires that. This function more or less + * keeps the tree as it is, just changes the structure. No new node is added. + * Usually, it is desired to add a new node. This must be made afterwards. + * Note that we need to create a new tree *in front of* the current one, as + * the current one contains field etc. subtree pointers. + * @param[in] tree tree to split + * @param[in] offs offset into common prefix (must be less than prefix length!) 
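+ * @return the newly created node that now precedes the split one, or NULL
+ *         on allocation failure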
+ */ +static struct ln_ptree* +splitTree(struct ln_ptree *tree, unsigned short offs) +{ + unsigned char *c; + struct ln_ptree *r; + unsigned short newlen; + ln_ptree **newparentptr; /**< pointer in parent that needs to be updated */ + + assert(offs < tree->lenPrefix); + if((r = ln_newPTree(tree->ctx, tree->parentptr)) == NULL) + goto done; + + LN_DBGPRINTF(tree->ctx, "splitTree %p at offs %u", tree, offs); + /* note: the overall prefix is reduced by one char, which is now taken + * care of inside the "branch table". + */ + c = prefixBase(tree); +//LN_DBGPRINTF(tree->ctx, "splitTree new bb, *(c+offs): '%s'", c); + if(setPrefix(r, c, offs, 0) != 0) { + ln_deletePTree(r); + r = NULL; + goto done; /* fail! */ + } + +LN_DBGPRINTF(tree->ctx, "splitTree new tree %p lenPrefix=%u, char '%c'", r, r->lenPrefix, r->prefix.data[0]); + /* add the proper branch table entry for the new node. must be done + * here, because the next step will destroy the required index char! + */ + newparentptr = &(r->subtree[c[offs]]); + r->subtree[c[offs]] = tree; + + /* finally fix existing common prefix */ + newlen = tree->lenPrefix - offs - 1; + if(tree->lenPrefix > sizeof(tree->prefix) && (newlen <= sizeof(tree->prefix))) { + /* note: c is a different pointer; the original + * pointer is overwritten by memcpy! */ +LN_DBGPRINTF(tree->ctx, "splitTree new case one bb, offs %u, lenPrefix %u, newlen %u", offs, tree->lenPrefix, newlen); +//LN_DBGPRINTF(tree->ctx, "splitTree new case one bb, *(c+offs): '%s'", c); + memcpy(tree->prefix.data, c+offs+1, newlen); + free(c); + } else { +LN_DBGPRINTF(tree->ctx, "splitTree new case two bb, offs=%u, newlen %u", offs, newlen); + memmove(c, c+offs+1, newlen); + } + tree->lenPrefix = tree->lenPrefix - offs - 1; + + if(tree->parentptr == 0) + tree->ctx->ptree = r; /* root does not have a parent! */ + else + *(tree->parentptr) = r; + tree->parentptr = newparentptr; + +done: return r; +} + + +struct ln_ptree * +ln_buildPTree(struct ln_ptree *tree, es_str_t *str, size_t offs) +{ + struct ln_ptree *r; + unsigned char *c; + unsigned char *cpfix; + size_t i; + unsigned short ipfix; + + assert(tree != NULL); + LN_DBGPRINTF(tree->ctx, "buildPTree: begin at %p, offs %zu", tree, offs); + c = es_getBufAddr(str); + + /* check if the prefix matches and, if not, at what offset it is different */ + ipfix = 0; + cpfix = prefixBase(tree); + for( i = offs + ; (i < es_strlen(str)) && (ipfix < tree->lenPrefix) && (c[i] == cpfix[ipfix]) + ; ++i, ++ipfix) { + ; /*DO NOTHING - just find end of match */ + LN_DBGPRINTF(tree->ctx, "buildPTree: tree %p, i %zu, char '%c'", tree, i, c[i]); + } + + /* if we reach this point, we have processed as much of the common prefix + * as we could. The following code now does the proper actions based on + * the possible cases. + */ + if(i == es_strlen(str)) { + /* all of our input is consumed, no more recursion */ + if(ipfix == tree->lenPrefix) { + LN_DBGPRINTF(tree->ctx, "case 1.1"); + /* exact match, we are done! 
*/ + r = tree; + } else { + LN_DBGPRINTF(tree->ctx, "case 1.2"); + /* we need to split the node at the current position */ + r = splitTree(tree, ipfix); + } + } else if(ipfix < tree->lenPrefix) { + LN_DBGPRINTF(tree->ctx, "case 2, i=%zu, ipfix=%u", i, ipfix); + /* we need to split the node at the current position */ + if((r = splitTree(tree, ipfix)) == NULL) + goto done; /* fail */ +LN_DBGPRINTF(tree->ctx, "pre addPTree: i %zu", i); + if((r = ln_addPTree(r, str, i)) == NULL) + goto done; + //r = ln_buildPTree(r, str, i + 1); + } else { + /* we could consume the current common prefix, but now need + * to traverse the rest of the tree based on the next char. + */ + if(tree->subtree[c[i]] == NULL) { + LN_DBGPRINTF(tree->ctx, "case 3.1"); + /* non-match, need new subtree */ + r = ln_addPTree(tree, str, i); + } else { + LN_DBGPRINTF(tree->ctx, "case 3.2"); + /* match, follow subtree */ + r = ln_buildPTree(tree->subtree[c[i]], str, i + 1); + } + } + +//LN_DBGPRINTF(tree->ctx, "---------------------------------------"); +//ln_displayPTree(tree, 0); +//LN_DBGPRINTF(tree->ctx, "======================================="); +done: return r; +} + + +int +ln_addFDescrToPTree(struct ln_ptree **tree, ln_fieldList_t *node) +{ + int r; + ln_fieldList_t *curr; + + assert(tree != NULL);assert(*tree != NULL); + assert(node != NULL); + + if((node->subtree = ln_newPTree((*tree)->ctx, &node->subtree)) == NULL) { + r = -1; + goto done; + } + LN_DBGPRINTF((*tree)->ctx, "got new subtree %p", node->subtree); + + /* check if we already have this field, if so, merge + * TODO: optimized, check logic + */ + for(curr = (*tree)->froot ; curr != NULL ; curr = curr->next) { + if(!es_strcmp(curr->name, node->name) + && curr->parser == node->parser + && ((curr->raw_data == NULL && node->raw_data == NULL) + || (curr->raw_data != NULL && node->raw_data != NULL + && !es_strcmp(curr->raw_data, node->raw_data)))) { + *tree = curr->subtree; + ln_deletePTreeNode(node); + r = 0; + LN_DBGPRINTF((*tree)->ctx, "merging with tree %p\n", *tree); + goto done; + } + } + + if((*tree)->froot == NULL) { + (*tree)->froot = (*tree)->ftail = node; + } else { + (*tree)->ftail->next = node; + (*tree)->ftail = node; + } + r = 0; + LN_DBGPRINTF((*tree)->ctx, "prev subtree %p", *tree); + *tree = node->subtree; + LN_DBGPRINTF((*tree)->ctx, "new subtree %p", *tree); + +done: return r; +} + + +void +ln_displayPTree(struct ln_ptree *tree, int level) +{ + int i; + int nChildLit; + int nChildField; + es_str_t *str; + char *cstr; + ln_fieldList_t *node; + char indent[2048]; + + if(level > 1023) + level = 1023; + memset(indent, ' ', level * 2); + indent[level * 2] = '\0'; + + nChildField = 0; + for(node = tree->froot ; node != NULL ; node = node->next ) { + ++nChildField; + } + + nChildLit = 0; + for(i = 0 ; i < 256 ; ++i) { + if(tree->subtree[i] != NULL) { + nChildLit++; + } + } + + str = es_newStr(sizeof(tree->prefix)); + es_addBuf(&str, (char*) prefixBase(tree), tree->lenPrefix); + cstr = es_str2cstr(str, NULL); + es_deleteStr(str); + LN_DBGPRINTF(tree->ctx, "%ssubtree%s %p (prefix: '%s', children: %d literals, %d fields) [visited %u " + "backtracked %u terminated %u]", + indent, tree->flags.isTerminal ? 
" TERM" : "", tree, cstr, nChildLit, nChildField, + tree->stats.visited, tree->stats.backtracked, tree->stats.terminated); + free(cstr); + /* display char subtrees */ + for(i = 0 ; i < 256 ; ++i) { + if(tree->subtree[i] != NULL) { + LN_DBGPRINTF(tree->ctx, "%schar %2.2x(%c):", indent, i, i); + ln_displayPTree(tree->subtree[i], level + 1); + } + } + + /* display field subtrees */ + for(node = tree->froot ; node != NULL ; node = node->next ) { + cstr = es_str2cstr(node->name, NULL); + LN_DBGPRINTF(tree->ctx, "%sfield %s:", indent, cstr); + free(cstr); + ln_displayPTree(node->subtree, level + 1); + } +} + + +/* the following is a quick hack, which should be moved to the + * string class. + */ +static inline void dotAddPtr(es_str_t **str, void *p) +{ + char buf[64]; + int i; + i = snprintf(buf, sizeof(buf), "%p", p); + es_addBuf(str, buf, i); +} +/** + * recursive handler for DOT graph generator. + */ +static void +ln_genDotPTreeGraphRec(struct ln_ptree *tree, es_str_t **str) +{ + int i; + ln_fieldList_t *node; + + + dotAddPtr(str, tree); + es_addBufConstcstr(str, " [label=\""); + if(tree->lenPrefix > 0) { + es_addChar(str, '\''); + es_addBuf(str, (char*) prefixBase(tree), tree->lenPrefix); + es_addChar(str, '\''); + } + es_addBufConstcstr(str, "\""); + if(isLeaf(tree)) { + es_addBufConstcstr(str, " style=\"bold\""); + } + es_addBufConstcstr(str, "]\n"); + + /* display char subtrees */ + for(i = 0 ; i < 256 ; ++i) { + if(tree->subtree[i] != NULL) { + dotAddPtr(str, tree); + es_addBufConstcstr(str, " -> "); + dotAddPtr(str, tree->subtree[i]); + es_addBufConstcstr(str, " [label=\""); + es_addChar(str, (char) i); + es_addBufConstcstr(str, "\"]\n"); + ln_genDotPTreeGraphRec(tree->subtree[i], str); + } + } + + /* display field subtrees */ + for(node = tree->froot ; node != NULL ; node = node->next ) { + dotAddPtr(str, tree); + es_addBufConstcstr(str, " -> "); + dotAddPtr(str, node->subtree); + es_addBufConstcstr(str, " [label=\""); + es_addStr(str, node->name); + es_addBufConstcstr(str, "\" style=\"dotted\"]\n"); + ln_genDotPTreeGraphRec(node->subtree, str); + } +} + + +void +ln_genDotPTreeGraph(struct ln_ptree *tree, es_str_t **str) +{ + es_addBufConstcstr(str, "digraph ptree {\n"); + ln_genDotPTreeGraphRec(tree, str); + es_addBufConstcstr(str, "}\n"); +} + + +/** + * add unparsed string to event. + */ +static int +addUnparsedField(const char *str, size_t strLen, int offs, struct json_object *json) +{ + int r = 1; + struct json_object *value; + char *s = NULL; + CHKN(s = strndup(str, strLen)); + value = json_object_new_string(s); + if (value == NULL) { + goto done; + } + json_object_object_add(json, ORIGINAL_MSG_KEY, value); + + value = json_object_new_string(s + offs); + if (value == NULL) { + goto done; + } + json_object_object_add(json, UNPARSED_DATA_KEY, value); + + r = 0; +done: + free(s); + return r; +} + + +/** + * Special parser for iptables-like name/value pairs. + * The pull multiple fields. Note that once this parser has been selected, + * it is very unlikely to be left, as it is *very* generic. This parser is + * required because practice shows that already-structured data like iptables + * can otherwise not be processed by liblognorm in a meaningful way. 
+ * + * @param[in] tree current tree to process + * @param[in] str string to be matched against (the to-be-normalized data) + * @param[in] strLen length of str + * @param[in/out] offs start position in input data, on exit first unparsed position + * @param[in/out] event handle to event that is being created during normalization + * + * @return 0 if parser was successfully, something else on error + */ +static int +ln_iptablesParser(struct ln_ptree *tree, const char *str, size_t strLen, size_t *offs, + struct json_object *json) +{ + int r; + size_t o = *offs; + es_str_t *fname; + es_str_t *fval; + const char *pstr; + const char *end; + struct json_object *value; + +LN_DBGPRINTF(tree->ctx, "%zu enter iptables parser, len %zu", *offs, strLen); + if(o == strLen) { + r = -1; /* can not be, we have no n/v pairs! */ + goto done; + } + + end = str + strLen; + pstr = str + o; + while(pstr < end) { + while(pstr < end && isspace(*pstr)) + ++pstr; + CHKN(fname = es_newStr(16)); + while(pstr < end && !isspace(*pstr) && *pstr != '=') { + es_addChar(&fname, *pstr); + ++pstr; + } + if(pstr < end && *pstr == '=') { + CHKN(fval = es_newStr(16)); + ++pstr; + /* error on space */ + while(pstr < end && !isspace(*pstr)) { + es_addChar(&fval, *pstr); + ++pstr; + } + } else { + CHKN(fval = es_newStrFromCStr("[*PRESENT*]", + sizeof("[*PRESENT*]")-1)); + } + char *cn, *cv; + CHKN(cn = ln_es_str2cstr(&fname)); + CHKN(cv = ln_es_str2cstr(&fval)); + if (tree->ctx->debug) { + LN_DBGPRINTF(tree->ctx, "iptables parser extracts %s=%s", cn, cv); + } + CHKN(value = json_object_new_string(cv)); + json_object_object_add(json, cn, value); + es_deleteStr(fval); + es_deleteStr(fname); + } + + r = 0; + *offs = strLen; + +done: + LN_DBGPRINTF(tree->ctx, "%zu iptables parser returns %d", *offs, r); + return r; +} + + +/** + * Recursive step of the normalizer. It walks the parse tree and calls itself + * recursively when this is appropriate. It also implements backtracking in + * those (hopefully rare) cases where it is required. + * + * @param[in] tree current tree to process + * @param[in] string string to be matched against (the to-be-normalized data) + * @param[in] strLen length of the to-be-matched string + * @param[in] offs start position in input data + * @param[in/out] json ... that is being created during normalization + * @param[out] endNode if a match was found, this is the matching node (undefined otherwise) + * + * @return number of characters left unparsed by following the subtree, negative if + * the to-be-parsed message is shorter than the rule sample by this number of + * characters. 
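+ * A return value of 0 combined with a terminal end node means the message
+ * was matched completely.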
+ */ +static int +ln_v1_normalizeRec(struct ln_ptree *tree, const char *str, size_t strLen, size_t offs, struct json_object *json, + struct ln_ptree **endNode) +{ + int r; + int localR; + size_t i; + int left; + ln_fieldList_t *node; + ln_fieldList_t *restMotifNode = NULL; + char *cstr; + const char *c; + unsigned char *cpfix; + unsigned ipfix; + size_t parsed; + char *namestr; + struct json_object *value; + + ++tree->stats.visited; + if(offs >= strLen) { + *endNode = tree; + r = -tree->lenPrefix; + goto done; + } + +LN_DBGPRINTF(tree->ctx, "%zu: enter parser, tree %p", offs, tree); + c = str; + cpfix = prefixBase(tree); + node = tree->froot; + r = strLen - offs; + /* first we need to check if the common prefix matches (and consume input data while we do) */ + ipfix = 0; + while(offs < strLen && ipfix < tree->lenPrefix) { + LN_DBGPRINTF(tree->ctx, "%zu: prefix compare '%c', '%c'", offs, c[offs], cpfix[ipfix]); + if(c[offs] != cpfix[ipfix]) { + r -= ipfix; + goto done; + } + ++offs, ++ipfix; + } + + if(ipfix != tree->lenPrefix) { + /* incomplete prefix match --> to-be-normalized string too short */ + r = ipfix - tree->lenPrefix; + goto done; + } + + r -= ipfix; + LN_DBGPRINTF(tree->ctx, "%zu: prefix compare succeeded, still valid", offs); + + /* now try the parsers */ + while(node != NULL) { + if(tree->ctx->debug) { + cstr = es_str2cstr(node->name, NULL); + LN_DBGPRINTF(tree->ctx, "%zu:trying parser for field '%s': %p", + offs, cstr, node->parser); + free(cstr); + } + i = offs; + if(node->isIPTables) { + localR = ln_iptablesParser(tree, str, strLen, &i, json); + LN_DBGPRINTF(tree->ctx, "%zu iptables parser return, i=%zu", + offs, i); + if(localR == 0) { + /* potential hit, need to verify */ + LN_DBGPRINTF(tree->ctx, "potential hit, trying subtree"); + left = ln_v1_normalizeRec(node->subtree, str, strLen, i, json, endNode); + if(left == 0 && (*endNode)->flags.isTerminal) { + LN_DBGPRINTF(tree->ctx, "%zu: parser matches at %zu", offs, i); + r = 0; + goto done; + } + LN_DBGPRINTF(tree->ctx, "%zu nonmatch, backtracking required, left=%d", + offs, left); + ++tree->stats.backtracked; + if(left < r) + r = left; + } + } else if(node->parser == ln_parseRest) { + /* This is a quick and dirty adjustment to handle "rest" more intelligently. + * It's just a tactical fix: in the longer term, we'll handle the whole + * situation differently. However, it makes sense to fix this now, as this + * solves some real-world problems immediately. 
-- rgerhards, 2015-04-15 + */ + restMotifNode = node; + } else { + value = NULL; + localR = node->parser(str, strLen, &i, node, &parsed, &value); + LN_DBGPRINTF(tree->ctx, "parser returns %d, parsed %zu", localR, parsed); + if(localR == 0) { + /* potential hit, need to verify */ + LN_DBGPRINTF(tree->ctx, "%zu: potential hit, trying subtree %p", offs, node->subtree); + left = ln_v1_normalizeRec(node->subtree, str, strLen, i + parsed, json, endNode); + LN_DBGPRINTF(tree->ctx, "%zu: subtree returns %d", offs, r); + if(left == 0 && (*endNode)->flags.isTerminal) { + LN_DBGPRINTF(tree->ctx, "%zu: parser matches at %zu", offs, i); + if(es_strbufcmp(node->name, (unsigned char*)"-", 1)) { + /* Store the value here; create json if not already created */ + if (value == NULL) { + CHKN(cstr = strndup(str + i, parsed)); + value = json_object_new_string(cstr); + free(cstr); + } + if (value == NULL) { + LN_DBGPRINTF(tree->ctx, "unable to create json"); + goto done; + } + namestr = ln_es_str2cstr(&node->name); + json_object_object_add(json, namestr, value); + } else { + if (value != NULL) { + /* Free the unneeded value */ + json_object_put(value); + } + } + r = 0; + goto done; + } + LN_DBGPRINTF(tree->ctx, "%zu nonmatch, backtracking required, left=%d", + offs, left); + if (value != NULL) { + /* Free the value if it was created */ + json_object_put(value); + } + if(left > 0 && left < r) + r = left; + LN_DBGPRINTF(tree->ctx, "%zu nonmatch, backtracking required, left=%d, r now %d", + offs, left, r); + ++tree->stats.backtracked; + } + } + node = node->next; + } + + if(offs == strLen) { + *endNode = tree; + r = 0; + goto done; + } + +if(offs < strLen) { +unsigned char cc = str[offs]; +LN_DBGPRINTF(tree->ctx, "%zu no field, trying subtree char '%c': %p", offs, cc, tree->subtree[cc]); +} else { +LN_DBGPRINTF(tree->ctx, "%zu no field, offset already beyond end", offs); +} + /* now let's see if we have a literal */ + if(tree->subtree[(unsigned char)str[offs]] != NULL) { + left = ln_v1_normalizeRec(tree->subtree[(unsigned char)str[offs]], + str, strLen, offs + 1, json, endNode); +LN_DBGPRINTF(tree->ctx, "%zu got left %d, r %d", offs, left, r); + if(left < r) + r = left; +LN_DBGPRINTF(tree->ctx, "%zu got return %d", offs, r); + } + + if(r == 0 && (*endNode)->flags.isTerminal) + goto done; + + /* and finally give "rest" a try if it was present. Note that we MUST do this after + * literal evaluation, otherwise "rest" can never be overriden by other rules. 
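+	 * For illustration (hedged example rulebase lines, not taken from any
+	 * shipped rulebase):
+	 *     rule=:error %code:number%
+	 *     rule=:%msg:rest%
+	 * For the message "error 42" the first, more specific rule must win;
+	 * the catch-all "rest" motif is applied only when no other subtree
+	 * reaches a terminal node.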
+ */ + if(restMotifNode != NULL) { + LN_DBGPRINTF(tree->ctx, "rule has rest motif, forcing match via it"); + value = NULL; + restMotifNode->parser(str, strLen, &i, restMotifNode, &parsed, &value); +# ifndef NDEBUG + left = /* we only need this for the assert below */ +# endif + ln_v1_normalizeRec(restMotifNode->subtree, str, strLen, i + parsed, json, endNode); + assert(left == 0); /* with rest, we have this invariant */ + assert((*endNode)->flags.isTerminal); /* this one also */ + LN_DBGPRINTF(tree->ctx, "%zu: parser matches at %zu", offs, i); + if(es_strbufcmp(restMotifNode->name, (unsigned char*)"-", 1)) { + /* Store the value here; create json if not already created */ + if (value == NULL) { + CHKN(cstr = strndup(str + i, parsed)); + value = json_object_new_string(cstr); + free(cstr); + } + if (value == NULL) { + LN_DBGPRINTF(tree->ctx, "unable to create json"); + goto done; + } + namestr = ln_es_str2cstr(&restMotifNode->name); + json_object_object_add(json, namestr, value); + } else { + if (value != NULL) { + /* Free the unneeded value */ + json_object_put(value); + } + } + r = 0; + goto done; + } +done: + LN_DBGPRINTF(tree->ctx, "%zu returns %d", offs, r); + if(r == 0 && *endNode == tree) + ++tree->stats.terminated; + return r; +} + + +int +ln_v1_normalize(ln_ctx ctx, const char *str, size_t strLen, struct json_object **json_p) +{ + int r; + int left; + struct ln_ptree *endNode = NULL; + + if(*json_p == NULL) { + CHKN(*json_p = json_object_new_object()); + } + + left = ln_v1_normalizeRec(ctx->ptree, str, strLen, 0, *json_p, &endNode); + + if(ctx->debug) { + if(left == 0) { + LN_DBGPRINTF(ctx, "final result for normalizer: left %d, endNode %p, " + "isTerminal %d, tagbucket %p", + left, endNode, endNode->flags.isTerminal, endNode->tags); + } else { + LN_DBGPRINTF(ctx, "final result for normalizer: left %d, endNode %p", + left, endNode); + } + } + if(left != 0 || !endNode->flags.isTerminal) { + /* we could not successfully parse, some unparsed items left */ + if(left < 0) { + addUnparsedField(str, strLen, strLen, *json_p); + } else { + addUnparsedField(str, strLen, strLen - left, *json_p); + } + } else { + /* success, finalize event */ + if(endNode->tags != NULL) { + /* add tags to an event */ + json_object_get(endNode->tags); + json_object_object_add(*json_p, "event.tags", endNode->tags); + CHKR(ln_annotate(ctx, *json_p, endNode->tags)); + } + } + + r = 0; + +done: return r; +} + + +/** + * Gather and output pdag statistics for the full pdag (ctx) + * including all disconnected components (type defs). + * + * Data is sent to given file ptr. + */ +void +ln_fullPTreeStats(ln_ctx ctx, FILE __attribute__((unused)) *const fp, +const int __attribute__((unused)) extendedStats) +{ + ln_displayPTree(ctx->ptree, 0); +} diff --git a/src/v1_ptree.h b/src/v1_ptree.h new file mode 100644 index 0000000..94ac747 --- /dev/null +++ b/src/v1_ptree.h @@ -0,0 +1,206 @@ +/** + * @file ptree.h + * @brief The parse tree object. + * @class ln_ptree ptree.h + *//* + * Copyright 2013 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is meant to be included by applications using liblognorm. + * For lognorm library files themselves, include "lognorm.h". + * + * This file is part of liblognorm. 
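+ *
+ * As a quick orientation, a typical v1-era caller uses the public liblognorm
+ * API roughly as follows (a minimal, hedged sketch; error handling omitted):
+ *
+ *     ln_ctx ctx = ln_initCtx();
+ *     ln_loadSamples(ctx, "messages.rulebase");
+ *     struct json_object *json = NULL;
+ *     ln_normalize(ctx, msg, strlen(msg), &json);
+ *     ... use the resulting JSON event, then json_object_put(json) ...
+ *     ln_exitCtx(ctx);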
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this library; if not, write to the Free Software
+ * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ *
+ * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution.
+ */
+#ifndef LIBLOGNORM_PTREE_H_INCLUDED
+#define LIBLOGNORM_PTREE_H_INCLUDED
+#include 
+#include 
+
+#define ORIGINAL_MSG_KEY "originalmsg"
+#define UNPARSED_DATA_KEY "unparsed-data"
+
+typedef struct ln_ptree ln_ptree; /**< the parse tree object */
+typedef struct ln_fieldList_s ln_fieldList_t;
+
+/**
+ * List of supported fields inside parse tree.
+ * This list holds all fields and their description. While normalizing,
+ * fields are tried in the order of this list. So the enqueue order
+ * dictates precedence during parsing.
+ *
+ * The value list is a singly-linked list. In a later stage, we should
+ * optimize it so that frequently used fields are moved "up" towards
+ * the root of the list. In any case, we do NOT expect this list to
+ * be long, as the parser should already have gotten quite specific when
+ * we hit a field.
+ */
+struct ln_fieldList_s {
+	es_str_t *name;		/**< field name */
+	es_str_t *data;		/**< extra data to be passed to parser */
+	es_str_t *raw_data;	/**< extra untouched (unescaping is not done) data available to be used by parser */
+	void *parser_data;	/** opaque data that the field-parser understands */
+	void (*parser_data_destructor)(void **); /** destroy opaque data that field-parser understands */
+	int (*parser)(const char*, size_t, size_t*, const ln_fieldList_t *,
+		      size_t*, struct json_object **); /**< parser to use */
+	ln_ptree *subtree;	/**< subtree to follow if parser succeeded */
+	ln_fieldList_t *next;	/**< list housekeeping, next node (or NULL) */
+	unsigned char isIPTables; /**< special parser: iptables! */
+};
+
+
+/* parse tree object
+ */
+struct ln_ptree {
+	ln_ctx		ctx;	/**< our context */
+	ln_ptree	**parentptr; /**< pointer to *us* *inside* the parent
+				  BUT this is NOT a pointer to the parent! */
+	ln_fieldList_t	*froot; /**< root of field list */
+	ln_fieldList_t	*ftail; /**< tail of field list */
+	struct {
+		unsigned isTerminal:1;	/**< is this node a terminal sequence? */
+	} flags;
+	struct json_object *tags;	/* tags to assign to events of this type */
+	/* the representation below requires a lot of memory but is
+	 * very fast. As an alternative approach, we can use a hash table
+	 * where we ignore control characters. That should work quite well.
+	 * But we do not do this in the initial step.
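+	 * (Rough arithmetic for scale: the 256-entry subtree array alone is
+	 * 256 * sizeof(ln_ptree*), i.e. about 2 KiB per node on a 64-bit
+	 * platform, before counting the prefix and the field list.)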
+ */ + ln_ptree *subtree[256]; + unsigned short lenPrefix; /**< length of common prefix, 0->none */ + union { + unsigned char *ptr; /**< use if data element is too large */ + unsigned char data[16]; /**< fast lookup for small string */ + } prefix; /**< a common prefix string for all of this node */ + struct { + unsigned visited; + unsigned backtracked; /**< incremented when backtracking was initiated */ + unsigned terminated; + } stats; /**< usage statistics */ +}; + + +/* Methods */ + +/** + * Allocates and initializes a new parse tree node. + * @memberof ln_ptree + * + * @param[in] ctx current library context. This MUST match the + * context of the parent. + * @param[in] parent pointer to the new node inside the parent + * + * @return pointer to new node or NULL on error + */ +struct ln_ptree* ln_newPTree(ln_ctx ctx, struct ln_ptree** parent); + + +/** + * Free a parse tree and destruct all members. + * @memberof ln_ptree + * + * @param[in] tree pointer to ptree to free + */ +void ln_deletePTree(struct ln_ptree *tree); + +/** + * Free a parse tree node and destruct all members. + * @memberof ln_ptree + * + * @param[in] node pointer to free + */ +void ln_deletePTreeNode(ln_fieldList_t *node); + +/** + * Add a field description to the a tree. + * The field description will be added as last field. Fields are + * parsed in the order they have been added, so be sure to care + * about the order if that matters. + * @memberof ln_ptree + * + * @param[in] tree pointer to ptree to modify + * @param[in] fielddescr a fully populated (and initialized) + * field description node + * @returns 0 on success, something else otherwise + */ +int ln_addFDescrToPTree(struct ln_ptree **tree, ln_fieldList_t *node); + + +/** + * Add a literal to a ptree. + * Creates new tree nodes as necessary. + * @memberof ln_ptree + * + * @param[in] tree root of tree where to add + * @param[in] str literal (string) to add + * @param[in] offs offset of where in literal adding should start + * + * @return NULL on error, otherwise pointer to deepest tree added + */ +struct ln_ptree* +ln_addPTree(struct ln_ptree *tree, es_str_t *str, size_t offs); + + +/** + * Display the content of a ptree (debug function). + * This is a debug aid that spits out a textual representation + * of the provided ptree via multiple calls of the debug callback. + * + * @param tree ptree to display + * @param level recursion level, must be set to 0 on initial call + */ +void ln_displayPTree(struct ln_ptree *tree, int level); + + +/** + * Generate a DOT graph. + * Well, actually it does not generate the graph itself, but a + * control file that is suitable for the GNU DOT tool. Such a file + * can be very useful to understand complex sample databases + * (not to mention that it is probably fun for those creating + * samples). + * The dot commands are appended to the provided string. + * + * @param[in] tree ptree to display + * @param[out] str string which receives the DOT commands. + */ +void ln_genDotPTreeGraph(struct ln_ptree *tree, es_str_t **str); + + +/** + * Build a ptree based on the provided string, but only if necessary. + * The passed-in tree is searched and traversed for str. If a node exactly + * matching str is found, that node is returned. If no exact match is found, + * a new node is added. Existing nodes may be split, if a so-far common + * prefix needs to be split in order to add the new node. 
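+ * For illustration (hedged): if a node already carries the common prefix
+ * "foobar" and the string "foobaz" is added, the node is split at the shared
+ * prefix "fooba", with the diverging "r" and "z" branches continuing in
+ * separate child nodes.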
+ * + * @param[in] tree root of the current tree + * @param[in] str string to be added + * @param[in] offs offset into str where match needs to start + * (this is required for recursive calls to handle + * common prefixes) + * @return NULL on error, otherwise the ptree leaf that + * corresponds to the parameters passed. + */ +struct ln_ptree * ln_buildPTree(struct ln_ptree *tree, es_str_t *str, size_t offs); + +/* internal helper for displaying stats */ +void ln_fullPTreeStats(ln_ctx ctx, FILE *const fp, const int extendedStats); + +#endif /* #ifndef LOGNORM_PTREE_H_INCLUDED */ diff --git a/src/v1_samp.c b/src/v1_samp.c new file mode 100644 index 0000000..e91059b --- /dev/null +++ b/src/v1_samp.c @@ -0,0 +1,844 @@ +/* samp.c -- code for ln_v1_samp objects. + * + * Copyright 2010-2015 by Rainer Gerhards and Adiscon GmbH. + * + * Modified by Pavel Levshin (pavel@levshin.spb.ru) in 2013 + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#include "config.h" +#include +#include +#include +#include +#include +#include + +#define LOGNORM_V1_SUBSYSTEM /* indicate we are old cruft */ +#include "v1_liblognorm.h" +#include "internal.h" +#include "lognorm.h" +#include "samp.h" +#include "v1_ptree.h" +#include "v1_samp.h" +#include "v1_parser.h" + + +/** + * Construct a sample object. + */ +struct ln_v1_samp* +ln_v1_sampCreate(ln_ctx __attribute__((unused)) ctx) +{ + struct ln_v1_samp* samp; + + if((samp = calloc(1, sizeof(struct ln_v1_samp))) == NULL) + goto done; + + /* place specific init code here (none at this time) */ + +done: return samp; +} + +void +ln_v1_sampFree(ln_ctx __attribute__((unused)) ctx, struct ln_v1_samp *samp) +{ + free(samp); +} + + +/** + * Extract a field description from a sample. + * The field description is added to the tail of the current + * subtree's field list. The parse buffer must be position on the + * leading '%' that starts a field definition. It is a program error + * if this condition is not met. + * + * Note that we break up the object model and access ptree members + * directly. Let's consider us a friend of ptree. This is necessary + * to optimize the structure for a high-speed parsing process. + * + * @param[in] str a temporary work string. 
This is passed in to save the + * creation overhead + * @returns 0 on success, something else otherwise + */ +static int +addFieldDescr(ln_ctx ctx, struct ln_ptree **subtree, es_str_t *rule, + es_size_t *bufOffs, es_str_t **str) +{ + int r; + ln_fieldList_t *node = ln_v1_parseFieldDescr(ctx, rule, bufOffs, str, &r); + assert(subtree != NULL); + + if (node != NULL) CHKR(ln_addFDescrToPTree(subtree, node)); +done: + return r; +} + +ln_fieldList_t* +ln_v1_parseFieldDescr(ln_ctx ctx, es_str_t *rule, es_size_t *bufOffs, es_str_t **str, int* ret) +{ + int r = 0; + ln_fieldList_t *node; + es_size_t i = *bufOffs; + char *cstr; /* for debug mode strings */ + unsigned char *buf; + es_size_t lenBuf; + void* (*constructor_fn)(ln_fieldList_t *, ln_ctx) = NULL; + + buf = es_getBufAddr(rule); + lenBuf = es_strlen(rule); + assert(buf[i] == '%'); + ++i; /* "eat" ':' */ + CHKN(node = calloc(1, sizeof(ln_fieldList_t))); + node->subtree = NULL; + node->next = NULL; + node->data = NULL; + node->raw_data = NULL; + node->parser_data = NULL; + node->parser_data_destructor = NULL; + CHKN(node->name = es_newStr(16)); + + /* skip leading whitespace in field name */ + while(i < lenBuf && isspace(buf[i])) + ++i; + while(i < lenBuf && buf[i] != ':') { + CHKR(es_addChar(&node->name, buf[i++])); + } + + if(es_strlen(node->name) == 0) { + FAIL(LN_INVLDFDESCR); + } + + if(ctx->debug) { + cstr = es_str2cstr(node->name, NULL); + ln_dbgprintf(ctx, "parsed field: '%s'", cstr); + free(cstr); + } + + if(buf[i] != ':') { + /* may be valid later if we have a loaded CEE dictionary + * and the name is present inside it. + */ + FAIL(LN_INVLDFDESCR); + } + ++i; /* skip ':' */ + + /* parse and process type (trailing whitespace must be trimmed) */ + es_emptyStr(*str); + size_t j = i; + /* scan for terminator */ + while(j < lenBuf && buf[j] != ':' && buf[j] != '%') + ++j; + /* now trim trailing space backwards */ + size_t next = j; + --j; + while(j >= i && isspace(buf[j])) + --j; + /* now copy */ + while(i <= j) { + CHKR(es_addChar(str, buf[i++])); + } + /* finally move i to consumed position */ + i = next; + + if(i == lenBuf) { + FAIL(LN_INVLDFDESCR); + } + + node->isIPTables = 0; /* first assume no special parser is used */ + if(!es_strconstcmp(*str, "date-rfc3164")) { + node->parser = ln_parseRFC3164Date; + } else if(!es_strconstcmp(*str, "date-rfc5424")) { + node->parser = ln_parseRFC5424Date; + } else if(!es_strconstcmp(*str, "number")) { + node->parser = ln_parseNumber; + } else if(!es_strconstcmp(*str, "float")) { + node->parser = ln_parseFloat; + } else if(!es_strconstcmp(*str, "hexnumber")) { + node->parser = ln_parseHexNumber; + } else if(!es_strconstcmp(*str, "kernel-timestamp")) { + node->parser = ln_parseKernelTimestamp; + } else if(!es_strconstcmp(*str, "whitespace")) { + node->parser = ln_parseWhitespace; + } else if(!es_strconstcmp(*str, "ipv4")) { + node->parser = ln_parseIPv4; + } else if(!es_strconstcmp(*str, "ipv6")) { + node->parser = ln_parseIPv6; + } else if(!es_strconstcmp(*str, "word")) { + node->parser = ln_parseWord; + } else if(!es_strconstcmp(*str, "alpha")) { + node->parser = ln_parseAlpha; + } else if(!es_strconstcmp(*str, "rest")) { + node->parser = ln_parseRest; + } else if(!es_strconstcmp(*str, "op-quoted-string")) { + node->parser = ln_parseOpQuotedString; + } else if(!es_strconstcmp(*str, "quoted-string")) { + node->parser = ln_parseQuotedString; + } else if(!es_strconstcmp(*str, "date-iso")) { + node->parser = ln_parseISODate; + } else if(!es_strconstcmp(*str, "time-24hr")) { + node->parser = 
ln_parseTime24hr; + } else if(!es_strconstcmp(*str, "time-12hr")) { + node->parser = ln_parseTime12hr; + } else if(!es_strconstcmp(*str, "duration")) { + node->parser = ln_parseDuration; + } else if(!es_strconstcmp(*str, "cisco-interface-spec")) { + node->parser = ln_parseCiscoInterfaceSpec; + } else if(!es_strconstcmp(*str, "json")) { + node->parser = ln_parseJSON; + } else if(!es_strconstcmp(*str, "cee-syslog")) { + node->parser = ln_parseCEESyslog; + } else if(!es_strconstcmp(*str, "mac48")) { + node->parser = ln_parseMAC48; + } else if(!es_strconstcmp(*str, "name-value-list")) { + node->parser = ln_parseNameValue; + } else if(!es_strconstcmp(*str, "cef")) { + node->parser = ln_parseCEF; + } else if(!es_strconstcmp(*str, "checkpoint-lea")) { + node->parser = ln_parseCheckpointLEA; + } else if(!es_strconstcmp(*str, "v2-iptables")) { + node->parser = ln_parsev2IPTables; + } else if(!es_strconstcmp(*str, "iptables")) { + node->parser = NULL; + node->isIPTables = 1; + } else if(!es_strconstcmp(*str, "string-to")) { + /* TODO: check extra data!!!! (very important) */ + node->parser = ln_parseStringTo; + } else if(!es_strconstcmp(*str, "char-to")) { + /* TODO: check extra data!!!! (very important) */ + node->parser = ln_parseCharTo; + } else if(!es_strconstcmp(*str, "char-sep")) { + /* TODO: check extra data!!!! (very important) */ + node->parser = ln_parseCharSeparated; + } else if(!es_strconstcmp(*str, "tokenized")) { + node->parser = ln_parseTokenized; + constructor_fn = tokenized_parser_data_constructor; + node->parser_data_destructor = tokenized_parser_data_destructor; + } +#ifdef FEATURE_REGEXP + else if(!es_strconstcmp(*str, "regex")) { + node->parser = ln_parseRegex; + constructor_fn = regex_parser_data_constructor; + node->parser_data_destructor = regex_parser_data_destructor; + } +#endif + else if (!es_strconstcmp(*str, "recursive")) { + node->parser = ln_parseRecursive; + constructor_fn = recursive_parser_data_constructor; + node->parser_data_destructor = recursive_parser_data_destructor; + } else if (!es_strconstcmp(*str, "descent")) { + node->parser = ln_parseRecursive; + constructor_fn = descent_parser_data_constructor; + node->parser_data_destructor = recursive_parser_data_destructor; + } else if (!es_strconstcmp(*str, "interpret")) { + node->parser = ln_parseInterpret; + constructor_fn = interpret_parser_data_constructor; + node->parser_data_destructor = interpret_parser_data_destructor; + } else if (!es_strconstcmp(*str, "suffixed")) { + node->parser = ln_parseSuffixed; + constructor_fn = suffixed_parser_data_constructor; + node->parser_data_destructor = suffixed_parser_data_destructor; + } else if (!es_strconstcmp(*str, "named_suffixed")) { + node->parser = ln_parseSuffixed; + constructor_fn = named_suffixed_parser_data_constructor; + node->parser_data_destructor = suffixed_parser_data_destructor; + } else { + cstr = es_str2cstr(*str, NULL); + ln_errprintf(ctx, 0, "invalid field type '%s'", cstr); + free(cstr); + FAIL(LN_INVLDFDESCR); + } + + if(buf[i] == '%') { + i++; + } else { + /* parse extra data */ + CHKN(node->data = es_newStr(8)); + i++; + while(i < lenBuf) { + if(buf[i] == '%') { + ++i; + break; /* end of field */ + } + CHKR(es_addChar(&node->data, buf[i++])); + } + node->raw_data = es_strdup(node->data); + es_unescapeStr(node->data); + if(ctx->debug) { + cstr = es_str2cstr(node->data, NULL); + ln_dbgprintf(ctx, "parsed extra data: '%s'", cstr); + free(cstr); + } + } + + if (constructor_fn) node->parser_data = constructor_fn(node, ctx); + + + *bufOffs = i; +done: + 
if (r != 0) { + if (node->name != NULL) es_deleteStr(node->name); + free(node); + node = NULL; + } + *ret = r; + return node; +} + +/** + * Parse a Literal string out of the template and add it to the tree. + * @param[in] ctx the context + * @param[in/out] subtree on entry, current subtree, on exist newest + * deepest subtree + * @param[in] rule string with current rule + * @param[in/out] bufOffs parse pointer, up to which offset is parsed + * (is updated so that it points to first char after consumed + * string on exit). + * @param[out] str literal extracted (is empty, when no litral could be found) + * @return 0 on success, something else otherwise + */ +static int +parseLiteral(ln_ctx ctx, struct ln_ptree **subtree, es_str_t *rule, + es_size_t *bufOffs, es_str_t **str) +{ + int r = 0; + es_size_t i = *bufOffs; + unsigned char *buf; + es_size_t lenBuf; + + es_emptyStr(*str); + buf = es_getBufAddr(rule); + lenBuf = es_strlen(rule); + /* extract maximum length literal */ + while(i < lenBuf) { + if(buf[i] == '%') { + if(i+1 < lenBuf && buf[i+1] != '%') { + break; /* field start is end of literal */ + } + if (++i == lenBuf) break; + } + CHKR(es_addChar(str, buf[i])); + ++i; + } + + es_unescapeStr(*str); + if(ctx->debug) { + char *cstr = es_str2cstr(*str, NULL); + ln_dbgprintf(ctx, "parsed literal: '%s'", cstr); + free(cstr); + } + + *subtree = ln_buildPTree(*subtree, *str, 0); + *bufOffs = i; + r = 0; + +done: return r; +} + + +/* Implementation note: + * We read in the sample, and split it into chunks of literal text and + * fields. Each literal text is added as whole to the tree, as is each + * field individually. To do so, we keep track of our current subtree + * root, which changes whenever a new part of the tree is build. It is + * set to the then-lowest part of the tree, where the next step sample + * data is to be added. + * + * This function processes the whole string or returns an error. + * + * format: literal1%field:type:extra-data%literal2 + * + * @returns the new subtree root (or NULL in case of error) + */ +static int +addSampToTree(ln_ctx ctx, es_str_t *rule, struct json_object *tagBucket) +{ + int r = -1; + struct ln_ptree* subtree; + es_str_t *str = NULL; + es_size_t i; + + subtree = ctx->ptree; + CHKN(str = es_newStr(256)); + i = 0; + while(i < es_strlen(rule)) { + ln_dbgprintf(ctx, "addSampToTree %d of %d", i, es_strlen(rule)); + CHKR(parseLiteral(ctx, &subtree, rule, &i, &str)); + /* After the literal there can be field only*/ + if (i < es_strlen(rule)) { + CHKR(addFieldDescr(ctx, &subtree, rule, &i, &str)); + if (i == es_strlen(rule)) { + /* finish the tree with empty literal to avoid false merging*/ + CHKR(parseLiteral(ctx, &subtree, rule, &i, &str)); + } + } + } + + ln_dbgprintf(ctx, "end addSampToTree %d of %d", i, es_strlen(rule)); + /* we are at the end of rule processing, so this node is a terminal */ + subtree->flags.isTerminal = 1; + subtree->tags = tagBucket; + +done: + if(str != NULL) + es_deleteStr(str); + return r; +} + + + +/** + * get the initial word of a rule line that tells us the type of the + * line. 
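+ * The recognized line types in this (v1) format are, for example, "rule=",
+ * "prefix=", "extendprefix=" and "annotate=", as dispatched in
+ * ln_v1_processSamp() below.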
+ * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[out] offs offset after "=" + * @param[out] str string with "linetype-word" (newly created) + * @returns 0 on success, something else otherwise + */ +static int +getLineType(const char *buf, es_size_t lenBuf, es_size_t *offs, es_str_t **str) +{ + int r = -1; + es_size_t i; + + *str = es_newStr(16); + for(i = 0 ; i < lenBuf && buf[i] != '=' ; ++i) { + CHKR(es_addChar(str, buf[i])); + } + + if(i < lenBuf) + ++i; /* skip over '=' */ + *offs = i; + +done: return r; +} + + +/** + * Get a new common prefix from the config file. That is actually everything from + * the current offset to the end of line. + * + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset after "=" + * @param[in/out] str string to store common offset. If NULL, it is created, + * otherwise it is emptied. + * @returns 0 on success, something else otherwise + */ +static int +getPrefix(const char *buf, es_size_t lenBuf, es_size_t offs, es_str_t **str) +{ + int r; + + if(*str == NULL) { + CHKN(*str = es_newStr(lenBuf - offs)); + } else { + es_emptyStr(*str); + } + + r = es_addBuf(str, (char*)buf + offs, lenBuf - offs); +done: return r; +} + +/** + * Extend the common prefix. This means that the line is concatenated + * to the prefix. This is useful if the same rulebase is to be used with + * different prefixes (well, not strictly necessary, but probably useful). + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset to-be-added text starts + * @returns 0 on success, something else otherwise + */ +static int +extendPrefix(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t offs) +{ + return es_addBuf(&ctx->rulePrefix, (char*)buf+offs, lenBuf - offs); +} + + +/** + * Add a tag to the tag bucket. Helper to processTags. 
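+ * For illustration (hedged): in a rule line such as
+ *     rule=cisco,fw:%msg:rest%
+ * the leading "cisco,fw" part is split at the commas and each name is added
+ * to the tag bucket by this helper.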
+ * @param[in] ctx current context + * @param[in] tagname string with tag name + * @param[out] tagBucket tagbucket to which new tags shall be added + * the tagbucket is created if it is NULL + * @returns 0 on success, something else otherwise + */ +static int +addTagStrToBucket(ln_ctx ctx, es_str_t *tagname, struct json_object **tagBucket) +{ + int r = -1; + char *cstr; + struct json_object *tag; + + if(*tagBucket == NULL) { + CHKN(*tagBucket = json_object_new_array()); + } + cstr = es_str2cstr(tagname, NULL); + ln_dbgprintf(ctx, "tag found: '%s'", cstr); + CHKN(tag = json_object_new_string(cstr)); + json_object_array_add(*tagBucket, tag); + free(cstr); + r = 0; + +done: return r; +} + + +/** + * Extract the tags and create a tag bucket out of them + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in,out] poffs offset where tags start, on exit and success + * offset after tag part (excluding ':') + * @param[out] tagBucket tagbucket to which new tags shall be added + * the tagbucket is created if it is NULL + * @returns 0 on success, something else otherwise + */ +static int +processTags(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t *poffs, struct json_object **tagBucket) +{ + int r = -1; + es_str_t *str = NULL; + es_size_t i; + + assert(poffs != NULL); + i = *poffs; + while(i < lenBuf && buf[i] != ':') { + if(buf[i] == ',') { + /* end of this tag */ + CHKR(addTagStrToBucket(ctx, str, tagBucket)); + es_deleteStr(str); + str = NULL; + } else { + if(str == NULL) { + CHKN(str = es_newStr(32)); + } + CHKR(es_addChar(&str, buf[i])); + } + ++i; + } + + if(buf[i] != ':') + goto done; + ++i; /* skip ':' */ + + if(str != NULL) { + CHKR(addTagStrToBucket(ctx, str, tagBucket)); + es_deleteStr(str); + } + + *poffs = i; + r = 0; + +done: return r; +} + + +/** + * Process a new rule and add it to tree. + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset where rule starts + * @returns 0 on success, something else otherwise + */ +static int +processRule(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t offs) +{ + int r = -1; + es_str_t *str; + struct json_object *tagBucket = NULL; + + ln_dbgprintf(ctx, "sample line to add: '%s'\n", buf+offs); + CHKR(processTags(ctx, buf, lenBuf, &offs, &tagBucket)); + + if(offs == lenBuf) { + ln_dbgprintf(ctx, "error, actual message sample part is missing"); + // TODO: provide some error indicator to app? We definitely must do (a callback?) + goto done; + } + if(ctx->rulePrefix == NULL) { + CHKN(str = es_newStr(lenBuf)); + } else { + CHKN(str = es_strdup(ctx->rulePrefix)); + } + CHKR(es_addBuf(&str, (char*)buf + offs, lenBuf - offs)); + addSampToTree(ctx, str, tagBucket); + es_deleteStr(str); + r = 0; +done: return r; +} + + +/** + * Obtain a field name from a rule base line. 
+ * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in/out] offs on entry: offset where tag starts, + * on exit: updated offset AFTER TAG and (':') + * @param [out] strTag obtained tag, if successful + * @returns 0 on success, something else otherwise + */ +static int +getFieldName(ln_ctx __attribute__((unused)) ctx, const char *buf, es_size_t lenBuf, es_size_t *offs, +es_str_t **strTag) +{ + int r = -1; + es_size_t i; + + i = *offs; + while(i < lenBuf && + (isalnum(buf[i]) || buf[i] == '_' || buf[i] == '.')) { + if(*strTag == NULL) { + CHKN(*strTag = es_newStr(32)); + } + CHKR(es_addChar(strTag, buf[i])); + ++i; + } + *offs = i; + r = 0; +done: return r; +} + + +/** + * Skip over whitespace. + * Skips any whitespace present at the offset. + * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in/out] offs on entry: offset first unprocessed position + */ +static void +skipWhitespace(ln_ctx __attribute__((unused)) ctx, const char *buf, es_size_t lenBuf, es_size_t *offs) +{ + while(*offs < lenBuf && isspace(buf[*offs])) { + (*offs)++; + } +} + + +/** + * Obtain an annotation (field) operation. + * This usually is a plus or minus sign followed by a field name + * followed (if plus) by an equal sign and the field value. On entry, + * offs must be positioned on the first unprocessed field (after ':' for + * the initial field!). Extra whitespace is detected and, if present, + * skipped. The obtained operation is added to the annotation set provided. + * Note that extracted string objects are passed to the annotation; thus it + * is vital NOT to free them (most importantly, this is *not* a memory leak). + * + * @param[in] ctx current context + * @param[in] annot active annotation set to which the operation is to be added + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in/out] offs on entry: offset where tag starts, + * on exit: updated offset AFTER TAG and (':') + * @param [out] strTag obtained tag, if successful + * @returns 0 on success, something else otherwise + */ +static int +getAnnotationOp(ln_ctx ctx, ln_annot *annot, const char *buf, es_size_t lenBuf, es_size_t *offs) +{ + int r = -1; + es_size_t i; + es_str_t *fieldName = NULL; + es_str_t *fieldVal = NULL; + ln_annot_opcode opc; + + i = *offs; + skipWhitespace(ctx, buf, lenBuf, &i); + if(i == lenBuf) { + r = 0; + goto done; /* nothing left to process (no error!) */ + } + + if(buf[i] == '+') { + opc = ln_annot_ADD; + } else if(buf[i] == '-') { + ln_dbgprintf(ctx, "annotate op '-' not yet implemented - failing"); + goto fail; + } else { + ln_dbgprintf(ctx, "invalid annotate opcode '%c' - failing" , buf[i]); + goto fail; + } + i++; + + if(i == lenBuf) goto fail; /* nothing left to process */ + + CHKR(getFieldName(ctx, buf, lenBuf, &i, &fieldName)); + if(i == lenBuf) goto fail; /* nothing left to process */ + if(buf[i] != '=') goto fail; /* format error */ + i++; + + skipWhitespace(ctx, buf, lenBuf, &i); + if(buf[i] != '"') goto fail; /* format error */ + ++i; + + while(i < lenBuf && buf[i] != '"') { + if(fieldVal == NULL) { + CHKN(fieldVal = es_newStr(32)); + } + CHKR(es_addChar(&fieldVal, buf[i])); + ++i; + } + *offs = (i == lenBuf) ? i : i+1; + CHKR(ln_addAnnotOp(annot, opc, fieldName, fieldVal)); + r = 0; +done: return r; +fail: return -1; +} + + +/** + * Process a new annotation and add it to the annotation set. 
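+ * For illustration (hedged example following the syntax parsed below):
+ *     annotate=cisco:+class="firewall"
+ * attaches the operation +class="firewall" to the annotation set of the tag
+ * "cisco"; the annotation is later applied to events carrying that tag.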
+ * + * @param[in] ctx current context + * @param[in] buf line buffer + * @param[in] len length of buffer + * @param[in] offs offset where annotation starts + * @returns 0 on success, something else otherwise + */ +static int +processAnnotate(ln_ctx ctx, const char *buf, es_size_t lenBuf, es_size_t offs) +{ + int r; + es_str_t *tag = NULL; + ln_annot *annot; + + ln_dbgprintf(ctx, "sample annotation to add: '%s'", buf+offs); + CHKR(getFieldName(ctx, buf, lenBuf, &offs, &tag)); + skipWhitespace(ctx, buf, lenBuf, &offs); + if(buf[offs] != ':' || tag == NULL) { + ln_dbgprintf(ctx, "invalid tag field in annotation, line is '%s'", buf); + r=-1; + goto done; + } + ++offs; + + /* we got an annotation! */ + CHKN(annot = ln_newAnnot(tag)); + + while(offs < lenBuf) { + CHKR(getAnnotationOp(ctx, annot, buf, lenBuf, &offs)); + } + + r = ln_addAnnotToSet(ctx->pas, annot); + +done: return r; +} + +struct ln_v1_samp * +ln_v1_processSamp(ln_ctx ctx, const char *buf, es_size_t lenBuf) +{ + struct ln_v1_samp *samp = NULL; + es_str_t *typeStr = NULL; + es_size_t offs; + + if(getLineType(buf, lenBuf, &offs, &typeStr) != 0) + goto done; + + if(!es_strconstcmp(typeStr, "prefix")) { + if(getPrefix(buf, lenBuf, offs, &ctx->rulePrefix) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "extendprefix")) { + if(extendPrefix(ctx, buf, lenBuf, offs) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "rule")) { + if(processRule(ctx, buf, lenBuf, offs) != 0) goto done; + } else if(!es_strconstcmp(typeStr, "annotate")) { + if(processAnnotate(ctx, buf, lenBuf, offs) != 0) goto done; + } else { + /* TODO error reporting */ + char *str; + str = es_str2cstr(typeStr, NULL); + ln_dbgprintf(ctx, "invalid record type detected: '%s'", str); + free(str); + goto done; + } + +done: + if(typeStr != NULL) + es_deleteStr(typeStr); + + return samp; +} + + +struct ln_v1_samp * +ln_v1_sampRead(ln_ctx ctx, FILE *const __restrict__ repo, int *const __restrict__ isEof) +{ + struct ln_v1_samp *samp = NULL; + char buf[10*1024]; /**< max size of rule - TODO: make configurable */ + + size_t i = 0; + int inParser = 0; + int done = 0; + while(!done) { + int c = fgetc(repo); + if(c == EOF) { + *isEof = 1; + if(i == 0) + goto done; + else + done = 1; /* last line missing LF, still process it! */ + } else if(c == '\n') { + ++ctx->conf_ln_nbr; + if(inParser) { + if(ln_sampChkRunawayRule(ctx, repo, NULL)) { + /* ignore previous rule */ + inParser = 0; + i = 0; + } + } + if(!inParser && i != 0) + done = 1; + } else if(c == '#' && i == 0) { + ln_sampSkipCommentLine(ctx, repo, NULL); + i = 0; /* back to beginning */ + } else { + if(c == '%') + inParser = (inParser) ? 0 : 1; + buf[i++] = c; + if(i >= sizeof(buf)) { + ln_errprintf(ctx, 0, "line is too long"); + goto done; + } + } + } + buf[i] = '\0'; + + ln_dbgprintf(ctx, "read rulebase line[~%d]: '%s'", ctx->conf_ln_nbr, buf); + ln_v1_processSamp(ctx, buf, i); + +ln_dbgprintf(ctx, "---------------------------------------"); +ln_displayPTree(ctx->ptree, 0); +ln_dbgprintf(ctx, "======================================="); +done: + return samp; +} diff --git a/src/v1_samp.h b/src/v1_samp.h new file mode 100644 index 0000000..92b3fad --- /dev/null +++ b/src/v1_samp.h @@ -0,0 +1,103 @@ +/** + * @file samples.h + * @brief Object to process log samples. + * @author Rainer Gerhards + * + * This object handles log samples, and in actual log sample files. + * It co-operates with the ptree object to build the actual parser tree. 
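+ *
+ * A minimal (hedged, illustrative) rulebase in the v1 format handled here
+ * might look like:
+ *     prefix=%date:date-rfc3164% %host:word%
+ *     rule=sshd:Accepted password for %user:word% from %ip:ipv4%
+ * i.e. an optional common prefix plus rule lines of literal text interleaved
+ * with %name:type[:extra-data]% field descriptors.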
+ *//* + * + * liblognorm - a fast samples-based log normalization library + * Copyright 2010 by Rainer Gerhards and Adiscon GmbH. + * + * This file is part of liblognorm. + * + * This library is free software; you can redistribute it and/or + * modify it under the terms of the GNU Lesser General Public + * License as published by the Free Software Foundation; either + * version 2.1 of the License, or (at your option) any later version. + * + * This library is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + * Lesser General Public License for more details. + * + * You should have received a copy of the GNU Lesser General Public + * License along with this library; if not, write to the Free Software + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA + * + * A copy of the LGPL v2.1 can be found in the file "COPYING" in this distribution. + */ +#ifndef LIBLOGNORM_V1_SAMPLES_H_INCLUDED +#define LIBLOGNORM_V1_SAMPLES_H_INCLUDED +#include /* we need es_size_t */ +#include + + +/** + * A single log sample. + */ +struct ln_v1_samp { + es_str_t *msg; +}; + +/** + * Reads a sample stored in buffer buf and creates a new ln_v1_samp object + * out of it. + * + * @note + * It is the caller's responsibility to delete the newly + * created ln_v1_samp object if it is no longer needed. + * + * @param[ctx] ctx current library context + * @param[buf] cstr buffer containing the string contents of the sample + * @param[lenBuf] length of the sample contained within buf + * @return Newly create object or NULL if an error occured. + */ +struct ln_v1_samp * +ln_v1_processSamp(ln_ctx ctx, const char *buf, es_size_t lenBuf); + + +/** + * Read a sample from repository (sequentially). + * + * Reads a sample starting with the current file position and + * creates a new ln_v1_samp object out of it. + * + * @note + * It is the caller's responsibility to delete the newly + * created ln_v1_samp object if it is no longer needed. + * + * @param[in] ctx current library context + * @param[in] repo repository descriptor + * @param[out] isEof must be set to 0 on entry and is switched to 1 if EOF occured. + * @return Newly create object or NULL if an error or EOF occured. + */ +struct ln_v1_samp * +ln_v1_sampRead(ln_ctx ctx, FILE *repo, int *isEof); + + +/** + * Free ln_v1_samp object. + */ +void +ln_v1_sampFree(ln_ctx ctx, struct ln_v1_samp *samp); + + +/** + * Parse a given sample + * + * @param[in] ctx current library context + * @param[in] rule string (with prefix and suffix '%' markers) + * @param[in] offset in rule-string to start at (it should be pointed to + * starting character: '%') + * @param[in] temp string buffer(working space), + * externalized for efficiency reasons + * @param[out] return code (0 means success) + * @return newly created node, which can be added to sample tree. + */ +ln_fieldList_t* +ln_v1_parseFieldDescr(ln_ctx ctx, es_str_t *rule, es_size_t *bufOffs, + es_str_t **str, int* ret); + +#endif /* #ifndef LIBLOGNORM_V1_SAMPLES_H_INCLUDED */ diff --git a/test-driver b/test-driver new file mode 100755 index 0000000..8e575b0 --- /dev/null +++ b/test-driver @@ -0,0 +1,148 @@ +#! /bin/sh +# test-driver - basic testsuite driver script. + +scriptversion=2013-07-13.22; # UTC + +# Copyright (C) 2011-2014 Free Software Foundation, Inc. 
+# +# This program is free software; you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation; either version 2, or (at your option) +# any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +# As a special exception to the GNU General Public License, if you +# distribute this file as part of a program that contains a +# configuration script generated by Autoconf, you may include it under +# the same distribution terms that you use for the rest of that program. + +# This file is maintained in Automake, please report +# bugs to or send patches to +# . + +# Make unconditional expansion of undefined variables an error. This +# helps a lot in preventing typo-related bugs. +set -u + +usage_error () +{ + echo "$0: $*" >&2 + print_usage >&2 + exit 2 +} + +print_usage () +{ + cat <$log_file 2>&1 +estatus=$? + +if test $enable_hard_errors = no && test $estatus -eq 99; then + tweaked_estatus=1 +else + tweaked_estatus=$estatus +fi + +case $tweaked_estatus:$expect_failure in + 0:yes) col=$red res=XPASS recheck=yes gcopy=yes;; + 0:*) col=$grn res=PASS recheck=no gcopy=no;; + 77:*) col=$blu res=SKIP recheck=no gcopy=yes;; + 99:*) col=$mgn res=ERROR recheck=yes gcopy=yes;; + *:yes) col=$lgn res=XFAIL recheck=no gcopy=yes;; + *:*) col=$red res=FAIL recheck=yes gcopy=yes;; +esac + +# Report the test outcome and exit status in the logs, so that one can +# know whether the test passed or failed simply by looking at the '.log' +# file, without the need of also peaking into the corresponding '.trs' +# file (automake bug#11814). +echo "$res $test_name (exit status: $estatus)" >>$log_file + +# Report outcome to console. +echo "${col}${res}${std}: $test_name" + +# Register the test result, and other relevant metadata. +echo ":test-result: $res" > $trs_file +echo ":global-test-result: $res" >> $trs_file +echo ":recheck: $recheck" >> $trs_file +echo ":copy-in-global-log: $gcopy" >> $trs_file + +# Local Variables: +# mode: shell-script +# sh-indentation: 2 +# eval: (add-hook 'write-file-hooks 'time-stamp) +# time-stamp-start: "scriptversion=" +# time-stamp-format: "%:y-%02m-%02d.%02H" +# time-stamp-time-zone: "UTC" +# time-stamp-end: "; # UTC" +# End: diff --git a/tests/Makefile.am b/tests/Makefile.am new file mode 100644 index 0000000..0967f3b --- /dev/null +++ b/tests/Makefile.am @@ -0,0 +1,164 @@ +check_PROGRAMS = json_eq +# re-enable if we really need the c program check check_PROGRAMS = json_eq user_test +json_eq_self_sources = json_eq.c +json_eq_SOURCES = $(json_eq_self_sources) +json_eq_CPPFLAGS = $(JSON_C_CFLAGS) $(WARN_CFLAGS) -I$(top_srcdir)/src +json_eq_LDADD = $(JSON_C_LIBS) -lm -lestr +json_eq_LDFLAGS = -no-install + +#user_test_SOURCES = user_test.c +#user_test_CPPFLAGS = $(LIBLOGNORM_CFLAGS) $(JSON_C_CFLAGS) $(LIBESTR_CFLAGS) +#user_test_LDADD = $(JSON_C_LIBS) $(LIBLOGNORM_LIBS) $(LIBESTR_LIBS) ../compat/compat.la +#user_test_LDFLAGS = -no-install + +# The following tests are for the new pdag-based engine (v2+). +# +# There are some notes due: +# +# removed field_float_with_invalid_ruledef.sh because test is not valid. 
+# more info: https://github.com/rsyslog/liblognorm/issues/98 +# note that probably the other currently disable *_invalid_*.sh +# tests are also affected. +# +# there seems to be a problem with some format in cisco-interface-spec +# Probably this was just not seen in v1, because of some impreciseness +# in the ptree normalizer. Pushing equivalent v2 test back until v2 +# implementation is further developed. +TESTS_SHELLSCRIPTS = \ + usrdef_simple.sh \ + usrdef_two.sh \ + usrdef_twotypes.sh \ + usrdef_actual1.sh \ + usrdef_ipaddr.sh \ + usrdef_ipaddr_dotdot.sh \ + usrdef_ipaddr_dotdot2.sh \ + usrdef_ipaddr_dotdot3.sh \ + missing_line_ending.sh \ + lognormalizer-invld-call.sh \ + string_rb_simple.sh \ + string_rb_simple_2_lines.sh \ + names.sh \ + literal.sh \ + include.sh \ + include_RULEBASES.sh \ + seq_simple.sh \ + runaway_rule.sh \ + runaway_rule_comment.sh \ + annotate.sh \ + alternative_simple.sh \ + alternative_three.sh \ + alternative_nested.sh \ + alternative_segfault.sh \ + repeat_very_simple.sh \ + repeat_simple.sh \ + repeat_mismatch_in_while.sh \ + repeat_while_alternative.sh \ + repeat_alternative_nested.sh \ + parser_prios.sh \ + parser_whitespace.sh \ + parser_whitespace_jsoncnf.sh \ + parser_LF.sh \ + parser_LF_jsoncnf.sh \ + strict_prefix_actual_sample1.sh \ + strict_prefix_matching_1.sh \ + strict_prefix_matching_2.sh \ + field_string.sh \ + field_string_perm_chars.sh \ + field_number.sh \ + field_number-fmt_number.sh \ + field_number_maxval.sh \ + field_hexnumber.sh \ + field_hexnumber-fmt_number.sh \ + field_hexnumber_jsoncnf.sh \ + field_hexnumber_range.sh \ + field_hexnumber_range_jsoncnf.sh \ + rule_last_str_short.sh \ + field_mac48.sh \ + field_mac48_jsoncnf.sh \ + field_name_value.sh \ + field_name_value_jsoncnf.sh \ + field_kernel_timestamp.sh \ + field_kernel_timestamp_jsoncnf.sh \ + field_whitespace.sh \ + rule_last_str_long.sh \ + field_whitespace_jsoncnf.sh \ + field_rest.sh \ + field_rest_jsoncnf.sh \ + field_json.sh \ + field_json_jsoncnf.sh \ + field_cee-syslog.sh \ + field_cee-syslog_jsoncnf.sh \ + field_ipv6.sh \ + field_ipv6_jsoncnf.sh \ + field_v2-iptables.sh \ + field_v2-iptables_jsoncnf.sh \ + field_cef.sh \ + field_cef_jsoncnf.sh \ + field_checkpoint-lea.sh \ + field_checkpoint-lea_jsoncnf.sh \ + field_duration.sh \ + field_duration_jsoncnf.sh \ + field_float.sh \ + field_float-fmt_number.sh \ + field_float_jsoncnf.sh \ + field_rfc5424timestamp-fmt_timestamp-unix.sh \ + field_rfc5424timestamp-fmt_timestamp-unix-ms.sh \ + very_long_logline_jsoncnf.sh + +# now come tests for the legacy (v1) engine +TESTS_SHELLSCRIPTS += \ + missing_line_ending_v1.sh \ + runaway_rule_v1.sh \ + runaway_rule_comment_v1.sh \ + field_hexnumber_v1.sh \ + field_mac48_v1.sh \ + field_name_value_v1.sh \ + field_kernel_timestamp_v1.sh \ + field_whitespace_v1.sh \ + field_rest_v1.sh \ + field_json_v1.sh \ + field_cee-syslog_v1.sh \ + field_ipv6_v1.sh \ + field_v2-iptables_v1.sh \ + field_cef_v1.sh \ + field_checkpoint-lea_v1.sh \ + field_duration_v1.sh \ + field_float_v1.sh \ + field_cee-syslog_v1.sh \ + field_tokenized.sh \ + field_tokenized_with_invalid_ruledef.sh \ + field_recursive.sh \ + field_tokenized_recursive.sh \ + field_interpret.sh \ + field_interpret_with_invalid_ruledef.sh \ + field_descent.sh \ + field_descent_with_invalid_ruledef.sh \ + field_suffixed.sh \ + field_suffixed_with_invalid_ruledef.sh \ + field_cisco-interface-spec.sh \ + field_float_with_invalid_ruledef.sh \ + very_long_logline.sh + + +#re-add to TESTS if needed: user_test +TESTS = \ + 
$(TESTS_SHELLSCRIPTS) + +REGEXP_TESTS = \ + field_regex_default_group_parse_and_return.sh \ + field_regex_invalid_args.sh \ + field_regex_with_consume_group.sh \ + field_regex_with_consume_group_and_return_group.sh \ + field_regex_with_negation.sh \ + field_tokenized_with_regex.sh \ + field_regex_while_regex_support_is_disabled.sh + +EXTRA_DIST = exec.sh \ + $(TESTS_SHELLSCRIPTS) \ + $(REGEXP_TESTS) \ + $(json_eq_self_sources) \ + $(user_test_SOURCES) + +if ENABLE_REGEXP +TESTS += $(REGEXP_TESTS) +endif diff --git a/tests/Makefile.in b/tests/Makefile.in new file mode 100644 index 0000000..450fc56 --- /dev/null +++ b/tests/Makefile.in @@ -0,0 +1,1891 @@ +# Makefile.in generated by automake 1.15 from Makefile.am. +# @configure_input@ + +# Copyright (C) 1994-2014 Free Software Foundation, Inc. + +# This Makefile.in is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY, to the extent permitted by law; without +# even the implied warranty of MERCHANTABILITY or FITNESS FOR A +# PARTICULAR PURPOSE. + +@SET_MAKE@ +VPATH = @srcdir@ +am__is_gnu_make = { \ + if test -z '$(MAKELEVEL)'; then \ + false; \ + elif test -n '$(MAKE_HOST)'; then \ + true; \ + elif test -n '$(MAKE_VERSION)' && test -n '$(CURDIR)'; then \ + true; \ + else \ + false; \ + fi; \ +} +am__make_running_with_option = \ + case $${target_option-} in \ + ?) ;; \ + *) echo "am__make_running_with_option: internal error: invalid" \ + "target option '$${target_option-}' specified" >&2; \ + exit 1;; \ + esac; \ + has_opt=no; \ + sane_makeflags=$$MAKEFLAGS; \ + if $(am__is_gnu_make); then \ + sane_makeflags=$$MFLAGS; \ + else \ + case $$MAKEFLAGS in \ + *\\[\ \ ]*) \ + bs=\\; \ + sane_makeflags=`printf '%s\n' "$$MAKEFLAGS" \ + | sed "s/$$bs$$bs[$$bs $$bs ]*//g"`;; \ + esac; \ + fi; \ + skip_next=no; \ + strip_trailopt () \ + { \ + flg=`printf '%s\n' "$$flg" | sed "s/$$1.*$$//"`; \ + }; \ + for flg in $$sane_makeflags; do \ + test $$skip_next = yes && { skip_next=no; continue; }; \ + case $$flg in \ + *=*|--*) continue;; \ + -*I) strip_trailopt 'I'; skip_next=yes;; \ + -*I?*) strip_trailopt 'I';; \ + -*O) strip_trailopt 'O'; skip_next=yes;; \ + -*O?*) strip_trailopt 'O';; \ + -*l) strip_trailopt 'l'; skip_next=yes;; \ + -*l?*) strip_trailopt 'l';; \ + -[dEDm]) skip_next=yes;; \ + -[JT]) skip_next=yes;; \ + esac; \ + case $$flg in \ + *$$target_option*) has_opt=yes; break;; \ + esac; \ + done; \ + test $$has_opt = yes +am__make_dryrun = (target_option=n; $(am__make_running_with_option)) +am__make_keepgoing = (target_option=k; $(am__make_running_with_option)) +pkgdatadir = $(datadir)/@PACKAGE@ +pkgincludedir = $(includedir)/@PACKAGE@ +pkglibdir = $(libdir)/@PACKAGE@ +pkglibexecdir = $(libexecdir)/@PACKAGE@ +am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd +install_sh_DATA = $(install_sh) -c -m 644 +install_sh_PROGRAM = $(install_sh) -c +install_sh_SCRIPT = $(install_sh) -c +INSTALL_HEADER = $(INSTALL_DATA) +transform = $(program_transform_name) +NORMAL_INSTALL = : +PRE_INSTALL = : +POST_INSTALL = : +NORMAL_UNINSTALL = : +PRE_UNINSTALL = : +POST_UNINSTALL = : +build_triplet = @build@ +host_triplet = @host@ +check_PROGRAMS = json_eq$(EXEEXT) +@ENABLE_REGEXP_TRUE@am__append_1 = $(REGEXP_TESTS) +subdir = tests +ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 +am__aclocal_m4_deps = $(top_srcdir)/m4/libtool.m4 \ + 
$(top_srcdir)/m4/ltoptions.m4 $(top_srcdir)/m4/ltsugar.m4 \ + $(top_srcdir)/m4/ltversion.m4 $(top_srcdir)/m4/lt~obsolete.m4 \ + $(top_srcdir)/m4/pkg.m4 $(top_srcdir)/configure.ac +am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \ + $(ACLOCAL_M4) +DIST_COMMON = $(srcdir)/Makefile.am $(am__DIST_COMMON) +mkinstalldirs = $(install_sh) -d +CONFIG_HEADER = $(top_builddir)/config.h +CONFIG_CLEAN_FILES = options.sh +CONFIG_CLEAN_VPATH_FILES = +am__objects_1 = json_eq-json_eq.$(OBJEXT) +am_json_eq_OBJECTS = $(am__objects_1) +json_eq_OBJECTS = $(am_json_eq_OBJECTS) +am__DEPENDENCIES_1 = +json_eq_DEPENDENCIES = $(am__DEPENDENCIES_1) +AM_V_lt = $(am__v_lt_@AM_V@) +am__v_lt_ = $(am__v_lt_@AM_DEFAULT_V@) +am__v_lt_0 = --silent +am__v_lt_1 = +json_eq_LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \ + $(json_eq_LDFLAGS) $(LDFLAGS) -o $@ +AM_V_P = $(am__v_P_@AM_V@) +am__v_P_ = $(am__v_P_@AM_DEFAULT_V@) +am__v_P_0 = false +am__v_P_1 = : +AM_V_GEN = $(am__v_GEN_@AM_V@) +am__v_GEN_ = $(am__v_GEN_@AM_DEFAULT_V@) +am__v_GEN_0 = @echo " GEN " $@; +am__v_GEN_1 = +AM_V_at = $(am__v_at_@AM_V@) +am__v_at_ = $(am__v_at_@AM_DEFAULT_V@) +am__v_at_0 = @ +am__v_at_1 = +DEFAULT_INCLUDES = -I.@am__isrc@ -I$(top_builddir) +depcomp = $(SHELL) $(top_srcdir)/depcomp +am__depfiles_maybe = depfiles +am__mv = mv -f +COMPILE = $(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) \ + $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) +LTCOMPILE = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=compile $(CC) $(DEFS) \ + $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) \ + $(AM_CFLAGS) $(CFLAGS) +AM_V_CC = $(am__v_CC_@AM_V@) +am__v_CC_ = $(am__v_CC_@AM_DEFAULT_V@) +am__v_CC_0 = @echo " CC " $@; +am__v_CC_1 = +CCLD = $(CC) +LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \ + $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \ + $(AM_LDFLAGS) $(LDFLAGS) -o $@ +AM_V_CCLD = $(am__v_CCLD_@AM_V@) +am__v_CCLD_ = $(am__v_CCLD_@AM_DEFAULT_V@) +am__v_CCLD_0 = @echo " CCLD " $@; +am__v_CCLD_1 = +SOURCES = $(json_eq_SOURCES) +DIST_SOURCES = $(json_eq_SOURCES) +am__can_run_installinfo = \ + case $$AM_UPDATE_INFO_DIR in \ + n|no|NO) false;; \ + *) (install-info --version) >/dev/null 2>&1;; \ + esac +am__tagged_files = $(HEADERS) $(SOURCES) $(TAGS_FILES) $(LISP) +# Read a list of newline-separated strings from the standard input, +# and print each of them once, without duplicates. Input order is +# *not* preserved. +am__uniquify_input = $(AWK) '\ + BEGIN { nonempty = 0; } \ + { items[$$0] = 1; nonempty = 1; } \ + END { if (nonempty) { for (i in items) print i; }; } \ +' +# Make sure the list of sources is unique. This is necessary because, +# e.g., the same source file might be shared among _SOURCES variables +# for different programs/libraries. 
+am__define_uniq_tagged_files = \ + list='$(am__tagged_files)'; \ + unique=`for i in $$list; do \ + if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \ + done | $(am__uniquify_input)` +ETAGS = etags +CTAGS = ctags +am__tty_colors_dummy = \ + mgn= red= grn= lgn= blu= brg= std=; \ + am__color_tests=no +am__tty_colors = { \ + $(am__tty_colors_dummy); \ + if test "X$(AM_COLOR_TESTS)" = Xno; then \ + am__color_tests=no; \ + elif test "X$(AM_COLOR_TESTS)" = Xalways; then \ + am__color_tests=yes; \ + elif test "X$$TERM" != Xdumb && { test -t 1; } 2>/dev/null; then \ + am__color_tests=yes; \ + fi; \ + if test $$am__color_tests = yes; then \ + red=''; \ + grn=''; \ + lgn=''; \ + blu=''; \ + mgn=''; \ + brg=''; \ + std=''; \ + fi; \ +} +am__vpath_adj_setup = srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`; +am__vpath_adj = case $$p in \ + $(srcdir)/*) f=`echo "$$p" | sed "s|^$$srcdirstrip/||"`;; \ + *) f=$$p;; \ + esac; +am__strip_dir = f=`echo $$p | sed -e 's|^.*/||'`; +am__install_max = 40 +am__nobase_strip_setup = \ + srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*|]/\\\\&/g'` +am__nobase_strip = \ + for p in $$list; do echo "$$p"; done | sed -e "s|$$srcdirstrip/||" +am__nobase_list = $(am__nobase_strip_setup); \ + for p in $$list; do echo "$$p $$p"; done | \ + sed "s| $$srcdirstrip/| |;"' / .*\//!s/ .*/ ./; s,\( .*\)/[^/]*$$,\1,' | \ + $(AWK) 'BEGIN { files["."] = "" } { files[$$2] = files[$$2] " " $$1; \ + if (++n[$$2] == $(am__install_max)) \ + { print $$2, files[$$2]; n[$$2] = 0; files[$$2] = "" } } \ + END { for (dir in files) print dir, files[dir] }' +am__base_list = \ + sed '$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;s/\n/ /g' | \ + sed '$$!N;$$!N;$$!N;$$!N;s/\n/ /g' +am__uninstall_files_from_dir = { \ + test -z "$$files" \ + || { test ! -d "$$dir" && test ! -f "$$dir" && test ! -r "$$dir"; } \ + || { echo " ( cd '$$dir' && rm -f" $$files ")"; \ + $(am__cd) "$$dir" && rm -f $$files; }; \ + } +am__recheck_rx = ^[ ]*:recheck:[ ]* +am__global_test_result_rx = ^[ ]*:global-test-result:[ ]* +am__copy_in_global_log_rx = ^[ ]*:copy-in-global-log:[ ]* +# A command that, given a newline-separated list of test names on the +# standard input, print the name of the tests that are to be re-run +# upon "make recheck". +am__list_recheck_tests = $(AWK) '{ \ + recheck = 1; \ + while ((rc = (getline line < ($$0 ".trs"))) != 0) \ + { \ + if (rc < 0) \ + { \ + if ((getline line2 < ($$0 ".log")) < 0) \ + recheck = 0; \ + break; \ + } \ + else if (line ~ /$(am__recheck_rx)[nN][Oo]/) \ + { \ + recheck = 0; \ + break; \ + } \ + else if (line ~ /$(am__recheck_rx)[yY][eE][sS]/) \ + { \ + break; \ + } \ + }; \ + if (recheck) \ + print $$0; \ + close ($$0 ".trs"); \ + close ($$0 ".log"); \ +}' +# A command that, given a newline-separated list of test names on the +# standard input, create the global log from their .trs and .log files. 
+am__create_global_log = $(AWK) ' \ +function fatal(msg) \ +{ \ + print "fatal: making $@: " msg | "cat >&2"; \ + exit 1; \ +} \ +function rst_section(header) \ +{ \ + print header; \ + len = length(header); \ + for (i = 1; i <= len; i = i + 1) \ + printf "="; \ + printf "\n\n"; \ +} \ +{ \ + copy_in_global_log = 1; \ + global_test_result = "RUN"; \ + while ((rc = (getline line < ($$0 ".trs"))) != 0) \ + { \ + if (rc < 0) \ + fatal("failed to read from " $$0 ".trs"); \ + if (line ~ /$(am__global_test_result_rx)/) \ + { \ + sub("$(am__global_test_result_rx)", "", line); \ + sub("[ ]*$$", "", line); \ + global_test_result = line; \ + } \ + else if (line ~ /$(am__copy_in_global_log_rx)[nN][oO]/) \ + copy_in_global_log = 0; \ + }; \ + if (copy_in_global_log) \ + { \ + rst_section(global_test_result ": " $$0); \ + while ((rc = (getline line < ($$0 ".log"))) != 0) \ + { \ + if (rc < 0) \ + fatal("failed to read from " $$0 ".log"); \ + print line; \ + }; \ + printf "\n"; \ + }; \ + close ($$0 ".trs"); \ + close ($$0 ".log"); \ +}' +# Restructured Text title. +am__rst_title = { sed 's/.*/ & /;h;s/./=/g;p;x;s/ *$$//;p;g' && echo; } +# Solaris 10 'make', and several other traditional 'make' implementations, +# pass "-e" to $(SHELL), and POSIX 2008 even requires this. Work around it +# by disabling -e (using the XSI extension "set +e") if it's set. +am__sh_e_setup = case $$- in *e*) set +e;; esac +# Default flags passed to test drivers. +am__common_driver_flags = \ + --color-tests "$$am__color_tests" \ + --enable-hard-errors "$$am__enable_hard_errors" \ + --expect-failure "$$am__expect_failure" +# To be inserted before the command running the test. Creates the +# directory for the log if needed. Stores in $dir the directory +# containing $f, in $tst the test, in $log the log. Executes the +# developer- defined test setup AM_TESTS_ENVIRONMENT (if any), and +# passes TESTS_ENVIRONMENT. Set up options for the wrapper that +# will run the test scripts (or their associated LOG_COMPILER, if +# thy have one). +am__check_pre = \ +$(am__sh_e_setup); \ +$(am__vpath_adj_setup) $(am__vpath_adj) \ +$(am__tty_colors); \ +srcdir=$(srcdir); export srcdir; \ +case "$@" in \ + */*) am__odir=`echo "./$@" | sed 's|/[^/]*$$||'`;; \ + *) am__odir=.;; \ +esac; \ +test "x$$am__odir" = x"." || test -d "$$am__odir" \ + || $(MKDIR_P) "$$am__odir" || exit $$?; \ +if test -f "./$$f"; then dir=./; \ +elif test -f "$$f"; then dir=; \ +else dir="$(srcdir)/"; fi; \ +tst=$$dir$$f; log='$@'; \ +if test -n '$(DISABLE_HARD_ERRORS)'; then \ + am__enable_hard_errors=no; \ +else \ + am__enable_hard_errors=yes; \ +fi; \ +case " $(XFAIL_TESTS) " in \ + *[\ \ ]$$f[\ \ ]* | *[\ \ ]$$dir$$f[\ \ ]*) \ + am__expect_failure=yes;; \ + *) \ + am__expect_failure=no;; \ +esac; \ +$(AM_TESTS_ENVIRONMENT) $(TESTS_ENVIRONMENT) +# A shell command to get the names of the tests scripts with any registered +# extension removed (i.e., equivalently, the names of the test logs, with +# the '.log' extension removed). The result is saved in the shell variable +# '$bases'. This honors runtime overriding of TESTS and TEST_LOGS. Sadly, +# we cannot use something simpler, involving e.g., "$(TEST_LOGS:.log=)", +# since that might cause problem with VPATH rewrites for suffix-less tests. +# See also 'test-harness-vpath-rewrite.sh' and 'test-trs-basic.sh'. 
+am__set_TESTS_bases = \ + bases='$(TEST_LOGS)'; \ + bases=`for i in $$bases; do echo $$i; done | sed 's/\.log$$//'`; \ + bases=`echo $$bases` +RECHECK_LOGS = $(TEST_LOGS) +AM_RECURSIVE_TARGETS = check recheck +TEST_SUITE_LOG = test-suite.log +TEST_EXTENSIONS = @EXEEXT@ .test +LOG_DRIVER = $(SHELL) $(top_srcdir)/test-driver +LOG_COMPILE = $(LOG_COMPILER) $(AM_LOG_FLAGS) $(LOG_FLAGS) +am__set_b = \ + case '$@' in \ + */*) \ + case '$*' in \ + */*) b='$*';; \ + *) b=`echo '$@' | sed 's/\.log$$//'`; \ + esac;; \ + *) \ + b='$*';; \ + esac +am__test_logs1 = $(TESTS:=.log) +am__test_logs2 = $(am__test_logs1:@EXEEXT@.log=.log) +TEST_LOGS = $(am__test_logs2:.test.log=.log) +TEST_LOG_DRIVER = $(SHELL) $(top_srcdir)/test-driver +TEST_LOG_COMPILE = $(TEST_LOG_COMPILER) $(AM_TEST_LOG_FLAGS) \ + $(TEST_LOG_FLAGS) +am__DIST_COMMON = $(srcdir)/Makefile.in $(srcdir)/options.sh.in \ + $(top_srcdir)/depcomp $(top_srcdir)/test-driver +DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST) +ACLOCAL = @ACLOCAL@ +AMTAR = @AMTAR@ +AM_DEFAULT_VERBOSITY = @AM_DEFAULT_VERBOSITY@ +AR = @AR@ +AUTOCONF = @AUTOCONF@ +AUTOHEADER = @AUTOHEADER@ +AUTOMAKE = @AUTOMAKE@ +AWK = @AWK@ +CC = @CC@ +CCDEPMODE = @CCDEPMODE@ +CFLAGS = @CFLAGS@ +CPP = @CPP@ +CPPFLAGS = @CPPFLAGS@ +CYGPATH_W = @CYGPATH_W@ +DEFS = @DEFS@ +DEPDIR = @DEPDIR@ +DLLTOOL = @DLLTOOL@ +DSYMUTIL = @DSYMUTIL@ +DUMPBIN = @DUMPBIN@ +ECHO_C = @ECHO_C@ +ECHO_N = @ECHO_N@ +ECHO_T = @ECHO_T@ +EGREP = @EGREP@ +EXEEXT = @EXEEXT@ +FEATURE_REGEXP = @FEATURE_REGEXP@ +FGREP = @FGREP@ +GREP = @GREP@ +INSTALL = @INSTALL@ +INSTALL_DATA = @INSTALL_DATA@ +INSTALL_PROGRAM = @INSTALL_PROGRAM@ +INSTALL_SCRIPT = @INSTALL_SCRIPT@ +INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@ +JSON_C_CFLAGS = @JSON_C_CFLAGS@ +JSON_C_LIBS = @JSON_C_LIBS@ +LD = @LD@ +LDFLAGS = @LDFLAGS@ +LIBESTR_CFLAGS = @LIBESTR_CFLAGS@ +LIBESTR_LIBS = @LIBESTR_LIBS@ +LIBLOGNORM_CFLAGS = @LIBLOGNORM_CFLAGS@ +LIBLOGNORM_LIBS = @LIBLOGNORM_LIBS@ +LIBOBJS = @LIBOBJS@ +LIBS = @LIBS@ +LIBTOOL = @LIBTOOL@ +LIPO = @LIPO@ +LN_S = @LN_S@ +LTLIBOBJS = @LTLIBOBJS@ +LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAKEINFO = @MAKEINFO@ +MANIFEST_TOOL = @MANIFEST_TOOL@ +MKDIR_P = @MKDIR_P@ +NM = @NM@ +NMEDIT = @NMEDIT@ +OBJDUMP = @OBJDUMP@ +OBJEXT = @OBJEXT@ +OTOOL = @OTOOL@ +OTOOL64 = @OTOOL64@ +PACKAGE = @PACKAGE@ +PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@ +PACKAGE_NAME = @PACKAGE_NAME@ +PACKAGE_STRING = @PACKAGE_STRING@ +PACKAGE_TARNAME = @PACKAGE_TARNAME@ +PACKAGE_URL = @PACKAGE_URL@ +PACKAGE_VERSION = @PACKAGE_VERSION@ +PATH_SEPARATOR = @PATH_SEPARATOR@ +PCRE_CFLAGS = @PCRE_CFLAGS@ +PCRE_LIBS = @PCRE_LIBS@ +PKG_CONFIG = @PKG_CONFIG@ +PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@ +PKG_CONFIG_PATH = @PKG_CONFIG_PATH@ +RANLIB = @RANLIB@ +SED = @SED@ +SET_MAKE = @SET_MAKE@ +SHELL = @SHELL@ +SPHINXBUILD = @SPHINXBUILD@ +STRIP = @STRIP@ +VALGRIND = @VALGRIND@ +VERSION = @VERSION@ +WARN_CFLAGS = @WARN_CFLAGS@ +WARN_LDFLAGS = @WARN_LDFLAGS@ +WARN_SCANNERFLAGS = @WARN_SCANNERFLAGS@ +abs_builddir = @abs_builddir@ +abs_srcdir = @abs_srcdir@ +abs_top_builddir = @abs_top_builddir@ +abs_top_srcdir = @abs_top_srcdir@ +ac_ct_AR = @ac_ct_AR@ +ac_ct_CC = @ac_ct_CC@ +ac_ct_DUMPBIN = @ac_ct_DUMPBIN@ +am__include = @am__include@ +am__leading_dot = @am__leading_dot@ +am__quote = @am__quote@ +am__tar = @am__tar@ +am__untar = @am__untar@ +bindir = @bindir@ +build = @build@ +build_alias = @build_alias@ +build_cpu = @build_cpu@ +build_os = @build_os@ +build_vendor = @build_vendor@ +builddir = @builddir@ +datadir = @datadir@ +datarootdir = 
@datarootdir@ +docdir = @docdir@ +dvidir = @dvidir@ +exec_prefix = @exec_prefix@ +host = @host@ +host_alias = @host_alias@ +host_cpu = @host_cpu@ +host_os = @host_os@ +host_vendor = @host_vendor@ +htmldir = @htmldir@ +includedir = @includedir@ +infodir = @infodir@ +install_sh = @install_sh@ +libdir = @libdir@ +libexecdir = @libexecdir@ +localedir = @localedir@ +localstatedir = @localstatedir@ +mandir = @mandir@ +mkdir_p = @mkdir_p@ +oldincludedir = @oldincludedir@ +pdfdir = @pdfdir@ +pkg_config_libs_private = @pkg_config_libs_private@ +prefix = @prefix@ +program_transform_name = @program_transform_name@ +psdir = @psdir@ +runstatedir = @runstatedir@ +sbindir = @sbindir@ +sharedstatedir = @sharedstatedir@ +srcdir = @srcdir@ +sysconfdir = @sysconfdir@ +target_alias = @target_alias@ +top_build_prefix = @top_build_prefix@ +top_builddir = @top_builddir@ +top_srcdir = @top_srcdir@ +# re-enable if we really need the c program check +check_PROGRAMS = json_eq user_test +json_eq_self_sources = json_eq.c +json_eq_SOURCES = $(json_eq_self_sources) +json_eq_CPPFLAGS = $(JSON_C_CFLAGS) $(WARN_CFLAGS) -I$(top_srcdir)/src +json_eq_LDADD = $(JSON_C_LIBS) -lm -lestr +json_eq_LDFLAGS = -no-install + +#user_test_SOURCES = user_test.c +#user_test_CPPFLAGS = $(LIBLOGNORM_CFLAGS) $(JSON_C_CFLAGS) $(LIBESTR_CFLAGS) +#user_test_LDADD = $(JSON_C_LIBS) $(LIBLOGNORM_LIBS) $(LIBESTR_LIBS) ../compat/compat.la +#user_test_LDFLAGS = -no-install + +# The following tests are for the new pdag-based engine (v2+). +# +# There are some notes due: +# +# removed field_float_with_invalid_ruledef.sh because test is not valid. +# more info: https://github.com/rsyslog/liblognorm/issues/98 +# note that probably the other currently disable *_invalid_*.sh +# tests are also affected. +# +# there seems to be a problem with some format in cisco-interface-spec +# Probably this was just not seen in v1, because of some impreciseness +# in the ptree normalizer. Pushing equivalent v2 test back until v2 +# implementation is further developed. 
+ +# now come tests for the legacy (v1) engine +TESTS_SHELLSCRIPTS = usrdef_simple.sh usrdef_two.sh usrdef_twotypes.sh \ + usrdef_actual1.sh usrdef_ipaddr.sh usrdef_ipaddr_dotdot.sh \ + usrdef_ipaddr_dotdot2.sh usrdef_ipaddr_dotdot3.sh \ + missing_line_ending.sh lognormalizer-invld-call.sh \ + string_rb_simple.sh string_rb_simple_2_lines.sh names.sh \ + literal.sh include.sh include_RULEBASES.sh seq_simple.sh \ + runaway_rule.sh runaway_rule_comment.sh annotate.sh \ + alternative_simple.sh alternative_three.sh \ + alternative_nested.sh alternative_segfault.sh \ + repeat_very_simple.sh repeat_simple.sh \ + repeat_mismatch_in_while.sh repeat_while_alternative.sh \ + repeat_alternative_nested.sh parser_prios.sh \ + parser_whitespace.sh parser_whitespace_jsoncnf.sh parser_LF.sh \ + parser_LF_jsoncnf.sh strict_prefix_actual_sample1.sh \ + strict_prefix_matching_1.sh strict_prefix_matching_2.sh \ + field_string.sh field_string_perm_chars.sh field_number.sh \ + field_number-fmt_number.sh field_number_maxval.sh \ + field_hexnumber.sh field_hexnumber-fmt_number.sh \ + field_hexnumber_jsoncnf.sh field_hexnumber_range.sh \ + field_hexnumber_range_jsoncnf.sh rule_last_str_short.sh \ + field_mac48.sh field_mac48_jsoncnf.sh field_name_value.sh \ + field_name_value_jsoncnf.sh field_kernel_timestamp.sh \ + field_kernel_timestamp_jsoncnf.sh field_whitespace.sh \ + rule_last_str_long.sh field_whitespace_jsoncnf.sh \ + field_rest.sh field_rest_jsoncnf.sh field_json.sh \ + field_json_jsoncnf.sh field_cee-syslog.sh \ + field_cee-syslog_jsoncnf.sh field_ipv6.sh \ + field_ipv6_jsoncnf.sh field_v2-iptables.sh \ + field_v2-iptables_jsoncnf.sh field_cef.sh field_cef_jsoncnf.sh \ + field_checkpoint-lea.sh field_checkpoint-lea_jsoncnf.sh \ + field_duration.sh field_duration_jsoncnf.sh field_float.sh \ + field_float-fmt_number.sh field_float_jsoncnf.sh \ + field_rfc5424timestamp-fmt_timestamp-unix.sh \ + field_rfc5424timestamp-fmt_timestamp-unix-ms.sh \ + very_long_logline_jsoncnf.sh missing_line_ending_v1.sh \ + runaway_rule_v1.sh runaway_rule_comment_v1.sh \ + field_hexnumber_v1.sh field_mac48_v1.sh field_name_value_v1.sh \ + field_kernel_timestamp_v1.sh field_whitespace_v1.sh \ + field_rest_v1.sh field_json_v1.sh field_cee-syslog_v1.sh \ + field_ipv6_v1.sh field_v2-iptables_v1.sh field_cef_v1.sh \ + field_checkpoint-lea_v1.sh field_duration_v1.sh \ + field_float_v1.sh field_cee-syslog_v1.sh field_tokenized.sh \ + field_tokenized_with_invalid_ruledef.sh field_recursive.sh \ + field_tokenized_recursive.sh field_interpret.sh \ + field_interpret_with_invalid_ruledef.sh field_descent.sh \ + field_descent_with_invalid_ruledef.sh field_suffixed.sh \ + field_suffixed_with_invalid_ruledef.sh \ + field_cisco-interface-spec.sh \ + field_float_with_invalid_ruledef.sh very_long_logline.sh + +#re-add to TESTS if needed: user_test +TESTS = $(TESTS_SHELLSCRIPTS) $(am__append_1) +REGEXP_TESTS = \ + field_regex_default_group_parse_and_return.sh \ + field_regex_invalid_args.sh \ + field_regex_with_consume_group.sh \ + field_regex_with_consume_group_and_return_group.sh \ + field_regex_with_negation.sh \ + field_tokenized_with_regex.sh \ + field_regex_while_regex_support_is_disabled.sh + +EXTRA_DIST = exec.sh \ + $(TESTS_SHELLSCRIPTS) \ + $(REGEXP_TESTS) \ + $(json_eq_self_sources) \ + $(user_test_SOURCES) + +all: all-am + +.SUFFIXES: +.SUFFIXES: .c .lo .log .o .obj .test .test$(EXEEXT) .trs +$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) + @for dep in $?; do \ + case '$(am__configure_deps)' in \ + *$$dep*) \ + ( cd 
$(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh ) \ + && { if test -f $@; then exit 0; else break; fi; }; \ + exit 1;; \ + esac; \ + done; \ + echo ' cd $(top_srcdir) && $(AUTOMAKE) --gnu tests/Makefile'; \ + $(am__cd) $(top_srcdir) && \ + $(AUTOMAKE) --gnu tests/Makefile +Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status + @case '$?' in \ + *config.status*) \ + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \ + *) \ + echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \ + cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \ + esac; + +$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh + +$(top_srcdir)/configure: $(am__configure_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(ACLOCAL_M4): $(am__aclocal_m4_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(am__aclocal_m4_deps): +options.sh: $(top_builddir)/config.status $(srcdir)/options.sh.in + cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ + +clean-checkPROGRAMS: + @list='$(check_PROGRAMS)'; test -n "$$list" || exit 0; \ + echo " rm -f" $$list; \ + rm -f $$list || exit $$?; \ + test -n "$(EXEEXT)" || exit 0; \ + list=`for p in $$list; do echo "$$p"; done | sed 's/$(EXEEXT)$$//'`; \ + echo " rm -f" $$list; \ + rm -f $$list + +json_eq$(EXEEXT): $(json_eq_OBJECTS) $(json_eq_DEPENDENCIES) $(EXTRA_json_eq_DEPENDENCIES) + @rm -f json_eq$(EXEEXT) + $(AM_V_CCLD)$(json_eq_LINK) $(json_eq_OBJECTS) $(json_eq_LDADD) $(LIBS) + +mostlyclean-compile: + -rm -f *.$(OBJEXT) + +distclean-compile: + -rm -f *.tab.c + +@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/json_eq-json_eq.Po@am__quote@ + +.c.o: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(COMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $< +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(COMPILE) -c -o $@ $< + +.c.obj: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(COMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ `$(CYGPATH_W) '$<'` +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(COMPILE) -c -o $@ `$(CYGPATH_W) '$<'` + +.c.lo: +@am__fastdepCC_TRUE@ $(AM_V_CC)$(LTCOMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $< +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Plo +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='$<' object='$@' libtool=yes @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(LTCOMPILE) -c -o $@ $< + +json_eq-json_eq.o: json_eq.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(json_eq_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT json_eq-json_eq.o -MD -MP -MF $(DEPDIR)/json_eq-json_eq.Tpo -c -o json_eq-json_eq.o `test -f 'json_eq.c' || echo '$(srcdir)/'`json_eq.c +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/json_eq-json_eq.Tpo $(DEPDIR)/json_eq-json_eq.Po 
+@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='json_eq.c' object='json_eq-json_eq.o' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(json_eq_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o json_eq-json_eq.o `test -f 'json_eq.c' || echo '$(srcdir)/'`json_eq.c + +json_eq-json_eq.obj: json_eq.c +@am__fastdepCC_TRUE@ $(AM_V_CC)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(json_eq_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -MT json_eq-json_eq.obj -MD -MP -MF $(DEPDIR)/json_eq-json_eq.Tpo -c -o json_eq-json_eq.obj `if test -f 'json_eq.c'; then $(CYGPATH_W) 'json_eq.c'; else $(CYGPATH_W) '$(srcdir)/json_eq.c'; fi` +@am__fastdepCC_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/json_eq-json_eq.Tpo $(DEPDIR)/json_eq-json_eq.Po +@AMDEP_TRUE@@am__fastdepCC_FALSE@ $(AM_V_CC)source='json_eq.c' object='json_eq-json_eq.obj' libtool=no @AMDEPBACKSLASH@ +@AMDEP_TRUE@@am__fastdepCC_FALSE@ DEPDIR=$(DEPDIR) $(CCDEPMODE) $(depcomp) @AMDEPBACKSLASH@ +@am__fastdepCC_FALSE@ $(AM_V_CC@am__nodep@)$(CC) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(json_eq_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS) -c -o json_eq-json_eq.obj `if test -f 'json_eq.c'; then $(CYGPATH_W) 'json_eq.c'; else $(CYGPATH_W) '$(srcdir)/json_eq.c'; fi` + +mostlyclean-libtool: + -rm -f *.lo + +clean-libtool: + -rm -rf .libs _libs + +ID: $(am__tagged_files) + $(am__define_uniq_tagged_files); mkid -fID $$unique +tags: tags-am +TAGS: tags + +tags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + set x; \ + here=`pwd`; \ + $(am__define_uniq_tagged_files); \ + shift; \ + if test -z "$(ETAGS_ARGS)$$*$$unique"; then :; else \ + test -n "$$unique" || unique=$$empty_fix; \ + if test $$# -gt 0; then \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + "$$@" $$unique; \ + else \ + $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \ + $$unique; \ + fi; \ + fi +ctags: ctags-am + +CTAGS: ctags +ctags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files) + $(am__define_uniq_tagged_files); \ + test -z "$(CTAGS_ARGS)$$unique" \ + || $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \ + $$unique + +GTAGS: + here=`$(am__cd) $(top_builddir) && pwd` \ + && $(am__cd) $(top_srcdir) \ + && gtags -i $(GTAGS_ARGS) "$$here" +cscopelist: cscopelist-am + +cscopelist-am: $(am__tagged_files) + list='$(am__tagged_files)'; \ + case "$(srcdir)" in \ + [\\/]* | ?:[\\/]*) sdir="$(srcdir)" ;; \ + *) sdir=$(subdir)/$(srcdir) ;; \ + esac; \ + for i in $$list; do \ + if test -f "$$i"; then \ + echo "$(subdir)/$$i"; \ + else \ + echo "$$sdir/$$i"; \ + fi; \ + done >> $(top_builddir)/cscope.files + +distclean-tags: + -rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags + +# Recover from deleted '.trs' file; this should ensure that +# "rm -f foo.log; make foo.trs" re-run 'foo.test', and re-create +# both 'foo.log' and 'foo.trs'. Break the recipe in two subshells +# to avoid problems with "make -n". +.log.trs: + rm -f $< $@ + $(MAKE) $(AM_MAKEFLAGS) $< + +# Leading 'am--fnord' is there to ensure the list of targets does not +# expand to empty, as could happen e.g. with make check TESTS=''. 
+am--fnord $(TEST_LOGS) $(TEST_LOGS:.log=.trs): $(am__force_recheck) +am--force-recheck: + @: + +$(TEST_SUITE_LOG): $(TEST_LOGS) + @$(am__set_TESTS_bases); \ + am__f_ok () { test -f "$$1" && test -r "$$1"; }; \ + redo_bases=`for i in $$bases; do \ + am__f_ok $$i.trs && am__f_ok $$i.log || echo $$i; \ + done`; \ + if test -n "$$redo_bases"; then \ + redo_logs=`for i in $$redo_bases; do echo $$i.log; done`; \ + redo_results=`for i in $$redo_bases; do echo $$i.trs; done`; \ + if $(am__make_dryrun); then :; else \ + rm -f $$redo_logs && rm -f $$redo_results || exit 1; \ + fi; \ + fi; \ + if test -n "$$am__remaking_logs"; then \ + echo "fatal: making $(TEST_SUITE_LOG): possible infinite" \ + "recursion detected" >&2; \ + elif test -n "$$redo_logs"; then \ + am__remaking_logs=yes $(MAKE) $(AM_MAKEFLAGS) $$redo_logs; \ + fi; \ + if $(am__make_dryrun); then :; else \ + st=0; \ + errmsg="fatal: making $(TEST_SUITE_LOG): failed to create"; \ + for i in $$redo_bases; do \ + test -f $$i.trs && test -r $$i.trs \ + || { echo "$$errmsg $$i.trs" >&2; st=1; }; \ + test -f $$i.log && test -r $$i.log \ + || { echo "$$errmsg $$i.log" >&2; st=1; }; \ + done; \ + test $$st -eq 0 || exit 1; \ + fi + @$(am__sh_e_setup); $(am__tty_colors); $(am__set_TESTS_bases); \ + ws='[ ]'; \ + results=`for b in $$bases; do echo $$b.trs; done`; \ + test -n "$$results" || results=/dev/null; \ + all=` grep "^$$ws*:test-result:" $$results | wc -l`; \ + pass=` grep "^$$ws*:test-result:$$ws*PASS" $$results | wc -l`; \ + fail=` grep "^$$ws*:test-result:$$ws*FAIL" $$results | wc -l`; \ + skip=` grep "^$$ws*:test-result:$$ws*SKIP" $$results | wc -l`; \ + xfail=`grep "^$$ws*:test-result:$$ws*XFAIL" $$results | wc -l`; \ + xpass=`grep "^$$ws*:test-result:$$ws*XPASS" $$results | wc -l`; \ + error=`grep "^$$ws*:test-result:$$ws*ERROR" $$results | wc -l`; \ + if test `expr $$fail + $$xpass + $$error` -eq 0; then \ + success=true; \ + else \ + success=false; \ + fi; \ + br='==================='; br=$$br$$br$$br$$br; \ + result_count () \ + { \ + if test x"$$1" = x"--maybe-color"; then \ + maybe_colorize=yes; \ + elif test x"$$1" = x"--no-color"; then \ + maybe_colorize=no; \ + else \ + echo "$@: invalid 'result_count' usage" >&2; exit 4; \ + fi; \ + shift; \ + desc=$$1 count=$$2; \ + if test $$maybe_colorize = yes && test $$count -gt 0; then \ + color_start=$$3 color_end=$$std; \ + else \ + color_start= color_end=; \ + fi; \ + echo "$${color_start}# $$desc $$count$${color_end}"; \ + }; \ + create_testsuite_report () \ + { \ + result_count $$1 "TOTAL:" $$all "$$brg"; \ + result_count $$1 "PASS: " $$pass "$$grn"; \ + result_count $$1 "SKIP: " $$skip "$$blu"; \ + result_count $$1 "XFAIL:" $$xfail "$$lgn"; \ + result_count $$1 "FAIL: " $$fail "$$red"; \ + result_count $$1 "XPASS:" $$xpass "$$red"; \ + result_count $$1 "ERROR:" $$error "$$mgn"; \ + }; \ + { \ + echo "$(PACKAGE_STRING): $(subdir)/$(TEST_SUITE_LOG)" | \ + $(am__rst_title); \ + create_testsuite_report --no-color; \ + echo; \ + echo ".. 
contents:: :depth: 2"; \ + echo; \ + for b in $$bases; do echo $$b; done \ + | $(am__create_global_log); \ + } >$(TEST_SUITE_LOG).tmp || exit 1; \ + mv $(TEST_SUITE_LOG).tmp $(TEST_SUITE_LOG); \ + if $$success; then \ + col="$$grn"; \ + else \ + col="$$red"; \ + test x"$$VERBOSE" = x || cat $(TEST_SUITE_LOG); \ + fi; \ + echo "$${col}$$br$${std}"; \ + echo "$${col}Testsuite summary for $(PACKAGE_STRING)$${std}"; \ + echo "$${col}$$br$${std}"; \ + create_testsuite_report --maybe-color; \ + echo "$$col$$br$$std"; \ + if $$success; then :; else \ + echo "$${col}See $(subdir)/$(TEST_SUITE_LOG)$${std}"; \ + if test -n "$(PACKAGE_BUGREPORT)"; then \ + echo "$${col}Please report to $(PACKAGE_BUGREPORT)$${std}"; \ + fi; \ + echo "$$col$$br$$std"; \ + fi; \ + $$success || exit 1 + +check-TESTS: + @list='$(RECHECK_LOGS)'; test -z "$$list" || rm -f $$list + @list='$(RECHECK_LOGS:.log=.trs)'; test -z "$$list" || rm -f $$list + @test -z "$(TEST_SUITE_LOG)" || rm -f $(TEST_SUITE_LOG) + @set +e; $(am__set_TESTS_bases); \ + log_list=`for i in $$bases; do echo $$i.log; done`; \ + trs_list=`for i in $$bases; do echo $$i.trs; done`; \ + log_list=`echo $$log_list`; trs_list=`echo $$trs_list`; \ + $(MAKE) $(AM_MAKEFLAGS) $(TEST_SUITE_LOG) TEST_LOGS="$$log_list"; \ + exit $$?; +recheck: all $(check_PROGRAMS) + @test -z "$(TEST_SUITE_LOG)" || rm -f $(TEST_SUITE_LOG) + @set +e; $(am__set_TESTS_bases); \ + bases=`for i in $$bases; do echo $$i; done \ + | $(am__list_recheck_tests)` || exit 1; \ + log_list=`for i in $$bases; do echo $$i.log; done`; \ + log_list=`echo $$log_list`; \ + $(MAKE) $(AM_MAKEFLAGS) $(TEST_SUITE_LOG) \ + am__force_recheck=am--force-recheck \ + TEST_LOGS="$$log_list"; \ + exit $$? +usrdef_simple.sh.log: usrdef_simple.sh + @p='usrdef_simple.sh'; \ + b='usrdef_simple.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +usrdef_two.sh.log: usrdef_two.sh + @p='usrdef_two.sh'; \ + b='usrdef_two.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +usrdef_twotypes.sh.log: usrdef_twotypes.sh + @p='usrdef_twotypes.sh'; \ + b='usrdef_twotypes.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +usrdef_actual1.sh.log: usrdef_actual1.sh + @p='usrdef_actual1.sh'; \ + b='usrdef_actual1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +usrdef_ipaddr.sh.log: usrdef_ipaddr.sh + @p='usrdef_ipaddr.sh'; \ + b='usrdef_ipaddr.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +usrdef_ipaddr_dotdot.sh.log: usrdef_ipaddr_dotdot.sh + @p='usrdef_ipaddr_dotdot.sh'; \ + b='usrdef_ipaddr_dotdot.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) 
$(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +usrdef_ipaddr_dotdot2.sh.log: usrdef_ipaddr_dotdot2.sh + @p='usrdef_ipaddr_dotdot2.sh'; \ + b='usrdef_ipaddr_dotdot2.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +usrdef_ipaddr_dotdot3.sh.log: usrdef_ipaddr_dotdot3.sh + @p='usrdef_ipaddr_dotdot3.sh'; \ + b='usrdef_ipaddr_dotdot3.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +missing_line_ending.sh.log: missing_line_ending.sh + @p='missing_line_ending.sh'; \ + b='missing_line_ending.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +lognormalizer-invld-call.sh.log: lognormalizer-invld-call.sh + @p='lognormalizer-invld-call.sh'; \ + b='lognormalizer-invld-call.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +string_rb_simple.sh.log: string_rb_simple.sh + @p='string_rb_simple.sh'; \ + b='string_rb_simple.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +string_rb_simple_2_lines.sh.log: string_rb_simple_2_lines.sh + @p='string_rb_simple_2_lines.sh'; \ + b='string_rb_simple_2_lines.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +names.sh.log: names.sh + @p='names.sh'; \ + b='names.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +literal.sh.log: literal.sh + @p='literal.sh'; \ + b='literal.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +include.sh.log: include.sh + @p='include.sh'; \ + b='include.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +include_RULEBASES.sh.log: include_RULEBASES.sh + @p='include_RULEBASES.sh'; \ + b='include_RULEBASES.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +seq_simple.sh.log: seq_simple.sh + @p='seq_simple.sh'; \ + b='seq_simple.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) 
$(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +runaway_rule.sh.log: runaway_rule.sh + @p='runaway_rule.sh'; \ + b='runaway_rule.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +runaway_rule_comment.sh.log: runaway_rule_comment.sh + @p='runaway_rule_comment.sh'; \ + b='runaway_rule_comment.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +annotate.sh.log: annotate.sh + @p='annotate.sh'; \ + b='annotate.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +alternative_simple.sh.log: alternative_simple.sh + @p='alternative_simple.sh'; \ + b='alternative_simple.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +alternative_three.sh.log: alternative_three.sh + @p='alternative_three.sh'; \ + b='alternative_three.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +alternative_nested.sh.log: alternative_nested.sh + @p='alternative_nested.sh'; \ + b='alternative_nested.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +alternative_segfault.sh.log: alternative_segfault.sh + @p='alternative_segfault.sh'; \ + b='alternative_segfault.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +repeat_very_simple.sh.log: repeat_very_simple.sh + @p='repeat_very_simple.sh'; \ + b='repeat_very_simple.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +repeat_simple.sh.log: repeat_simple.sh + @p='repeat_simple.sh'; \ + b='repeat_simple.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +repeat_mismatch_in_while.sh.log: repeat_mismatch_in_while.sh + @p='repeat_mismatch_in_while.sh'; \ + b='repeat_mismatch_in_while.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +repeat_while_alternative.sh.log: repeat_while_alternative.sh + @p='repeat_while_alternative.sh'; \ + b='repeat_while_alternative.sh'; \ + $(am__check_pre) 
$(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +repeat_alternative_nested.sh.log: repeat_alternative_nested.sh + @p='repeat_alternative_nested.sh'; \ + b='repeat_alternative_nested.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +parser_prios.sh.log: parser_prios.sh + @p='parser_prios.sh'; \ + b='parser_prios.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +parser_whitespace.sh.log: parser_whitespace.sh + @p='parser_whitespace.sh'; \ + b='parser_whitespace.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +parser_whitespace_jsoncnf.sh.log: parser_whitespace_jsoncnf.sh + @p='parser_whitespace_jsoncnf.sh'; \ + b='parser_whitespace_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +parser_LF.sh.log: parser_LF.sh + @p='parser_LF.sh'; \ + b='parser_LF.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +parser_LF_jsoncnf.sh.log: parser_LF_jsoncnf.sh + @p='parser_LF_jsoncnf.sh'; \ + b='parser_LF_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +strict_prefix_actual_sample1.sh.log: strict_prefix_actual_sample1.sh + @p='strict_prefix_actual_sample1.sh'; \ + b='strict_prefix_actual_sample1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +strict_prefix_matching_1.sh.log: strict_prefix_matching_1.sh + @p='strict_prefix_matching_1.sh'; \ + b='strict_prefix_matching_1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +strict_prefix_matching_2.sh.log: strict_prefix_matching_2.sh + @p='strict_prefix_matching_2.sh'; \ + b='strict_prefix_matching_2.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_string.sh.log: field_string.sh + @p='field_string.sh'; \ + b='field_string.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- 
$(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_string_perm_chars.sh.log: field_string_perm_chars.sh + @p='field_string_perm_chars.sh'; \ + b='field_string_perm_chars.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_number.sh.log: field_number.sh + @p='field_number.sh'; \ + b='field_number.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_number-fmt_number.sh.log: field_number-fmt_number.sh + @p='field_number-fmt_number.sh'; \ + b='field_number-fmt_number.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_number_maxval.sh.log: field_number_maxval.sh + @p='field_number_maxval.sh'; \ + b='field_number_maxval.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_hexnumber.sh.log: field_hexnumber.sh + @p='field_hexnumber.sh'; \ + b='field_hexnumber.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_hexnumber-fmt_number.sh.log: field_hexnumber-fmt_number.sh + @p='field_hexnumber-fmt_number.sh'; \ + b='field_hexnumber-fmt_number.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_hexnumber_jsoncnf.sh.log: field_hexnumber_jsoncnf.sh + @p='field_hexnumber_jsoncnf.sh'; \ + b='field_hexnumber_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_hexnumber_range.sh.log: field_hexnumber_range.sh + @p='field_hexnumber_range.sh'; \ + b='field_hexnumber_range.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_hexnumber_range_jsoncnf.sh.log: field_hexnumber_range_jsoncnf.sh + @p='field_hexnumber_range_jsoncnf.sh'; \ + b='field_hexnumber_range_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +rule_last_str_short.sh.log: rule_last_str_short.sh + @p='rule_last_str_short.sh'; \ + b='rule_last_str_short.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_mac48.sh.log: field_mac48.sh + 
@p='field_mac48.sh'; \ + b='field_mac48.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_mac48_jsoncnf.sh.log: field_mac48_jsoncnf.sh + @p='field_mac48_jsoncnf.sh'; \ + b='field_mac48_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_name_value.sh.log: field_name_value.sh + @p='field_name_value.sh'; \ + b='field_name_value.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_name_value_jsoncnf.sh.log: field_name_value_jsoncnf.sh + @p='field_name_value_jsoncnf.sh'; \ + b='field_name_value_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_kernel_timestamp.sh.log: field_kernel_timestamp.sh + @p='field_kernel_timestamp.sh'; \ + b='field_kernel_timestamp.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_kernel_timestamp_jsoncnf.sh.log: field_kernel_timestamp_jsoncnf.sh + @p='field_kernel_timestamp_jsoncnf.sh'; \ + b='field_kernel_timestamp_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_whitespace.sh.log: field_whitespace.sh + @p='field_whitespace.sh'; \ + b='field_whitespace.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +rule_last_str_long.sh.log: rule_last_str_long.sh + @p='rule_last_str_long.sh'; \ + b='rule_last_str_long.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_whitespace_jsoncnf.sh.log: field_whitespace_jsoncnf.sh + @p='field_whitespace_jsoncnf.sh'; \ + b='field_whitespace_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_rest.sh.log: field_rest.sh + @p='field_rest.sh'; \ + b='field_rest.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_rest_jsoncnf.sh.log: field_rest_jsoncnf.sh + @p='field_rest_jsoncnf.sh'; \ + b='field_rest_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + 
$(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_json.sh.log: field_json.sh + @p='field_json.sh'; \ + b='field_json.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_json_jsoncnf.sh.log: field_json_jsoncnf.sh + @p='field_json_jsoncnf.sh'; \ + b='field_json_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_cee-syslog.sh.log: field_cee-syslog.sh + @p='field_cee-syslog.sh'; \ + b='field_cee-syslog.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_cee-syslog_jsoncnf.sh.log: field_cee-syslog_jsoncnf.sh + @p='field_cee-syslog_jsoncnf.sh'; \ + b='field_cee-syslog_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_ipv6.sh.log: field_ipv6.sh + @p='field_ipv6.sh'; \ + b='field_ipv6.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_ipv6_jsoncnf.sh.log: field_ipv6_jsoncnf.sh + @p='field_ipv6_jsoncnf.sh'; \ + b='field_ipv6_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_v2-iptables.sh.log: field_v2-iptables.sh + @p='field_v2-iptables.sh'; \ + b='field_v2-iptables.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_v2-iptables_jsoncnf.sh.log: field_v2-iptables_jsoncnf.sh + @p='field_v2-iptables_jsoncnf.sh'; \ + b='field_v2-iptables_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_cef.sh.log: field_cef.sh + @p='field_cef.sh'; \ + b='field_cef.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_cef_jsoncnf.sh.log: field_cef_jsoncnf.sh + @p='field_cef_jsoncnf.sh'; \ + b='field_cef_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_checkpoint-lea.sh.log: field_checkpoint-lea.sh + @p='field_checkpoint-lea.sh'; \ + b='field_checkpoint-lea.sh'; \ + $(am__check_pre) $(LOG_DRIVER) 
--test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_checkpoint-lea_jsoncnf.sh.log: field_checkpoint-lea_jsoncnf.sh + @p='field_checkpoint-lea_jsoncnf.sh'; \ + b='field_checkpoint-lea_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_duration.sh.log: field_duration.sh + @p='field_duration.sh'; \ + b='field_duration.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_duration_jsoncnf.sh.log: field_duration_jsoncnf.sh + @p='field_duration_jsoncnf.sh'; \ + b='field_duration_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_float.sh.log: field_float.sh + @p='field_float.sh'; \ + b='field_float.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_float-fmt_number.sh.log: field_float-fmt_number.sh + @p='field_float-fmt_number.sh'; \ + b='field_float-fmt_number.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_float_jsoncnf.sh.log: field_float_jsoncnf.sh + @p='field_float_jsoncnf.sh'; \ + b='field_float_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_rfc5424timestamp-fmt_timestamp-unix.sh.log: field_rfc5424timestamp-fmt_timestamp-unix.sh + @p='field_rfc5424timestamp-fmt_timestamp-unix.sh'; \ + b='field_rfc5424timestamp-fmt_timestamp-unix.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_rfc5424timestamp-fmt_timestamp-unix-ms.sh.log: field_rfc5424timestamp-fmt_timestamp-unix-ms.sh + @p='field_rfc5424timestamp-fmt_timestamp-unix-ms.sh'; \ + b='field_rfc5424timestamp-fmt_timestamp-unix-ms.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +very_long_logline_jsoncnf.sh.log: very_long_logline_jsoncnf.sh + @p='very_long_logline_jsoncnf.sh'; \ + b='very_long_logline_jsoncnf.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +missing_line_ending_v1.sh.log: missing_line_ending_v1.sh + @p='missing_line_ending_v1.sh'; \ + 
b='missing_line_ending_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +runaway_rule_v1.sh.log: runaway_rule_v1.sh + @p='runaway_rule_v1.sh'; \ + b='runaway_rule_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +runaway_rule_comment_v1.sh.log: runaway_rule_comment_v1.sh + @p='runaway_rule_comment_v1.sh'; \ + b='runaway_rule_comment_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_hexnumber_v1.sh.log: field_hexnumber_v1.sh + @p='field_hexnumber_v1.sh'; \ + b='field_hexnumber_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_mac48_v1.sh.log: field_mac48_v1.sh + @p='field_mac48_v1.sh'; \ + b='field_mac48_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_name_value_v1.sh.log: field_name_value_v1.sh + @p='field_name_value_v1.sh'; \ + b='field_name_value_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_kernel_timestamp_v1.sh.log: field_kernel_timestamp_v1.sh + @p='field_kernel_timestamp_v1.sh'; \ + b='field_kernel_timestamp_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_whitespace_v1.sh.log: field_whitespace_v1.sh + @p='field_whitespace_v1.sh'; \ + b='field_whitespace_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_rest_v1.sh.log: field_rest_v1.sh + @p='field_rest_v1.sh'; \ + b='field_rest_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_json_v1.sh.log: field_json_v1.sh + @p='field_json_v1.sh'; \ + b='field_json_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_cee-syslog_v1.sh.log: field_cee-syslog_v1.sh + @p='field_cee-syslog_v1.sh'; \ + b='field_cee-syslog_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + 
"$$tst" $(AM_TESTS_FD_REDIRECT) +field_ipv6_v1.sh.log: field_ipv6_v1.sh + @p='field_ipv6_v1.sh'; \ + b='field_ipv6_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_v2-iptables_v1.sh.log: field_v2-iptables_v1.sh + @p='field_v2-iptables_v1.sh'; \ + b='field_v2-iptables_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_cef_v1.sh.log: field_cef_v1.sh + @p='field_cef_v1.sh'; \ + b='field_cef_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_checkpoint-lea_v1.sh.log: field_checkpoint-lea_v1.sh + @p='field_checkpoint-lea_v1.sh'; \ + b='field_checkpoint-lea_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_duration_v1.sh.log: field_duration_v1.sh + @p='field_duration_v1.sh'; \ + b='field_duration_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_float_v1.sh.log: field_float_v1.sh + @p='field_float_v1.sh'; \ + b='field_float_v1.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_tokenized.sh.log: field_tokenized.sh + @p='field_tokenized.sh'; \ + b='field_tokenized.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_tokenized_with_invalid_ruledef.sh.log: field_tokenized_with_invalid_ruledef.sh + @p='field_tokenized_with_invalid_ruledef.sh'; \ + b='field_tokenized_with_invalid_ruledef.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_recursive.sh.log: field_recursive.sh + @p='field_recursive.sh'; \ + b='field_recursive.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_tokenized_recursive.sh.log: field_tokenized_recursive.sh + @p='field_tokenized_recursive.sh'; \ + b='field_tokenized_recursive.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_interpret.sh.log: field_interpret.sh + @p='field_interpret.sh'; \ + b='field_interpret.sh'; \ + $(am__check_pre) $(LOG_DRIVER) 
--test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_interpret_with_invalid_ruledef.sh.log: field_interpret_with_invalid_ruledef.sh + @p='field_interpret_with_invalid_ruledef.sh'; \ + b='field_interpret_with_invalid_ruledef.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_descent.sh.log: field_descent.sh + @p='field_descent.sh'; \ + b='field_descent.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_descent_with_invalid_ruledef.sh.log: field_descent_with_invalid_ruledef.sh + @p='field_descent_with_invalid_ruledef.sh'; \ + b='field_descent_with_invalid_ruledef.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_suffixed.sh.log: field_suffixed.sh + @p='field_suffixed.sh'; \ + b='field_suffixed.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_suffixed_with_invalid_ruledef.sh.log: field_suffixed_with_invalid_ruledef.sh + @p='field_suffixed_with_invalid_ruledef.sh'; \ + b='field_suffixed_with_invalid_ruledef.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_cisco-interface-spec.sh.log: field_cisco-interface-spec.sh + @p='field_cisco-interface-spec.sh'; \ + b='field_cisco-interface-spec.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_float_with_invalid_ruledef.sh.log: field_float_with_invalid_ruledef.sh + @p='field_float_with_invalid_ruledef.sh'; \ + b='field_float_with_invalid_ruledef.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +very_long_logline.sh.log: very_long_logline.sh + @p='very_long_logline.sh'; \ + b='very_long_logline.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_regex_default_group_parse_and_return.sh.log: field_regex_default_group_parse_and_return.sh + @p='field_regex_default_group_parse_and_return.sh'; \ + b='field_regex_default_group_parse_and_return.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) 
+field_regex_invalid_args.sh.log: field_regex_invalid_args.sh + @p='field_regex_invalid_args.sh'; \ + b='field_regex_invalid_args.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_regex_with_consume_group.sh.log: field_regex_with_consume_group.sh + @p='field_regex_with_consume_group.sh'; \ + b='field_regex_with_consume_group.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_regex_with_consume_group_and_return_group.sh.log: field_regex_with_consume_group_and_return_group.sh + @p='field_regex_with_consume_group_and_return_group.sh'; \ + b='field_regex_with_consume_group_and_return_group.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_regex_with_negation.sh.log: field_regex_with_negation.sh + @p='field_regex_with_negation.sh'; \ + b='field_regex_with_negation.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_tokenized_with_regex.sh.log: field_tokenized_with_regex.sh + @p='field_tokenized_with_regex.sh'; \ + b='field_tokenized_with_regex.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +field_regex_while_regex_support_is_disabled.sh.log: field_regex_while_regex_support_is_disabled.sh + @p='field_regex_while_regex_support_is_disabled.sh'; \ + b='field_regex_while_regex_support_is_disabled.sh'; \ + $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +.test.log: + @p='$<'; \ + $(am__set_b); \ + $(am__check_pre) $(TEST_LOG_DRIVER) --test-name "$$f" \ + --log-file $$b.log --trs-file $$b.trs \ + $(am__common_driver_flags) $(AM_TEST_LOG_DRIVER_FLAGS) $(TEST_LOG_DRIVER_FLAGS) -- $(TEST_LOG_COMPILE) \ + "$$tst" $(AM_TESTS_FD_REDIRECT) +@am__EXEEXT_TRUE@.test$(EXEEXT).log: +@am__EXEEXT_TRUE@ @p='$<'; \ +@am__EXEEXT_TRUE@ $(am__set_b); \ +@am__EXEEXT_TRUE@ $(am__check_pre) $(TEST_LOG_DRIVER) --test-name "$$f" \ +@am__EXEEXT_TRUE@ --log-file $$b.log --trs-file $$b.trs \ +@am__EXEEXT_TRUE@ $(am__common_driver_flags) $(AM_TEST_LOG_DRIVER_FLAGS) $(TEST_LOG_DRIVER_FLAGS) -- $(TEST_LOG_COMPILE) \ +@am__EXEEXT_TRUE@ "$$tst" $(AM_TESTS_FD_REDIRECT) + +distdir: $(DISTFILES) + @srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + list='$(DISTFILES)'; \ + dist_files=`for file in $$list; do echo $$file; done | \ + sed -e "s|^$$srcdirstrip/||;t" \ + -e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \ + case $$dist_files in \ + */*) $(MKDIR_P) `echo "$$dist_files" | \ + sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \ + sort -u` ;; \ + esac; \ + for file in $$dist_files; 
do \ + if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \ + if test -d $$d/$$file; then \ + dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \ + if test -d "$(distdir)/$$file"; then \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \ + cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \ + else \ + test -f "$(distdir)/$$file" \ + || cp -p $$d/$$file "$(distdir)/$$file" \ + || exit 1; \ + fi; \ + done +check-am: all-am + $(MAKE) $(AM_MAKEFLAGS) $(check_PROGRAMS) + $(MAKE) $(AM_MAKEFLAGS) check-TESTS +check: check-am +all-am: Makefile +installdirs: +install: install-am +install-exec: install-exec-am +install-data: install-data-am +uninstall: uninstall-am + +install-am: all-am + @$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am + +installcheck: installcheck-am +install-strip: + if test -z '$(STRIP)'; then \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + install; \ + else \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \ + fi +mostlyclean-generic: + -test -z "$(TEST_LOGS)" || rm -f $(TEST_LOGS) + -test -z "$(TEST_LOGS:.log=.trs)" || rm -f $(TEST_LOGS:.log=.trs) + -test -z "$(TEST_SUITE_LOG)" || rm -f $(TEST_SUITE_LOG) + +clean-generic: + +distclean-generic: + -test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES) + -test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES) + +maintainer-clean-generic: + @echo "This command is intended for maintainers to use" + @echo "it deletes files that may require special tools to rebuild." 
+clean: clean-am + +clean-am: clean-checkPROGRAMS clean-generic clean-libtool \ + mostlyclean-am + +distclean: distclean-am + -rm -rf ./$(DEPDIR) + -rm -f Makefile +distclean-am: clean-am distclean-compile distclean-generic \ + distclean-tags + +dvi: dvi-am + +dvi-am: + +html: html-am + +html-am: + +info: info-am + +info-am: + +install-data-am: + +install-dvi: install-dvi-am + +install-dvi-am: + +install-exec-am: + +install-html: install-html-am + +install-html-am: + +install-info: install-info-am + +install-info-am: + +install-man: + +install-pdf: install-pdf-am + +install-pdf-am: + +install-ps: install-ps-am + +install-ps-am: + +installcheck-am: + +maintainer-clean: maintainer-clean-am + -rm -rf ./$(DEPDIR) + -rm -f Makefile +maintainer-clean-am: distclean-am maintainer-clean-generic + +mostlyclean: mostlyclean-am + +mostlyclean-am: mostlyclean-compile mostlyclean-generic \ + mostlyclean-libtool + +pdf: pdf-am + +pdf-am: + +ps: ps-am + +ps-am: + +uninstall-am: + +.MAKE: check-am install-am install-strip + +.PHONY: CTAGS GTAGS TAGS all all-am check check-TESTS check-am clean \ + clean-checkPROGRAMS clean-generic clean-libtool cscopelist-am \ + ctags ctags-am distclean distclean-compile distclean-generic \ + distclean-libtool distclean-tags distdir dvi dvi-am html \ + html-am info info-am install install-am install-data \ + install-data-am install-dvi install-dvi-am install-exec \ + install-exec-am install-html install-html-am install-info \ + install-info-am install-man install-pdf install-pdf-am \ + install-ps install-ps-am install-strip installcheck \ + installcheck-am installdirs maintainer-clean \ + maintainer-clean-generic mostlyclean mostlyclean-compile \ + mostlyclean-generic mostlyclean-libtool pdf pdf-am ps ps-am \ + recheck tags tags-am uninstall uninstall-am + +.PRECIOUS: Makefile + + +# Tell versions [3.59,3.63) of GNU make to not export all variables. +# Otherwise a system limit (for SysV at least) may be exceeded. +.NOEXPORT: diff --git a/tests/alternative_nested.sh b/tests/alternative_nested.sh new file mode 100755 index 0000000..6007624 --- /dev/null +++ b/tests/alternative_nested.sh @@ -0,0 +1,24 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple alternative syntax" +add_rule 'version=2' +add_rule 'rule=:a % + {"type":"alternative", + "parser": [ + [ + {"type":"number", "name":"num1"}, + {"type":"literal", "text":":"}, + {"type":"number", "name":"num"}, + ], + {"type":"hexnumber", "name":"hex"} + ] + }% b' +execute 'a 47:11 b' +assert_output_json_eq '{"num": "11", "num1": "47" }' +execute 'a 0x4711 b' +assert_output_json_eq '{ "hex": "0x4711" }' + +cleanup_tmp_files diff --git a/tests/alternative_segfault.sh b/tests/alternative_segfault.sh new file mode 100755 index 0000000..4ab6dfd --- /dev/null +++ b/tests/alternative_segfault.sh @@ -0,0 +1,16 @@ +# added 2016-10-17 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. 
$srcdir/exec.sh + +test_def $0 "a case that caused a segfault in practice" +add_rule 'version=2' +add_rule 'rule=:%host:ipv4% %{"type":"alternative","parser":[{"type":"literal","text":"-"},{"type":"number","name":"identd"}]}% %OK:word%' +execute '1.2.3.4 - TEST_OK' +assert_output_json_eq '{ "OK": "TEST_OK", "host": "1.2.3.4" }' +execute '1.2.3.4 100 TEST_OK' +assert_output_json_eq '{ "OK": "TEST_OK", "identd": "100", "host": "1.2.3.4" }' +execute '1.2.3.4 ERR TEST_OK' +assert_output_json_eq '{ "originalmsg": "1.2.3.4 ERR TEST_OK", "unparsed-data": "ERR TEST_OK" }' + +cleanup_tmp_files diff --git a/tests/alternative_simple.sh b/tests/alternative_simple.sh new file mode 100755 index 0000000..51c4b5b --- /dev/null +++ b/tests/alternative_simple.sh @@ -0,0 +1,14 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple alternative syntax" +add_rule 'version=2' +add_rule 'rule=:a %{"type":"alternative", "parser":[{"name":"num", "type":"number"}, {"name":"hex", "type":"hexnumber"}]}% b' +execute 'a 4711 b' +assert_output_json_eq '{ "num": "4711" }' +execute 'a 0x4711 b' +assert_output_json_eq '{ "hex": "0x4711" }' + +cleanup_tmp_files diff --git a/tests/alternative_three.sh b/tests/alternative_three.sh new file mode 100755 index 0000000..520f292 --- /dev/null +++ b/tests/alternative_three.sh @@ -0,0 +1,16 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple alternative syntax" +add_rule 'version=2' +add_rule 'rule=:a %{"type":"alternative", "parser":[{"name":"num", "type":"number"}, {"name":"hex", "type":"hexnumber"}, {"name":"wrd", "type":"word"}]}% b' +execute 'a 4711 b' +assert_output_json_eq '{ "num": "4711" }' +execute 'a 0x4711 b' +assert_output_json_eq '{ "hex": "0x4711" }' +execute 'a 0xyz b' +assert_output_json_eq '{ "wrd": "0xyz" }' + +cleanup_tmp_files diff --git a/tests/annotate.sh b/tests/annotate.sh new file mode 100755 index 0000000..c753dc2 --- /dev/null +++ b/tests/annotate.sh @@ -0,0 +1,23 @@ +# added 2016-11-08 by Rainer Gerhards +. $srcdir/exec.sh + +test_def $0 "annotate functionality" + +reset_rules +add_rule 'version=2' +add_rule 'rule=ABC,WIN:<%-:number%>1 %-:date-rfc5424% %-:word% %tag:word% - - -' +add_rule 'rule=ABC:<%-:number%>1 %-:date-rfc5424% %-:word% %tag:word% + - -' +add_rule 'rule=WIN:<%-:number%>1 %-:date-rfc5424% %-:word% %tag:word% . - -' +add_rule 'annotate=WIN:+annot1="WIN" # inline-comment' +add_rule 'annotate=ABC:+annot2="ABC"' + +execute '<37>1 2016-11-03T23:59:59+03:00 server.example.net TAG . - -' +assert_output_json_eq '{ "tag": "TAG", "annot1": "WIN" }' + +execute '<37>1 2016-11-03T23:59:59+03:00 server.example.net TAG + - -' +assert_output_json_eq '{ "tag": "TAG", "annot2": "ABC" }' + +execute '<6>1 2016-09-02T07:41:07+02:00 server.example.net TAG - - -' +assert_output_json_eq '{ "tag": "TAG", "annot1": "WIN", "annot2": "ABC" }' + +cleanup_tmp_files diff --git a/tests/exec.sh b/tests/exec.sh new file mode 100644 index 0000000..5f5eea0 --- /dev/null +++ b/tests/exec.sh @@ -0,0 +1,98 @@ +# environment variables: +# GREP - if set, can be used to use alternative grep version +# Most important use case is to use GNU grep (ggrep) +# on Solaris. If unset, use "grep". +set -e + +if [ "x$debug" == "xon" ]; then #get core-dump on crash + ulimit -c unlimited +fi + +cmd=../src/ln_test + +. 
./options.sh + +test_def() { + test_file=$(basename $1) + test_name=$(echo $test_file | sed -e 's/\..*//g') + + echo =============================================================================== + echo "[${test_file}]: test for ${2}" +} + +execute() { + if [ "x$debug" == "xon" ]; then + echo "======rulebase=======" + cat tmp.rulebase + echo "=====================" + set -x + fi + if [ "$1" == "file" ]; then + $cmd $ln_opts -r tmp.rulebase -e json > test.out < $2 + else + echo "$1" | $cmd $ln_opts -r tmp.rulebase -e json > test.out + fi + echo "Out:" + cat test.out + if [ "x$debug" == "xon" ]; then + set +x + fi +} + +execute_with_string() { + # $1 must be rulebase string + # $2 must be sample string + if [ "x$debug" == "xon" ]; then + echo "======rulebase=======" + cat tmp.rulebase + echo "=====================" + set -x + fi + echo "$2" | $cmd $ln_opts -R "$1" -e json > test.out + echo "Out:" + cat test.out + if [ "x$debug" == "xon" ]; then + set +x + fi +} + +assert_output_contains() { + if [ "x$GREP" == "x" ]; then + GREP=grep + fi + cat test.out | $GREP -F "$1" +} + +assert_output_json_eq() { + ./json_eq "$1" "$(cat test.out)" +} + +rulebase_file_name() { + if [ "x$1" == "x" ]; then + echo tmp.rulebase + else + echo $1.rulebase + fi +} + +reset_rules() { + rb_file=$(rulebase_file_name $1) + rm -f $rb_file +} + +add_rule() { + rb_file=$(rulebase_file_name $2) + echo $1 >> $rb_file +} + +add_rule_no_LF() { + rb_file=$(rulebase_file_name $2) + echo -n $1 >> $rb_file +} + + +cleanup_tmp_files() { + rm -f test.out *.rulebase +} + +reset_rules diff --git a/tests/field_cee-syslog.sh b/tests/field_cee-syslog.sh new file mode 100755 index 0000000..2a86ec1 --- /dev/null +++ b/tests/field_cee-syslog.sh @@ -0,0 +1,29 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "JSON field" +add_rule 'version=2' +add_rule 'rule=:%field:cee-syslog%' + +execute '@cee:{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee:{"f1": "1", "f2": 2} ' # note the trailing space +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee: {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee: {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# +# Things that MUST NOT work +# +execute '@cee: {"f1": "1", "f2": 2} data' +assert_output_json_eq '{ "originalmsg": "@cee: {\"f1\": \"1\", \"f2\": 2} data", "unparsed-data": "@cee: {\"f1\": \"1\", \"f2\": 2} data" }' + + +cleanup_tmp_files + diff --git a/tests/field_cee-syslog_jsoncnf.sh b/tests/field_cee-syslog_jsoncnf.sh new file mode 100755 index 0000000..4a5a49f --- /dev/null +++ b/tests/field_cee-syslog_jsoncnf.sh @@ -0,0 +1,29 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "JSON field" +add_rule 'version=2' +add_rule 'rule=:%{"name":"field", "type":"cee-syslog"}%' + +execute '@cee:{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee:{"f1": "1", "f2": 2} ' # note the trailing space +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee: {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee: {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# +# Things that MUST NOT work +# +execute '@cee: {"f1": "1", "f2": 2} data' +assert_output_json_eq '{ "originalmsg": "@cee: {\"f1\": \"1\", \"f2\": 2} data", "unparsed-data": "@cee: {\"f1\": \"1\", \"f2\": 2} data" }' + + +cleanup_tmp_files + diff --git a/tests/field_cee-syslog_v1.sh b/tests/field_cee-syslog_v1.sh new file mode 100755 index 0000000..481c03c --- /dev/null +++ b/tests/field_cee-syslog_v1.sh @@ -0,0 +1,28 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "JSON field" +add_rule 'rule=:%field:cee-syslog%' + +execute '@cee:{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee:{"f1": "1", "f2": 2} ' # note the trailing space +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee: {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '@cee: {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# +# Things that MUST NOT work +# +execute '@cee: {"f1": "1", "f2": 2} data' +assert_output_json_eq '{ "originalmsg": "@cee: {\"f1\": \"1\", \"f2\": 2} data", "unparsed-data": "@cee: {\"f1\": \"1\", \"f2\": 2} data" }' + + +cleanup_tmp_files + diff --git a/tests/field_cef.sh b/tests/field_cef.sh new file mode 100755 index 0000000..8f429c5 --- /dev/null +++ b/tests/field_cef.sh @@ -0,0 +1,58 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "CEF parser" +add_rule 'version=2' +add_rule 'rule=:%f:cef%' + +# fabricated tests to test specific functionality +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product\|1\|\\|Version|Signature ID|some name|Severity| aa=field1 bb=this is a name\=value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product|1|\\", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a name=value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \= value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a = value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity|' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=value' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "name": "value" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=val\nue' # embedded LF +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "name": "val\nue" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value' #invalid punctuation in extension +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value" }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\alue' #invalid escape in extension +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\\alue", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\\alue" }' + +execute 'CEF:0|V\endor|Product|Version|Signature ID|some name|Severity| name=value' #invalid escape in header +assert_output_json_eq '{ "originalmsg": "CEF:0|V\\endor|Product|Version|Signature ID|some name|Severity| name=value", "unparsed-data": "CEF:0|V\\endor|Product|Version|Signature ID|some name|Severity| name=value" }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ' # single trailing space - valid +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": 
"Severity", "Extensions": { } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ' # multiple trailing spaces - invalid +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| " }' + +execute 'CEF:0|Vendor' +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor", "unparsed-data": "CEF:0|Vendor" }' + +execute 'CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \= value cc=field 3' +assert_output_json_eq '{ "originalmsg": "CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \\= value cc=field 3", "unparsed-data": "CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \\= value cc=field 3" }' + +execute '' +assert_output_json_eq '{ "originalmsg": "", "unparsed-data": "" }' + +# finally, a use case from practice +execute 'CEF:0|ArcSight|ArcSight|10.0.0.15.0|rule:101|FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt|High| eventId=24934046519 type=2 mrt=8888882444085 sessionId=0 generatorID=34rSQWFOOOCAVlswcKFkbA\=\= categorySignificance=/Normal categoryBehavior=/Execute/Query categoryDeviceGroup=/Application categoryOutcome=/Success categoryObject=/Host/Application modelConfidence=0 severity=0 relevance=10 assetCriticality=0 priority=2 art=1427882454263 cat=/Detection/FOO/UNIX/Direct Root Connection Attempt deviceSeverity=Warning rt=1427881661000 shost=server.foo.bar src=10.0.0.1 sourceZoneID=MRL4p30sFOOO8panjcQnFbw\=\= sourceZoneURI=/All Zones/FOO Solutions/Server Subnet/UK/PAR-WDC-12-CELL5-PROD S2U 1 10.0.0.1-10.0.0.1 sourceGeoCountryCode=GB sourceGeoLocationInfo=London slong=-0.90843 slat=51.9039 dhost=server.foo.bar dst=10.0.0.1 destinationZoneID=McFOOO0sBABCUHR83pKJmQA\=\= destinationZoneURI=/All Zones/FOO Solutions/Prod/AMERICAS/FOO 10.0.0.1-10.0.0.1 duser=johndoe destinationGeoCountryCode=US destinationGeoLocationInfo=Jersey City dlong=-90.0435 dlat=30.732 fname=FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt filePath=/All Rules/Real-time Rules/ACBP-ACCESS CONTROL and AUTHORIZATION/FOO/Unix Server/FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt fileType=Rule ruleThreadId=NQVtdFOOABDrKsmLWpyq8g\=\= cs2= flexString2=DC0001-988 locality=1 cs2Label=Configuration Resource ahost=foo.bar agt=10.0.0.1 av=10.0.0.12 atz=Europe/Berlin aid=34rSQWFOOOBCAVlswcKFkbA\=\= at=superagent_ng dvchost=server.foo.bar dvc=10.0.0.1 deviceZoneID=Mbb8pFOOODol1dBKdURJA\=\= deviceZoneURI=/All Zones/FOO Solutions/Prod/GERMANY/FOO US2 Class6 A 508 10.0.0.1-10.0.0.1 dtz=Europe/Berlin deviceFacility=Rules Engine eventAnnotationStageUpdateTime=1427882444192 eventAnnotationModificationTime=1427882444192 eventAnnotationAuditTrail=1,1427453188050,root,Queued,,,,\n eventAnnotationVersion=1 eventAnnotationFlags=0 eventAnnotationEndTime=1427881661000 eventAnnotationManagerReceiptTime=1427882444085 _cefVer=0.1 ad.arcSightEventPath=3VcygrkkBABCAYFOOLlU13A\=\= baseEventIds=24934003731"' +assert_output_json_eq '{ "f": { "DeviceVendor": "ArcSight", "DeviceProduct": "ArcSight", "DeviceVersion": "10.0.0.15.0", "SignatureID": "rule:101", "Name": "FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "Severity": "High", "Extensions": { "eventId": "24934046519", "type": "2", "mrt": "8888882444085", "sessionId": "0", "generatorID": "34rSQWFOOOCAVlswcKFkbA==", "categorySignificance": "\/Normal", "categoryBehavior": "\/Execute\/Query", 
"categoryDeviceGroup": "\/Application", "categoryOutcome": "\/Success", "categoryObject": "\/Host\/Application", "modelConfidence": "0", "severity": "0", "relevance": "10", "assetCriticality": "0", "priority": "2", "art": "1427882454263", "cat": "\/Detection\/FOO\/UNIX\/Direct Root Connection Attempt", "deviceSeverity": "Warning", "rt": "1427881661000", "shost": "server.foo.bar", "src": "10.0.0.1", "sourceZoneID": "MRL4p30sFOOO8panjcQnFbw==", "sourceZoneURI": "\/All Zones\/FOO Solutions\/Server Subnet\/UK\/PAR-WDC-12-CELL5-PROD S2U 1 10.0.0.1-10.0.0.1", "sourceGeoCountryCode": "GB", "sourceGeoLocationInfo": "London", "slong": "-0.90843", "slat": "51.9039", "dhost": "server.foo.bar", "dst": "10.0.0.1", "destinationZoneID": "McFOOO0sBABCUHR83pKJmQA==", "destinationZoneURI": "\/All Zones\/FOO Solutions\/Prod\/AMERICAS\/FOO 10.0.0.1-10.0.0.1", "duser": "johndoe", "destinationGeoCountryCode": "US", "destinationGeoLocationInfo": "Jersey City", "dlong": "-90.0435", "dlat": "30.732", "fname": "FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "filePath": "\/All Rules\/Real-time Rules\/ACBP-ACCESS CONTROL and AUTHORIZATION\/FOO\/Unix Server\/FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "fileType": "Rule", "ruleThreadId": "NQVtdFOOABDrKsmLWpyq8g==", "cs2": "", "flexString2": "DC0001-988", "locality": "1", "cs2Label": "Configuration Resource", "ahost": "foo.bar", "agt": "10.0.0.1", "av": "10.0.0.12", "atz": "Europe\/Berlin", "aid": "34rSQWFOOOBCAVlswcKFkbA==", "at": "superagent_ng", "dvchost": "server.foo.bar", "dvc": "10.0.0.1", "deviceZoneID": "Mbb8pFOOODol1dBKdURJA==", "deviceZoneURI": "\/All Zones\/FOO Solutions\/Prod\/GERMANY\/FOO US2 Class6 A 508 10.0.0.1-10.0.0.1", "dtz": "Europe\/Berlin", "deviceFacility": "Rules Engine", "eventAnnotationStageUpdateTime": "1427882444192", "eventAnnotationModificationTime": "1427882444192", "eventAnnotationAuditTrail": "1,1427453188050,root,Queued,,,,\n", "eventAnnotationVersion": "1", "eventAnnotationFlags": "0", "eventAnnotationEndTime": "1427881661000", "eventAnnotationManagerReceiptTime": "1427882444085", "_cefVer": "0.1", "ad.arcSightEventPath": "3VcygrkkBABCAYFOOLlU13A==", "baseEventIds": "24934003731\"" } } }' + + +cleanup_tmp_files + diff --git a/tests/field_cef_jsoncnf.sh b/tests/field_cef_jsoncnf.sh new file mode 100755 index 0000000..213cc2b --- /dev/null +++ b/tests/field_cef_jsoncnf.sh @@ -0,0 +1,58 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "CEF parser" +add_rule 'version=2' +add_rule 'rule=:%{"name":"f", "type":"cef"}%' + +# fabricated tests to test specific functionality +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product\|1\|\\|Version|Signature ID|some name|Severity| aa=field1 bb=this is a name\=value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product|1|\\", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a name=value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \= value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a = value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity|' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=value' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "name": "value" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=val\nue' # embedded LF +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "name": "val\nue" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value' #invalid punctuation in extension +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value" }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\alue' #invalid escape in extension +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\\alue", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\\alue" }' + +execute 'CEF:0|V\endor|Product|Version|Signature ID|some name|Severity| name=value' #invalid escape in header +assert_output_json_eq '{ "originalmsg": "CEF:0|V\\endor|Product|Version|Signature ID|some name|Severity| name=value", "unparsed-data": "CEF:0|V\\endor|Product|Version|Signature ID|some name|Severity| name=value" }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ' # single trailing space - valid +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some 
name", "Severity": "Severity", "Extensions": { } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ' # multiple trailing spaces - invalid +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| " }' + +execute 'CEF:0|Vendor' +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor", "unparsed-data": "CEF:0|Vendor" }' + +execute 'CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \= value cc=field 3' +assert_output_json_eq '{ "originalmsg": "CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \\= value cc=field 3", "unparsed-data": "CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \\= value cc=field 3" }' + +execute '' +assert_output_json_eq '{ "originalmsg": "", "unparsed-data": "" }' + +# finally, a use case from practice +execute 'CEF:0|ArcSight|ArcSight|10.0.0.15.0|rule:101|FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt|High| eventId=24934046519 type=2 mrt=8888882444085 sessionId=0 generatorID=34rSQWFOOOCAVlswcKFkbA\=\= categorySignificance=/Normal categoryBehavior=/Execute/Query categoryDeviceGroup=/Application categoryOutcome=/Success categoryObject=/Host/Application modelConfidence=0 severity=0 relevance=10 assetCriticality=0 priority=2 art=1427882454263 cat=/Detection/FOO/UNIX/Direct Root Connection Attempt deviceSeverity=Warning rt=1427881661000 shost=server.foo.bar src=10.0.0.1 sourceZoneID=MRL4p30sFOOO8panjcQnFbw\=\= sourceZoneURI=/All Zones/FOO Solutions/Server Subnet/UK/PAR-WDC-12-CELL5-PROD S2U 1 10.0.0.1-10.0.0.1 sourceGeoCountryCode=GB sourceGeoLocationInfo=London slong=-0.90843 slat=51.9039 dhost=server.foo.bar dst=10.0.0.1 destinationZoneID=McFOOO0sBABCUHR83pKJmQA\=\= destinationZoneURI=/All Zones/FOO Solutions/Prod/AMERICAS/FOO 10.0.0.1-10.0.0.1 duser=johndoe destinationGeoCountryCode=US destinationGeoLocationInfo=Jersey City dlong=-90.0435 dlat=30.732 fname=FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt filePath=/All Rules/Real-time Rules/ACBP-ACCESS CONTROL and AUTHORIZATION/FOO/Unix Server/FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt fileType=Rule ruleThreadId=NQVtdFOOABDrKsmLWpyq8g\=\= cs2= flexString2=DC0001-988 locality=1 cs2Label=Configuration Resource ahost=foo.bar agt=10.0.0.1 av=10.0.0.12 atz=Europe/Berlin aid=34rSQWFOOOBCAVlswcKFkbA\=\= at=superagent_ng dvchost=server.foo.bar dvc=10.0.0.1 deviceZoneID=Mbb8pFOOODol1dBKdURJA\=\= deviceZoneURI=/All Zones/FOO Solutions/Prod/GERMANY/FOO US2 Class6 A 508 10.0.0.1-10.0.0.1 dtz=Europe/Berlin deviceFacility=Rules Engine eventAnnotationStageUpdateTime=1427882444192 eventAnnotationModificationTime=1427882444192 eventAnnotationAuditTrail=1,1427453188050,root,Queued,,,,\n eventAnnotationVersion=1 eventAnnotationFlags=0 eventAnnotationEndTime=1427881661000 eventAnnotationManagerReceiptTime=1427882444085 _cefVer=0.1 ad.arcSightEventPath=3VcygrkkBABCAYFOOLlU13A\=\= baseEventIds=24934003731"' +assert_output_json_eq '{ "f": { "DeviceVendor": "ArcSight", "DeviceProduct": "ArcSight", "DeviceVersion": "10.0.0.15.0", "SignatureID": "rule:101", "Name": "FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "Severity": "High", "Extensions": { "eventId": "24934046519", "type": "2", "mrt": "8888882444085", "sessionId": "0", "generatorID": "34rSQWFOOOCAVlswcKFkbA==", "categorySignificance": "\/Normal", "categoryBehavior": 
"\/Execute\/Query", "categoryDeviceGroup": "\/Application", "categoryOutcome": "\/Success", "categoryObject": "\/Host\/Application", "modelConfidence": "0", "severity": "0", "relevance": "10", "assetCriticality": "0", "priority": "2", "art": "1427882454263", "cat": "\/Detection\/FOO\/UNIX\/Direct Root Connection Attempt", "deviceSeverity": "Warning", "rt": "1427881661000", "shost": "server.foo.bar", "src": "10.0.0.1", "sourceZoneID": "MRL4p30sFOOO8panjcQnFbw==", "sourceZoneURI": "\/All Zones\/FOO Solutions\/Server Subnet\/UK\/PAR-WDC-12-CELL5-PROD S2U 1 10.0.0.1-10.0.0.1", "sourceGeoCountryCode": "GB", "sourceGeoLocationInfo": "London", "slong": "-0.90843", "slat": "51.9039", "dhost": "server.foo.bar", "dst": "10.0.0.1", "destinationZoneID": "McFOOO0sBABCUHR83pKJmQA==", "destinationZoneURI": "\/All Zones\/FOO Solutions\/Prod\/AMERICAS\/FOO 10.0.0.1-10.0.0.1", "duser": "johndoe", "destinationGeoCountryCode": "US", "destinationGeoLocationInfo": "Jersey City", "dlong": "-90.0435", "dlat": "30.732", "fname": "FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "filePath": "\/All Rules\/Real-time Rules\/ACBP-ACCESS CONTROL and AUTHORIZATION\/FOO\/Unix Server\/FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "fileType": "Rule", "ruleThreadId": "NQVtdFOOABDrKsmLWpyq8g==", "cs2": "", "flexString2": "DC0001-988", "locality": "1", "cs2Label": "Configuration Resource", "ahost": "foo.bar", "agt": "10.0.0.1", "av": "10.0.0.12", "atz": "Europe\/Berlin", "aid": "34rSQWFOOOBCAVlswcKFkbA==", "at": "superagent_ng", "dvchost": "server.foo.bar", "dvc": "10.0.0.1", "deviceZoneID": "Mbb8pFOOODol1dBKdURJA==", "deviceZoneURI": "\/All Zones\/FOO Solutions\/Prod\/GERMANY\/FOO US2 Class6 A 508 10.0.0.1-10.0.0.1", "dtz": "Europe\/Berlin", "deviceFacility": "Rules Engine", "eventAnnotationStageUpdateTime": "1427882444192", "eventAnnotationModificationTime": "1427882444192", "eventAnnotationAuditTrail": "1,1427453188050,root,Queued,,,,\n", "eventAnnotationVersion": "1", "eventAnnotationFlags": "0", "eventAnnotationEndTime": "1427881661000", "eventAnnotationManagerReceiptTime": "1427882444085", "_cefVer": "0.1", "ad.arcSightEventPath": "3VcygrkkBABCAYFOOLlU13A==", "baseEventIds": "24934003731\"" } } }' + + +cleanup_tmp_files + diff --git a/tests/field_cef_v1.sh b/tests/field_cef_v1.sh new file mode 100755 index 0000000..4c7773c --- /dev/null +++ b/tests/field_cef_v1.sh @@ -0,0 +1,57 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "CEF parser" +add_rule 'rule=:%f:cef%' + +# fabricated tests to test specific functionality +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product\|1\|\\|Version|Signature ID|some name|Severity| aa=field1 bb=this is a name\=value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product|1|\\", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a name=value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \= value cc=field 3' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "aa": "field1", "bb": "this is a = value", "cc": "field 3" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity|' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=value' +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "name": "value" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=val\nue' # embedded LF +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { "name": "val\nue" } } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value' #invalid punctuation in extension +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| n,me=value" }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\alue' #invalid escape in extension +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\\alue", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| name=v\\alue" }' + +execute 'CEF:0|V\endor|Product|Version|Signature ID|some name|Severity| name=value' #invalid escape in header +assert_output_json_eq '{ "originalmsg": "CEF:0|V\\endor|Product|Version|Signature ID|some name|Severity| name=value", "unparsed-data": "CEF:0|V\\endor|Product|Version|Signature ID|some name|Severity| name=value" }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ' # single trailing space - valid +assert_output_json_eq '{ "f": { "DeviceVendor": "Vendor", "DeviceProduct": "Product", "DeviceVersion": "Version", "SignatureID": "Signature ID", "Name": "some name", "Severity": "Severity", "Extensions": { 
} } }' + +execute 'CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ' # multiple trailing spaces - invalid +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| ", "unparsed-data": "CEF:0|Vendor|Product|Version|Signature ID|some name|Severity| " }' + +execute 'CEF:0|Vendor' +assert_output_json_eq '{ "originalmsg": "CEF:0|Vendor", "unparsed-data": "CEF:0|Vendor" }' + +execute 'CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \= value cc=field 3' +assert_output_json_eq '{ "originalmsg": "CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \\= value cc=field 3", "unparsed-data": "CEF:1|Vendor|Product|Version|Signature ID|some name|Severity| aa=field1 bb=this is a \\= value cc=field 3" }' + +execute '' +assert_output_json_eq '{ "originalmsg": "", "unparsed-data": "" }' + +# finally, a use case from practice +execute 'CEF:0|ArcSight|ArcSight|10.0.0.15.0|rule:101|FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt|High| eventId=24934046519 type=2 mrt=8888882444085 sessionId=0 generatorID=34rSQWFOOOCAVlswcKFkbA\=\= categorySignificance=/Normal categoryBehavior=/Execute/Query categoryDeviceGroup=/Application categoryOutcome=/Success categoryObject=/Host/Application modelConfidence=0 severity=0 relevance=10 assetCriticality=0 priority=2 art=1427882454263 cat=/Detection/FOO/UNIX/Direct Root Connection Attempt deviceSeverity=Warning rt=1427881661000 shost=server.foo.bar src=10.0.0.1 sourceZoneID=MRL4p30sFOOO8panjcQnFbw\=\= sourceZoneURI=/All Zones/FOO Solutions/Server Subnet/UK/PAR-WDC-12-CELL5-PROD S2U 1 10.0.0.1-10.0.0.1 sourceGeoCountryCode=GB sourceGeoLocationInfo=London slong=-0.90843 slat=51.9039 dhost=server.foo.bar dst=10.0.0.1 destinationZoneID=McFOOO0sBABCUHR83pKJmQA\=\= destinationZoneURI=/All Zones/FOO Solutions/Prod/AMERICAS/FOO 10.0.0.1-10.0.0.1 duser=johndoe destinationGeoCountryCode=US destinationGeoLocationInfo=Jersey City dlong=-90.0435 dlat=30.732 fname=FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt filePath=/All Rules/Real-time Rules/ACBP-ACCESS CONTROL and AUTHORIZATION/FOO/Unix Server/FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt fileType=Rule ruleThreadId=NQVtdFOOABDrKsmLWpyq8g\=\= cs2= flexString2=DC0001-988 locality=1 cs2Label=Configuration Resource ahost=foo.bar agt=10.0.0.1 av=10.0.0.12 atz=Europe/Berlin aid=34rSQWFOOOBCAVlswcKFkbA\=\= at=superagent_ng dvchost=server.foo.bar dvc=10.0.0.1 deviceZoneID=Mbb8pFOOODol1dBKdURJA\=\= deviceZoneURI=/All Zones/FOO Solutions/Prod/GERMANY/FOO US2 Class6 A 508 10.0.0.1-10.0.0.1 dtz=Europe/Berlin deviceFacility=Rules Engine eventAnnotationStageUpdateTime=1427882444192 eventAnnotationModificationTime=1427882444192 eventAnnotationAuditTrail=1,1427453188050,root,Queued,,,,\n eventAnnotationVersion=1 eventAnnotationFlags=0 eventAnnotationEndTime=1427881661000 eventAnnotationManagerReceiptTime=1427882444085 _cefVer=0.1 ad.arcSightEventPath=3VcygrkkBABCAYFOOLlU13A\=\= baseEventIds=24934003731"' +assert_output_json_eq '{ "f": { "DeviceVendor": "ArcSight", "DeviceProduct": "ArcSight", "DeviceVersion": "10.0.0.15.0", "SignatureID": "rule:101", "Name": "FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "Severity": "High", "Extensions": { "eventId": "24934046519", "type": "2", "mrt": "8888882444085", "sessionId": "0", "generatorID": "34rSQWFOOOCAVlswcKFkbA==", "categorySignificance": "\/Normal", "categoryBehavior": "\/Execute\/Query", "categoryDeviceGroup": 
"\/Application", "categoryOutcome": "\/Success", "categoryObject": "\/Host\/Application", "modelConfidence": "0", "severity": "0", "relevance": "10", "assetCriticality": "0", "priority": "2", "art": "1427882454263", "cat": "\/Detection\/FOO\/UNIX\/Direct Root Connection Attempt", "deviceSeverity": "Warning", "rt": "1427881661000", "shost": "server.foo.bar", "src": "10.0.0.1", "sourceZoneID": "MRL4p30sFOOO8panjcQnFbw==", "sourceZoneURI": "\/All Zones\/FOO Solutions\/Server Subnet\/UK\/PAR-WDC-12-CELL5-PROD S2U 1 10.0.0.1-10.0.0.1", "sourceGeoCountryCode": "GB", "sourceGeoLocationInfo": "London", "slong": "-0.90843", "slat": "51.9039", "dhost": "server.foo.bar", "dst": "10.0.0.1", "destinationZoneID": "McFOOO0sBABCUHR83pKJmQA==", "destinationZoneURI": "\/All Zones\/FOO Solutions\/Prod\/AMERICAS\/FOO 10.0.0.1-10.0.0.1", "duser": "johndoe", "destinationGeoCountryCode": "US", "destinationGeoLocationInfo": "Jersey City", "dlong": "-90.0435", "dlat": "30.732", "fname": "FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "filePath": "\/All Rules\/Real-time Rules\/ACBP-ACCESS CONTROL and AUTHORIZATION\/FOO\/Unix Server\/FOO-UNIX-Bypassing Golden Host-Direct Root Connection Attempt", "fileType": "Rule", "ruleThreadId": "NQVtdFOOABDrKsmLWpyq8g==", "cs2": "", "flexString2": "DC0001-988", "locality": "1", "cs2Label": "Configuration Resource", "ahost": "foo.bar", "agt": "10.0.0.1", "av": "10.0.0.12", "atz": "Europe\/Berlin", "aid": "34rSQWFOOOBCAVlswcKFkbA==", "at": "superagent_ng", "dvchost": "server.foo.bar", "dvc": "10.0.0.1", "deviceZoneID": "Mbb8pFOOODol1dBKdURJA==", "deviceZoneURI": "\/All Zones\/FOO Solutions\/Prod\/GERMANY\/FOO US2 Class6 A 508 10.0.0.1-10.0.0.1", "dtz": "Europe\/Berlin", "deviceFacility": "Rules Engine", "eventAnnotationStageUpdateTime": "1427882444192", "eventAnnotationModificationTime": "1427882444192", "eventAnnotationAuditTrail": "1,1427453188050,root,Queued,,,,\n", "eventAnnotationVersion": "1", "eventAnnotationFlags": "0", "eventAnnotationEndTime": "1427881661000", "eventAnnotationManagerReceiptTime": "1427882444085", "_cefVer": "0.1", "ad.arcSightEventPath": "3VcygrkkBABCAYFOOLlU13A==", "baseEventIds": "24934003731\"" } } }' + + +cleanup_tmp_files + diff --git a/tests/field_checkpoint-lea.sh b/tests/field_checkpoint-lea.sh new file mode 100755 index 0000000..3b55699 --- /dev/null +++ b/tests/field_checkpoint-lea.sh @@ -0,0 +1,14 @@ +# added 2015-06-18 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "Checkpoint LEA parser" +add_rule 'version=2' +add_rule 'rule=:%f:checkpoint-lea%' + +execute 'tcp_flags: RST-ACK; src: 192.168.0.1;' +assert_output_json_eq '{ "f": { "tcp_flags": "RST-ACK", "src": "192.168.0.1" } }' + + +cleanup_tmp_files + diff --git a/tests/field_checkpoint-lea_jsoncnf.sh b/tests/field_checkpoint-lea_jsoncnf.sh new file mode 100755 index 0000000..6610df0 --- /dev/null +++ b/tests/field_checkpoint-lea_jsoncnf.sh @@ -0,0 +1,14 @@ +# added 2015-06-18 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "Checkpoint LEA parser" +add_rule 'version=2' +add_rule 'rule=:%{"name":"f", "type":"checkpoint-lea"}%' + +execute 'tcp_flags: RST-ACK; src: 192.168.0.1;' +assert_output_json_eq '{ "f": { "tcp_flags": "RST-ACK", "src": "192.168.0.1" } }' + + +cleanup_tmp_files + diff --git a/tests/field_checkpoint-lea_v1.sh b/tests/field_checkpoint-lea_v1.sh new file mode 100755 index 0000000..2a7c6ad --- /dev/null +++ b/tests/field_checkpoint-lea_v1.sh @@ -0,0 +1,13 @@ +# added 2015-06-18 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "Checkpoint LEA parser" +add_rule 'rule=:%f:checkpoint-lea%' + +execute 'tcp_flags: RST-ACK; src: 192.168.0.1;' +assert_output_json_eq '{ "f": { "tcp_flags": "RST-ACK", "src": "192.168.0.1" } }' + + +cleanup_tmp_files + diff --git a/tests/field_cisco-interface-spec.sh b/tests/field_cisco-interface-spec.sh new file mode 100755 index 0000000..1cf95ab --- /dev/null +++ b/tests/field_cisco-interface-spec.sh @@ -0,0 +1,44 @@ +# added 2015-04-13 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "cisco-interface-spec syntax" +add_rule 'rule=:begin %field:cisco-interface-spec% end' + +execute 'begin outside:176.97.252.102/50349 end' +assert_output_json_eq '{"field": { "interface": "outside", "ip": "176.97.252.102", "port": "50349" } }' + +execute 'begin outside:176.97.252.102/50349(DOMAIN\rainer) end' +# we need to add the backslash escape for the testbench plumbing +assert_output_json_eq '{"field": { "interface": "outside", "ip": "176.97.252.102", "port": "50349", "user": "DOMAIN\\rainer" } }' + +execute 'begin outside:176.97.252.102/50349(test/rainer) end' +# we need to add the backslash escape for the testbench plumbing +assert_output_json_eq '{"field": { "interface": "outside", "ip": "176.97.252.102", "port": "50349", "user": "test/rainer" } }' + +execute 'begin outside:176.97.252.102/50349(rainer) end' +# we need to add the backslash escape for the testbench plumbing +assert_output_json_eq '{"field": { "interface": "outside", "ip": "176.97.252.102", "port": "50349", "user": "rainer" } }' + +execute 'begin outside:192.168.1.13/50179 (192.168.1.13/50179)(LOCAL\some.user) end' +assert_output_json_eq ' { "field": { "interface": "outside", "ip": "192.168.1.13", "port": "50179", "ip2": "192.168.1.13", "port2": "50179", "user": "LOCAL\\some.user" } }' + +execute 'begin outside:192.168.1.13/50179 (192.168.1.13/50179) (LOCAL\some.user) end' +assert_output_json_eq ' { "field": { "interface": "outside", "ip": "192.168.1.13", "port": "50179", "ip2": "192.168.1.13", "port2": "50179", "user": "LOCAL\\some.user" } }' + +execute 'begin 192.168.1.13/50179 (192.168.1.13/50179) (LOCAL\without.if) end' +assert_output_json_eq ' { "field": { "ip": "192.168.1.13", "port": "50179", "ip2": "192.168.1.13", "port2": "50179", "user": "LOCAL\\without.if" } }' + +# +# Test for things that MUST NOT match! +# + +# the SP before the second IP is missing: +execute 'begin outside:192.168.1.13/50179(192.168.1.13/50179)(LOCAL\some.user) end' +# note: the expected result looks a bit strange. This is the case because we +# cannot (yet?) detect that "(192.168.1.13/50179)" is not a valid user name. 
+assert_output_json_eq '{ "originalmsg": "begin outside:192.168.1.13\/50179(192.168.1.13\/50179)(LOCAL\\some.user) end", "unparsed-data": "(LOCAL\\some.user) end" }' + + +cleanup_tmp_files + diff --git a/tests/field_descent.sh b/tests/field_descent.sh new file mode 100755 index 0000000..5ebb98d --- /dev/null +++ b/tests/field_descent.sh @@ -0,0 +1,84 @@ +# added 2014-12-11 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. $srcdir/exec.sh + +test_def $0 "descent based parsing field" + +#descent with default tail field +add_rule 'rule=:blocked on %device:word% %net:descent:./child.rulebase%at %tm:date-rfc5424%' +reset_rules 'child' +add_rule 'rule=:%ip_addr:ipv4% %tail:rest%' 'child' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %tail:rest%' 'child' +execute 'blocked on gw-1 10.20.30.40 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}, "tm": "2014-12-08T08:53:33.05+05:30"}' +execute 'blocked on gw-1 10.20.30.40/16 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}, "tm": "2014-12-08T08:53:33.05+05:30"}' + +#descent with tail field being explicitly named 'tail' +reset_rules +add_rule 'rule=:blocked on %device:word% %net:descent:./field.rulebase:tail%at %tm:date-rfc5424%' +reset_rules 'field' +add_rule 'rule=:%ip_addr:ipv4% %tail:rest%' 'field' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %tail:rest%' 'field' +execute 'blocked on gw-1 10.20.30.40 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}, "tm": "2014-12-08T08:53:33.05+05:30"}' +execute 'blocked on gw-1 10.20.30.40/16 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}, "tm": "2014-12-08T08:53:33.05+05:30"}' + +#descent with tail field having arbirary name +reset_rules +add_rule 'rule=:blocked on %device:word% %net:descent:./subset.rulebase:remaining%at %tm:date-rfc5424%' +reset_rules 'subset' +add_rule 'rule=:%ip_addr:ipv4% %remaining:rest%' 'subset' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %remaining:rest%' 'subset' +execute 'blocked on gw-1 10.20.30.40 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}, "tm": "2014-12-08T08:53:33.05+05:30"}' +execute 'blocked on gw-1 10.20.30.40/16 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}, "tm": "2014-12-08T08:53:33.05+05:30"}' + +#head call handling with with separate rulebase and tail field with with arbitrary name (this is what recursive field can't do) +reset_rules +add_rule 'rule=:%net:descent:./alt.rulebase:remains%blocked on %device:word%' +reset_rules 'alt' +add_rule 'rule=:%ip_addr:ipv4% %remains:rest%' 'alt' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %remains:rest%' 'alt' +execute '10.20.30.40 blocked on gw-1' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}}' +execute '10.20.30.40/16 blocked on gw-1' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": 
"16"}}' + +#descent-field which calls another descent-field +reset_rules +add_rule 'rule=:%op:descent:./op.rulebase:rest% on %device:word%' +reset_rules 'op' +add_rule 'rule=:%net:descent:./alt.rulebase:remains%%action:word%%rest:rest%' 'op' +reset_rules 'alt' +add_rule 'rule=:%ip_addr:ipv4% %remains:rest%' 'alt' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %remains:rest%' 'alt' +execute '10.20.30.40 blocked on gw-1' +assert_output_json_eq '{"op": {"action": "blocked", "net": {"ip_addr": "10.20.30.40"}}, "device": "gw-1"}' +execute '10.20.30.40/16 unblocked on gw-2' +assert_output_json_eq '{"op": {"action": "unblocked", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}}, "device": "gw-2"}' + +#descent with file name having lognorm special char +add_rule 'rule=:blocked on %device:word% %net:descent:./part\x3anet.rulebase%at %tm:date-rfc5424%' +reset_rules 'part:net' +add_rule 'rule=:%ip_addr:ipv4% %tail:rest%' 'part:net' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %tail:rest%' 'part:net' +execute 'blocked on gw-1 10.20.30.40 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}, "tm": "2014-12-08T08:53:33.05+05:30"}' +execute 'blocked on gw-1 10.20.30.40/16 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}, "tm": "2014-12-08T08:53:33.05+05:30"}' + + +cleanup_tmp_files + diff --git a/tests/field_descent_with_invalid_ruledef.sh b/tests/field_descent_with_invalid_ruledef.sh new file mode 100755 index 0000000..cb0011f --- /dev/null +++ b/tests/field_descent_with_invalid_ruledef.sh @@ -0,0 +1,89 @@ +# added 2014-12-15 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "descent based parsing field, with invalid ruledef" + +#invalid parent field name +add_rule 'rule=:%net:desce%' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#no args +add_rule 'rule=:%net:descent%' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#incorrect rulebase file path +rm -f $srcdir/quux.rulebase +add_rule 'rule=:%net:descent:./quux.rulebase%' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#invalid content in rulebase file +reset_rules +add_rule 'rule=:%net:descent:./child.rulebase%' +reset_rules 'child' +add_rule 'rule=:%ip_addr:ipv4 %tail:rest%' 'child' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#empty child rulebase file +reset_rules +add_rule 'rule=:%net:descent:./child.rulebase%' +reset_rules 'child' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#no rulebase given +reset_rules +add_rule 'rule=:%net:descent:' +reset_rules 'child' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#no rulebase and no tail-field given +reset_rules +add_rule 'rule=:%net:descent::' +reset_rules 'child' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#no rulebase given, but has valid tail-field +reset_rules +add_rule 'rule=:%net:descent::foo' +reset_rules 'child' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + +#empty tail-field given +echo empty tail-field given +rm tmp.rulebase +reset_rules +add_rule 'rule=:A%net:descent:./child.rulebase:%' +reset_rules 'child' +add_rule 'rule=:%ip_addr:ipv4% %tail:rest%' 'child' +execute 'A10.20.30.40 foo' +assert_output_json_eq '{ "net": { "tail": "foo", "ip_addr": "10.20.30.40" } }' + +#named tail-field not populated +echo tail-field not populated +reset_rules +add_rule 'rule=:%net:descent:./child.rulebase:foo% foo' +reset_rules 'child' +add_rule 'rule=:%ip_addr:ipv4% %tail:rest%' 'child' +execute '10.20.30.40 foo' +assert_output_json_eq '{ "originalmsg": "10.20.30.40 foo", "unparsed-data": "10.20.30.40 foo" }' + + +cleanup_tmp_files + diff --git a/tests/field_duration.sh b/tests/field_duration.sh new file mode 100755 index 0000000..735e3aa --- /dev/null +++ b/tests/field_duration.sh @@ -0,0 +1,30 @@ +# added 2015-03-12 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "duration syntax" +add_rule 'version=2' +add_rule 'rule=:duration %field:duration% bytes' +add_rule 'rule=:duration %field:duration%' + +execute 'duration 0:00:42 bytes' +assert_output_json_eq '{"field": "0:00:42"}' + +execute 'duration 0:00:42' +assert_output_json_eq '{"field": "0:00:42"}' + +execute 'duration 9:00:42 bytes' +assert_output_json_eq '{"field": "9:00:42"}' + +execute 'duration 00:00:42 bytes' +assert_output_json_eq '{"field": "00:00:42"}' + +execute 'duration 37:59:42 bytes' +assert_output_json_eq '{"field": "37:59:42"}' + +execute 'duration 37:60:42 bytes' +assert_output_contains '"unparsed-data": "37:60:42 bytes"' + + +cleanup_tmp_files + diff --git a/tests/field_duration_jsoncnf.sh b/tests/field_duration_jsoncnf.sh new file mode 100755 index 0000000..957ddd6 --- /dev/null +++ b/tests/field_duration_jsoncnf.sh @@ -0,0 +1,30 @@ +# added 2015-03-12 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "duration syntax" +add_rule 'version=2' +add_rule 'rule=:duration %{"name":"field", "type":"duration"}% bytes' +add_rule 'rule=:duration %{"name":"field", "type":"duration"}%' + +execute 'duration 0:00:42 bytes' +assert_output_json_eq '{"field": "0:00:42"}' + +execute 'duration 0:00:42' +assert_output_json_eq '{"field": "0:00:42"}' + +execute 'duration 9:00:42 bytes' +assert_output_json_eq '{"field": "9:00:42"}' + +execute 'duration 00:00:42 bytes' +assert_output_json_eq '{"field": "00:00:42"}' + +execute 'duration 37:59:42 bytes' +assert_output_json_eq '{"field": "37:59:42"}' + +execute 'duration 37:60:42 bytes' +assert_output_contains '"unparsed-data": "37:60:42 bytes"' + + +cleanup_tmp_files + diff --git a/tests/field_duration_v1.sh b/tests/field_duration_v1.sh new file mode 100755 index 0000000..efd642e --- /dev/null +++ b/tests/field_duration_v1.sh @@ -0,0 +1,38 @@ +# added 2015-03-12 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. $srcdir/exec.sh + +test_def $0 "duration syntax" +add_rule 'rule=:duration %field:duration% bytes' +add_rule 'rule=:duration %field:duration%' + +execute 'duration 0:00:42 bytes' +assert_output_json_eq '{"field": "0:00:42"}' + +execute 'duration 0:00:42' +assert_output_json_eq '{"field": "0:00:42"}' + +execute 'duration 9:00:42 bytes' +assert_output_json_eq '{"field": "9:00:42"}' + +execute 'duration 00:00:42 bytes' +assert_output_json_eq '{"field": "00:00:42"}' + +execute 'duration 37:59:42 bytes' +assert_output_json_eq '{"field": "37:59:42"}' + +execute 'duration 37:60:42 bytes' +assert_output_contains '"unparsed-data": "37:60:42 bytes"' + + +cleanup_tmp_files + diff --git a/tests/field_float-fmt_number.sh b/tests/field_float-fmt_number.sh new file mode 100755 index 0000000..71cc68a --- /dev/null +++ b/tests/field_float-fmt_number.sh @@ -0,0 +1,36 @@ +# added 2017-10-02 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. 
However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. $srcdir/exec.sh + +test_def $0 "float field" +add_rule 'version=2' +add_rule 'rule=:here is a number %{ "type":"float", "name":"num", "format":"number"}% in floating pt form' +execute 'here is a number 15.9 in floating pt form' +assert_output_json_eq '{"num": 15.9}' + +reset_rules + +# note: floating point numbers are tricky to get right, even more so if negative. +add_rule 'version=2' +add_rule 'rule=:here is a negative number %{ "type":"float", "name":"num", "format":"number"}% for you' +execute 'here is a negative number -4.2 for you' +assert_output_json_eq '{"num": -4.2}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:here is another real number %{ "type":"float", "name":"num", "format":"number"}%.' +execute 'here is another real number 2.71.' +assert_output_json_eq '{"num": 2.71}' + + +cleanup_tmp_files diff --git a/tests/field_float.sh b/tests/field_float.sh new file mode 100755 index 0000000..c7c964c --- /dev/null +++ b/tests/field_float.sh @@ -0,0 +1,27 @@ +# added 2015-02-25 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "float field" +add_rule 'version=2' +add_rule 'rule=:here is a number %num:float% in floating pt form' +execute 'here is a number 15.9 in floating pt form' +assert_output_json_eq '{"num": "15.9"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:here is a negative number %num:float% for you' +execute 'here is a negative number -4.2 for you' +assert_output_json_eq '{"num": "-4.2"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:here is another real number %real_no:float%.' +execute 'here is another real number 2.71.' +assert_output_json_eq '{"real_no": "2.71"}' + + +cleanup_tmp_files + diff --git a/tests/field_float_jsoncnf.sh b/tests/field_float_jsoncnf.sh new file mode 100755 index 0000000..0c53c19 --- /dev/null +++ b/tests/field_float_jsoncnf.sh @@ -0,0 +1,27 @@ +# added 2015-02-25 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "float field" +add_rule 'version=2' +add_rule 'rule=:here is a number %{"name":"num", "type":"float"}% in floating pt form' +execute 'here is a number 15.9 in floating pt form' +assert_output_json_eq '{"num": "15.9"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:here is a negative number %{"name":"num", "type":"float"}% for you' +execute 'here is a negative number -4.2 for you' +assert_output_json_eq '{"num": "-4.2"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:here is another real number %{"name":"real_no", "type":"float"}%.' +execute 'here is another real number 2.71.' +assert_output_json_eq '{"real_no": "2.71"}' + + +cleanup_tmp_files + diff --git a/tests/field_float_v1.sh b/tests/field_float_v1.sh new file mode 100755 index 0000000..b2f63b9 --- /dev/null +++ b/tests/field_float_v1.sh @@ -0,0 +1,33 @@ +# added 2015-02-25 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "float field" +add_rule 'rule=:here is a number %num:float% in floating pt form' +execute 'here is a number 15.9 in floating pt form' +assert_output_json_eq '{"num": "15.9"}' + +reset_rules + +add_rule 'rule=:here is a negative number %num:float% for you' +execute 'here is a negative number -4.2 for you' +assert_output_json_eq '{"num": "-4.2"}' + +reset_rules + +add_rule 'rule=:here is another real number %real_no:float%.' +execute 'here is another real number 2.71.' +assert_output_json_eq '{"real_no": "2.71"}' + + +cleanup_tmp_files + diff --git a/tests/field_float_with_invalid_ruledef.sh b/tests/field_float_with_invalid_ruledef.sh new file mode 100755 index 0000000..bdaebaa --- /dev/null +++ b/tests/field_float_with_invalid_ruledef.sh @@ -0,0 +1,13 @@ +# added 2015-02-26 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "float with invalid field-declaration" + +add_rule 'rule=:%no:flo% foo' +execute '10.0 foo' +assert_output_json_eq '{ "originalmsg": "10.0 foo", "unparsed-data": "10.0 foo" }' + + +cleanup_tmp_files + diff --git a/tests/field_hexnumber-fmt_number.sh b/tests/field_hexnumber-fmt_number.sh new file mode 100755 index 0000000..5156117 --- /dev/null +++ b/tests/field_hexnumber-fmt_number.sh @@ -0,0 +1,18 @@ +# added 2017-10-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "hexnumber field" +add_rule 'version=2' +add_rule 'rule=:here is a number %{ "type":"hexnumber", "name":"num", "format":"number"} % in hex form' +execute 'here is a number 0x1234 in hex form' +assert_output_json_eq '{"num": 4660}' + +#check cases where parsing failure must occur +execute 'here is a number 0x1234in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }' + + +cleanup_tmp_files + diff --git a/tests/field_hexnumber.sh b/tests/field_hexnumber.sh new file mode 100755 index 0000000..99a6990 --- /dev/null +++ b/tests/field_hexnumber.sh @@ -0,0 +1,18 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "hexnumber field" +add_rule 'version=2' +add_rule 'rule=:here is a number %num:hexnumber% in hex form' +execute 'here is a number 0x1234 in hex form' +assert_output_json_eq '{"num": "0x1234"}' + +#check cases where parsing failure must occur +execute 'here is a number 0x1234in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }' + + +cleanup_tmp_files + diff --git a/tests/field_hexnumber_jsoncnf.sh b/tests/field_hexnumber_jsoncnf.sh new file mode 100755 index 0000000..50339db --- /dev/null +++ b/tests/field_hexnumber_jsoncnf.sh @@ -0,0 +1,15 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "hexnumber field" +add_rule 'version=2' +add_rule 'rule=:here is a number %{"name":"num", "type":"hexnumber"}% in hex form' +execute 'here is a number 0x1234 in hex form' +assert_output_json_eq '{"num": "0x1234"}' + +#check cases where parsing failure must occur +execute 'here is a number 0x1234in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }' + +cleanup_tmp_files diff --git a/tests/field_hexnumber_range.sh b/tests/field_hexnumber_range.sh new file mode 100755 index 0000000..99c085c --- /dev/null +++ b/tests/field_hexnumber_range.sh @@ -0,0 +1,22 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "hexnumber field with range checks" +add_rule 'version=2' +add_rule 'rule=:here is a number %num:hexnumber{"maxval":191}% in hex form' +execute 'here is a number 0x12 in hex form' +assert_output_json_eq '{"num": "0x12"}' +execute 'here is a number 0x0 in hex form' +assert_output_json_eq '{"num": "0x0"}' +execute 'here is a number 0xBf in hex form' +assert_output_json_eq '{"num": "0xBf"}' + +#check cases where parsing failure must occur +execute 'here is a number 0xc0 in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0xc0 in hex form", "unparsed-data": "0xc0 in hex form" }' + + +cleanup_tmp_files + diff --git a/tests/field_hexnumber_range_jsoncnf.sh b/tests/field_hexnumber_range_jsoncnf.sh new file mode 100755 index 0000000..efad522 --- /dev/null +++ b/tests/field_hexnumber_range_jsoncnf.sh @@ -0,0 +1,22 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "hexnumber field with range checks" +add_rule 'version=2' +add_rule 'rule=:here is a number %{"name":"num", "type":"hexnumber", "maxval":191}% in hex form' +execute 'here is a number 0x12 in hex form' +assert_output_json_eq '{"num": "0x12"}' +execute 'here is a number 0x0 in hex form' +assert_output_json_eq '{"num": "0x0"}' +execute 'here is a number 0xBf in hex form' +assert_output_json_eq '{"num": "0xBf"}' + +#check cases where parsing failure must occur +execute 'here is a number 0xc0 in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0xc0 in hex form", "unparsed-data": "0xc0 in hex form" }' + + +cleanup_tmp_files + diff --git a/tests/field_hexnumber_v1.sh b/tests/field_hexnumber_v1.sh new file mode 100755 index 0000000..967da07 --- /dev/null +++ b/tests/field_hexnumber_v1.sh @@ -0,0 +1,26 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi + +. 
$srcdir/exec.sh + +test_def $0 "hexnumber field" +add_rule 'rule=:here is a number %num:hexnumber% in hex form' +execute 'here is a number 0x1234 in hex form' +assert_output_json_eq '{"num": "0x1234"}' + +#check cases where parsing failure must occur +execute 'here is a number 0x1234in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }' + + +cleanup_tmp_files + diff --git a/tests/field_interpret.sh b/tests/field_interpret.sh new file mode 100755 index 0000000..4a07132 --- /dev/null +++ b/tests/field_interpret.sh @@ -0,0 +1,65 @@ +# added 2014-12-11 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. $srcdir/exec.sh + +test_def $0 "value interpreting field" + +add_rule 'rule=:%session_count:interpret:int:word% sessions established' +execute '64 sessions established' +assert_output_json_eq '{"session_count": 64}' + +reset_rules +add_rule 'rule=:max sessions limit reached: %at_limit:interpret:bool:word%' +execute 'max sessions limit reached: true' +assert_output_json_eq '{"at_limit": true}' +execute 'max sessions limit reached: false' +assert_output_json_eq '{"at_limit": false}' +execute 'max sessions limit reached: TRUE' +assert_output_json_eq '{"at_limit": true}' +execute 'max sessions limit reached: FALSE' +assert_output_json_eq '{"at_limit": false}' +execute 'max sessions limit reached: yes' +assert_output_json_eq '{"at_limit": true}' +execute 'max sessions limit reached: no' +assert_output_json_eq '{"at_limit": false}' +execute 'max sessions limit reached: YES' +assert_output_json_eq '{"at_limit": true}' +execute 'max sessions limit reached: NO' +assert_output_json_eq '{"at_limit": false}' + +reset_rules +add_rule 'rule=:record count for shard [%shard:interpret:base16int:char-to:]%] is %record_count:interpret:base10int:number% and %latency_percentile:interpret:float:char-to:\x25%\x25ile latency is %latency:interpret:float:word% %latency_unit:word%' +execute 'record count for shard [3F] is 50000 and 99.99%ile latency is 2.1 seconds' +assert_output_json_eq '{"shard": 63, "record_count": 50000, "latency_percentile": 99.99, "latency": 2.1, "latency_unit" : "seconds"}' + +reset_rules +add_rule 'rule=:%latency_percentile:interpret:float:char-to:\x25%\x25ile latency is %latency:interpret:float:word%' +execute '98.1%ile latency is 1.999123' +assert_output_json_eq '{"latency_percentile": 98.1, "latency": 1.999123}' + +reset_rules +add_rule 'rule=:%latency_percentile:interpret:float:number%' +add_rule 'rule=:%latency_percentile:interpret:int:number%' +add_rule 'rule=:%latency_percentile:interpret:base16int:number%' +add_rule 'rule=:%latency_percentile:interpret:base10int:number%' +add_rule 'rule=:%latency_percentile:interpret:boolean:number%' +execute 'foo' +assert_output_json_eq '{ "originalmsg": "foo", "unparsed-data": "foo" }' + +reset_rules +add_rule 'rule=:gc pause: %pause_time:interpret:float:float%ms' +execute 'gc pause: 12.3ms' +assert_output_json_eq '{"pause_time": 12.3}' + + +cleanup_tmp_files + diff --git a/tests/field_interpret_with_invalid_ruledef.sh b/tests/field_interpret_with_invalid_ruledef.sh new file mode 100755 index 0000000..a303edf --- /dev/null +++ 
b/tests/field_interpret_with_invalid_ruledef.sh @@ -0,0 +1,59 @@ +# added 2014-12-11 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "value interpreting field, with invalid ruledef" + +add_rule 'rule=:%session_count:interpret:int:wd% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:interpret:int:% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:interpret:int% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:interpret:in% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:interpret:in:word% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:interpret:in:wd% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:interpret::word% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:interpret::% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:inter::% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + +reset_rules +add_rule 'rule=:%session_count:inter:int:word% sessions established' +execute '64 sessions established' +assert_output_json_eq '{ "originalmsg": "64 sessions established", "unparsed-data": "64 sessions established" }' + + + +cleanup_tmp_files + diff --git a/tests/field_ipv6.sh b/tests/field_ipv6.sh new file mode 100755 index 0000000..3887a55 --- /dev/null +++ b/tests/field_ipv6.sh @@ -0,0 +1,57 @@ +# added 2015-06-23 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +add_rule 'version=2' +test_def $0 "IPv6 parser" +add_rule 'rule=:%f:ipv6%' + +# examples from RFC4291, sect. 
2.2 +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345:6789' +assert_output_json_eq '{ "f": "ABCD:EF01:2345:6789:ABCD:EF01:2345:6789" }' +execute 'ABCD:EF01:2345:6789:abcd:EF01:2345:6789' # mixed hex case +assert_output_json_eq '{ "f": "ABCD:EF01:2345:6789:abcd:EF01:2345:6789" }' +execute '2001:DB8:0:0:8:800:200C:417A' +assert_output_json_eq '{ "f": "2001:DB8:0:0:8:800:200C:417A" }' + +execute '0:0:0:0:0:0:0:1' +assert_output_json_eq '{ "f": "0:0:0:0:0:0:0:1" }' + +execute '2001:DB8::8:800:200C:417A' +assert_output_json_eq '{ "f": "2001:DB8::8:800:200C:417A" }' +execute 'FF01::101' +assert_output_json_eq '{ "f": "FF01::101" }' +execute '::1' +assert_output_json_eq '{ "f": "::1" }' +execute '::' +assert_output_json_eq '{ "f": "::" }' + +execute '0:0:0:0:0:0:13.1.68.3' +assert_output_json_eq '{ "f": "0:0:0:0:0:0:13.1.68.3" }' +execute '::13.1.68.3' +assert_output_json_eq '{ "f": "::13.1.68.3" }' +execute '::FFFF:129.144.52.38' +assert_output_json_eq '{ "f": "::FFFF:129.144.52.38" }' + +# invalid samples +execute '2001:DB8::8::800:200C:417A' # two :: sequences +assert_output_json_eq '{ "originalmsg": "2001:DB8::8::800:200C:417A", "unparsed-data": "2001:DB8::8::800:200C:417A" }' + +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345::6789' # :: with too many blocks +assert_output_json_eq '{ "originalmsg": "ABCD:EF01:2345:6789:ABCD:EF01:2345::6789", "unparsed-data": "ABCD:EF01:2345:6789:ABCD:EF01:2345::6789" }' + +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798' # too many blocks (9) +assert_output_json_eq '{"originalmsg": "ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798", "unparsed-data": "ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798" }' + +execute ':0:0:0:0:0:0:1' # missing first digit +assert_output_json_eq '{ "originalmsg": ":0:0:0:0:0:0:1", "unparsed-data": ":0:0:0:0:0:0:1" }' + +execute '0:0:0:0:0:0:0:' # missing last digit +assert_output_json_eq '{ "originalmsg": "0:0:0:0:0:0:0:", "unparsed-data": "0:0:0:0:0:0:0:" }' + +execute '13.1.68.3' # pure IPv4 address +assert_output_json_eq '{ "originalmsg": "13.1.68.3", "unparsed-data": "13.1.68.3" }' + + +cleanup_tmp_files + diff --git a/tests/field_ipv6_jsoncnf.sh b/tests/field_ipv6_jsoncnf.sh new file mode 100755 index 0000000..3a75059 --- /dev/null +++ b/tests/field_ipv6_jsoncnf.sh @@ -0,0 +1,57 @@ +# added 2015-06-23 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "IPv6 parser" +add_rule 'version=2' +add_rule 'rule=:%{"name":"f", "type":"ipv6"}%' + +# examples from RFC4291, sect. 
2.2 +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345:6789' +assert_output_json_eq '{ "f": "ABCD:EF01:2345:6789:ABCD:EF01:2345:6789" }' +execute 'ABCD:EF01:2345:6789:abcd:EF01:2345:6789' # mixed hex case +assert_output_json_eq '{ "f": "ABCD:EF01:2345:6789:abcd:EF01:2345:6789" }' +execute '2001:DB8:0:0:8:800:200C:417A' +assert_output_json_eq '{ "f": "2001:DB8:0:0:8:800:200C:417A" }' + +execute '0:0:0:0:0:0:0:1' +assert_output_json_eq '{ "f": "0:0:0:0:0:0:0:1" }' + +execute '2001:DB8::8:800:200C:417A' +assert_output_json_eq '{ "f": "2001:DB8::8:800:200C:417A" }' +execute 'FF01::101' +assert_output_json_eq '{ "f": "FF01::101" }' +execute '::1' +assert_output_json_eq '{ "f": "::1" }' +execute '::' +assert_output_json_eq '{ "f": "::" }' + +execute '0:0:0:0:0:0:13.1.68.3' +assert_output_json_eq '{ "f": "0:0:0:0:0:0:13.1.68.3" }' +execute '::13.1.68.3' +assert_output_json_eq '{ "f": "::13.1.68.3" }' +execute '::FFFF:129.144.52.38' +assert_output_json_eq '{ "f": "::FFFF:129.144.52.38" }' + +# invalid samples +execute '2001:DB8::8::800:200C:417A' # two :: sequences +assert_output_json_eq '{ "originalmsg": "2001:DB8::8::800:200C:417A", "unparsed-data": "2001:DB8::8::800:200C:417A" }' + +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345::6789' # :: with too many blocks +assert_output_json_eq '{ "originalmsg": "ABCD:EF01:2345:6789:ABCD:EF01:2345::6789", "unparsed-data": "ABCD:EF01:2345:6789:ABCD:EF01:2345::6789" }' + +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798' # too many blocks (9) +assert_output_json_eq '{"originalmsg": "ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798", "unparsed-data": "ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798" }' + +execute ':0:0:0:0:0:0:1' # missing first digit +assert_output_json_eq '{ "originalmsg": ":0:0:0:0:0:0:1", "unparsed-data": ":0:0:0:0:0:0:1" }' + +execute '0:0:0:0:0:0:0:' # missing last digit +assert_output_json_eq '{ "originalmsg": "0:0:0:0:0:0:0:", "unparsed-data": "0:0:0:0:0:0:0:" }' + +execute '13.1.68.3' # pure IPv4 address +assert_output_json_eq '{ "originalmsg": "13.1.68.3", "unparsed-data": "13.1.68.3" }' + + +cleanup_tmp_files + diff --git a/tests/field_ipv6_v1.sh b/tests/field_ipv6_v1.sh new file mode 100755 index 0000000..3655685 --- /dev/null +++ b/tests/field_ipv6_v1.sh @@ -0,0 +1,56 @@ +# added 2015-06-23 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "IPv6 parser" +add_rule 'rule=:%f:ipv6%' + +# examples from RFC4291, sect. 
2.2 +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345:6789' +assert_output_json_eq '{ "f": "ABCD:EF01:2345:6789:ABCD:EF01:2345:6789" }' +execute 'ABCD:EF01:2345:6789:abcd:EF01:2345:6789' # mixed hex case +assert_output_json_eq '{ "f": "ABCD:EF01:2345:6789:abcd:EF01:2345:6789" }' +execute '2001:DB8:0:0:8:800:200C:417A' +assert_output_json_eq '{ "f": "2001:DB8:0:0:8:800:200C:417A" }' + +execute '0:0:0:0:0:0:0:1' +assert_output_json_eq '{ "f": "0:0:0:0:0:0:0:1" }' + +execute '2001:DB8::8:800:200C:417A' +assert_output_json_eq '{ "f": "2001:DB8::8:800:200C:417A" }' +execute 'FF01::101' +assert_output_json_eq '{ "f": "FF01::101" }' +execute '::1' +assert_output_json_eq '{ "f": "::1" }' +execute '::' +assert_output_json_eq '{ "f": "::" }' + +execute '0:0:0:0:0:0:13.1.68.3' +assert_output_json_eq '{ "f": "0:0:0:0:0:0:13.1.68.3" }' +execute '::13.1.68.3' +assert_output_json_eq '{ "f": "::13.1.68.3" }' +execute '::FFFF:129.144.52.38' +assert_output_json_eq '{ "f": "::FFFF:129.144.52.38" }' + +# invalid samples +execute '2001:DB8::8::800:200C:417A' # two :: sequences +assert_output_json_eq '{ "originalmsg": "2001:DB8::8::800:200C:417A", "unparsed-data": "2001:DB8::8::800:200C:417A" }' + +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345::6789' # :: with too many blocks +assert_output_json_eq '{ "originalmsg": "ABCD:EF01:2345:6789:ABCD:EF01:2345::6789", "unparsed-data": "ABCD:EF01:2345:6789:ABCD:EF01:2345::6789" }' + +execute 'ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798' # too many blocks (9) +assert_output_json_eq '{"originalmsg": "ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798", "unparsed-data": "ABCD:EF01:2345:6789:ABCD:EF01:2345:1:6798" }' + +execute ':0:0:0:0:0:0:1' # missing first digit +assert_output_json_eq '{ "originalmsg": ":0:0:0:0:0:0:1", "unparsed-data": ":0:0:0:0:0:0:1" }' + +execute '0:0:0:0:0:0:0:' # missing last digit +assert_output_json_eq '{ "originalmsg": "0:0:0:0:0:0:0:", "unparsed-data": "0:0:0:0:0:0:0:" }' + +execute '13.1.68.3' # pure IPv4 address +assert_output_json_eq '{ "originalmsg": "13.1.68.3", "unparsed-data": "13.1.68.3" }' + + +cleanup_tmp_files + diff --git a/tests/field_json.sh b/tests/field_json.sh new file mode 100755 index 0000000..53e0915 --- /dev/null +++ b/tests/field_json.sh @@ -0,0 +1,65 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "JSON field" +add_rule 'version=2' +add_rule 'rule=:%field:json%' + +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# let's see if something more complicated still works, so ADD some +# more rules +add_rule 'rule=:begin %field:json%' +add_rule 'rule=:begin %field:json%end' +add_rule 'rule=:%field:json%end' + +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +#check if trailinge whitspace is ignored +execute '{"f1": "1", "f2": 2} ' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute 'begin {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute 'begin {"f1": "1", "f2": 2}end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +# note: the parser takes all whitespace after the JSON +# to be part of it! 
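+# In other words (a sketch based on the rules added above): with
+#   rule=:begin %field:json%end
+# the json parser consumes the JSON value plus any whitespace that follows it,
+# so the literal "end" of the rule matches right afterwards and the samples
+# below normalize to the same result regardless of the spacing before "end".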
+execute 'begin {"f1": "1", "f2": 2} end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +execute 'begin {"f1": "1", "f2": 2} end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '{"f1": "1", "f2": 2}end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +#check cases where parsing failure must occur +execute '{"f1": "1", f2: 2}' +assert_output_json_eq '{ "originalmsg": "{\"f1\": \"1\", f2: 2}", "unparsed-data": "{\"f1\": \"1\", f2: 2}" }' + +#some more complex cases +add_rule 'rule=:%field1:json%-%field2:json%' + +execute '{"f1": "1"}-{"f2": 2}' +assert_output_json_eq '{ "field2": { "f2": 2 }, "field1": { "f1": "1" } }' + +# re-check previsous def still works +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# now check some strange cases +reset_rules +add_rule 'version=2' +add_rule 'rule=:%field:json%' + +# this check is because of bug in json-c: +# https://github.com/json-c/json-c/issues/181 +execute '15:00' +assert_output_json_eq '{ "originalmsg": "15:00", "unparsed-data": "15:00" }' + + +cleanup_tmp_files + diff --git a/tests/field_json_jsoncnf.sh b/tests/field_json_jsoncnf.sh new file mode 100755 index 0000000..4573e2d --- /dev/null +++ b/tests/field_json_jsoncnf.sh @@ -0,0 +1,65 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "JSON field" +add_rule 'version=2' +add_rule 'rule=:%{"name":"field", "type":"json"}%' + +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# let's see if something more complicated still works, so ADD some +# more rules +add_rule 'rule=:begin %field:json%' +add_rule 'rule=:begin %field:json%end' +add_rule 'rule=:%field:json%end' + +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +#check if trailinge whitspace is ignored +execute '{"f1": "1", "f2": 2} ' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute 'begin {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute 'begin {"f1": "1", "f2": 2}end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +# note: the parser takes all whitespace after the JSON +# to be part of it! 
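+# Side note on notation: this rulebase mixes the v2 JSON-style definition
+# %{"name":"field", "type":"json"}% with the colon form %field:json%; judging
+# by the assertions in this file, both spellings describe the same field
+# (name "field", type "json") and yield identical output.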
+execute 'begin {"f1": "1", "f2": 2} end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +execute 'begin {"f1": "1", "f2": 2} end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '{"f1": "1", "f2": 2}end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +#check cases where parsing failure must occur +execute '{"f1": "1", f2: 2}' +assert_output_json_eq '{ "originalmsg": "{\"f1\": \"1\", f2: 2}", "unparsed-data": "{\"f1\": \"1\", f2: 2}" }' + +#some more complex cases +add_rule 'rule=:%field1:json%-%field2:json%' + +execute '{"f1": "1"}-{"f2": 2}' +assert_output_json_eq '{ "field2": { "f2": 2 }, "field1": { "f1": "1" } }' + +# re-check previsous def still works +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# now check some strange cases +reset_rules +add_rule 'version=2' +add_rule 'rule=:%field:json%' + +# this check is because of bug in json-c: +# https://github.com/json-c/json-c/issues/181 +execute '15:00' +assert_output_json_eq '{ "originalmsg": "15:00", "unparsed-data": "15:00" }' + + +cleanup_tmp_files + diff --git a/tests/field_json_v1.sh b/tests/field_json_v1.sh new file mode 100755 index 0000000..a7c440e --- /dev/null +++ b/tests/field_json_v1.sh @@ -0,0 +1,63 @@ +# added 2015-03-01 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "JSON field" +add_rule 'rule=:%field:json%' + +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# let's see if something more complicated still works, so ADD some +# more rules +add_rule 'rule=:begin %field:json%' +add_rule 'rule=:begin %field:json%end' +add_rule 'rule=:%field:json%end' + +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +#check if trailinge whitspace is ignored +execute '{"f1": "1", "f2": 2} ' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute 'begin {"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute 'begin {"f1": "1", "f2": 2}end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +# note: the parser takes all whitespace after the JSON +# to be part of it! 
+execute 'begin {"f1": "1", "f2": 2} end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' +execute 'begin {"f1": "1", "f2": 2} end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +execute '{"f1": "1", "f2": 2}end' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +#check cases where parsing failure must occur +execute '{"f1": "1", f2: 2}' +assert_output_json_eq '{ "originalmsg": "{\"f1\": \"1\", f2: 2}", "unparsed-data": "{\"f1\": \"1\", f2: 2}" }' + +#some more complex cases +add_rule 'rule=:%field1:json%-%field2:json%' + +execute '{"f1": "1"}-{"f2": 2}' +assert_output_json_eq '{ "field2": { "f2": 2 }, "field1": { "f1": "1" } }' + +# re-check previsous def still works +execute '{"f1": "1", "f2": 2}' +assert_output_json_eq '{ "field": { "f1": "1", "f2": 2 } }' + +# now check some strange cases +reset_rules +add_rule 'rule=:%field:json%' + +# this check is because of bug in json-c: +# https://github.com/json-c/json-c/issues/181 +execute '15:00' +assert_output_json_eq '{ "originalmsg": "15:00", "unparsed-data": "15:00" }' + + +cleanup_tmp_files + diff --git a/tests/field_kernel_timestamp.sh b/tests/field_kernel_timestamp.sh new file mode 100755 index 0000000..e2a5afb --- /dev/null +++ b/tests/field_kernel_timestamp.sh @@ -0,0 +1,52 @@ +# added 2015-03-12 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "kernel timestamp parser" +add_rule 'version=2' +add_rule 'rule=:begin %timestamp:kernel-timestamp% end' +execute 'begin [12345.123456] end' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:begin %timestamp:kernel-timestamp%' +execute 'begin [12345.123456]' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:%timestamp:kernel-timestamp%' +execute '[12345.123456]' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +execute '[154469.133028]' +assert_output_json_eq '{ "timestamp": "[154469.133028]"}' + +execute '[123456789012.123456]' +assert_output_json_eq '{ "timestamp": "[123456789012.123456]"}' + +#check cases where parsing failure must occur +execute '[1234.123456]' +assert_output_json_eq '{"originalmsg": "[1234.123456]", "unparsed-data": "[1234.123456]" }' + +execute '[1234567890123.123456]' +assert_output_json_eq '{"originalmsg": "[1234567890123.123456]", "unparsed-data": "[1234567890123.123456]" }' + +execute '[123456789012.12345]' +assert_output_json_eq '{ "originalmsg": "[123456789012.12345]", "unparsed-data": "[123456789012.12345]" }' + +execute '[123456789012.1234567]' +assert_output_json_eq '{ "originalmsg": "[123456789012.1234567]", "unparsed-data": "[123456789012.1234567]" }' + +execute '(123456789012.123456]' +assert_output_json_eq '{ "originalmsg": "(123456789012.123456]", "unparsed-data": "(123456789012.123456]" }' + +execute '[123456789012.123456' +assert_output_json_eq '{ "originalmsg": "[123456789012.123456", "unparsed-data": "[123456789012.123456" }' + + +cleanup_tmp_files + diff --git a/tests/field_kernel_timestamp_jsoncnf.sh b/tests/field_kernel_timestamp_jsoncnf.sh new file mode 100755 index 0000000..32f9287 --- /dev/null +++ b/tests/field_kernel_timestamp_jsoncnf.sh @@ -0,0 +1,52 @@ +# added 2015-03-12 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "kernel timestamp parser" +add_rule 'version=2' +add_rule 'rule=:begin %{"name":"timestamp", "type":"kernel-timestamp"}% end' +execute 'begin [12345.123456] end' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:begin %{"name":"timestamp", "type":"kernel-timestamp"}%' +execute 'begin [12345.123456]' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +reset_rules + +add_rule 'version=2' +add_rule 'rule=:%{"name":"timestamp", "type":"kernel-timestamp"}%' +execute '[12345.123456]' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +execute '[154469.133028]' +assert_output_json_eq '{ "timestamp": "[154469.133028]"}' + +execute '[123456789012.123456]' +assert_output_json_eq '{ "timestamp": "[123456789012.123456]"}' + +#check cases where parsing failure must occur +execute '[1234.123456]' +assert_output_json_eq '{"originalmsg": "[1234.123456]", "unparsed-data": "[1234.123456]" }' + +execute '[1234567890123.123456]' +assert_output_json_eq '{"originalmsg": "[1234567890123.123456]", "unparsed-data": "[1234567890123.123456]" }' + +execute '[123456789012.12345]' +assert_output_json_eq '{ "originalmsg": "[123456789012.12345]", "unparsed-data": "[123456789012.12345]" }' + +execute '[123456789012.1234567]' +assert_output_json_eq '{ "originalmsg": "[123456789012.1234567]", "unparsed-data": "[123456789012.1234567]" }' + +execute '(123456789012.123456]' +assert_output_json_eq '{ "originalmsg": "(123456789012.123456]", "unparsed-data": "(123456789012.123456]" }' + +execute '[123456789012.123456' +assert_output_json_eq '{ "originalmsg": "[123456789012.123456", "unparsed-data": "[123456789012.123456" }' + + +cleanup_tmp_files + diff --git a/tests/field_kernel_timestamp_v1.sh b/tests/field_kernel_timestamp_v1.sh new file mode 100755 index 0000000..25f96e7 --- /dev/null +++ b/tests/field_kernel_timestamp_v1.sh @@ -0,0 +1,58 @@ +# added 2015-03-12 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "kernel timestamp parser" +add_rule 'rule=:begin %timestamp:kernel-timestamp% end' +execute 'begin [12345.123456] end' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +reset_rules + +add_rule 'rule=:begin %timestamp:kernel-timestamp%' +execute 'begin [12345.123456]' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +reset_rules + +add_rule 'rule=:%timestamp:kernel-timestamp%' +execute '[12345.123456]' +assert_output_json_eq '{ "timestamp": "[12345.123456]"}' + +execute '[154469.133028]' +assert_output_json_eq '{ "timestamp": "[154469.133028]"}' + +execute '[123456789012.123456]' +assert_output_json_eq '{ "timestamp": "[123456789012.123456]"}' + +#check cases where parsing failure must occur +execute '[1234.123456]' +assert_output_json_eq '{"originalmsg": "[1234.123456]", "unparsed-data": "[1234.123456]" }' + +execute '[1234567890123.123456]' +assert_output_json_eq '{"originalmsg": "[1234567890123.123456]", "unparsed-data": "[1234567890123.123456]" }' + +execute '[123456789012.12345]' +assert_output_json_eq '{ "originalmsg": "[123456789012.12345]", "unparsed-data": "[123456789012.12345]" }' + +execute '[123456789012.1234567]' +assert_output_json_eq '{ "originalmsg": "[123456789012.1234567]", "unparsed-data": "[123456789012.1234567]" }' + +execute '(123456789012.123456]' +assert_output_json_eq '{ "originalmsg": "(123456789012.123456]", "unparsed-data": "(123456789012.123456]" }' + +execute '[123456789012.123456' +assert_output_json_eq '{ "originalmsg": "[123456789012.123456", "unparsed-data": "[123456789012.123456" }' + + +cleanup_tmp_files + diff --git a/tests/field_mac48.sh b/tests/field_mac48.sh new file mode 100755 index 0000000..4881846 --- /dev/null +++ b/tests/field_mac48.sh @@ -0,0 +1,26 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "dmac48 syntax" + +reset_rules +add_rule 'version=2' +add_rule 'rule=:%field:mac48%' + +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +execute 'f0-f6-1c-5f-cc-a2' +assert_output_json_eq '{"field": "f0-f6-1c-5f-cc-a2"}' + +# things that need to NOT match + +execute 'f0-f6:1c:5f:cc-a2' +assert_output_json_eq '{ "originalmsg": "f0-f6:1c:5f:cc-a2", "unparsed-data": "f0-f6:1c:5f:cc-a2" }' + +execute 'f0:f6:1c:xf:cc:a2' +assert_output_json_eq '{ "originalmsg": "f0:f6:1c:xf:cc:a2", "unparsed-data": "f0:f6:1c:xf:cc:a2" }' + + +cleanup_tmp_files diff --git a/tests/field_mac48_jsoncnf.sh b/tests/field_mac48_jsoncnf.sh new file mode 100755 index 0000000..ae7575a --- /dev/null +++ b/tests/field_mac48_jsoncnf.sh @@ -0,0 +1,25 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "dmac48 syntax" +add_rule 'version=2' +add_rule 'rule=:%{"name":"field", "type":"mac48"}%' + +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +execute 'f0-f6-1c-5f-cc-a2' +assert_output_json_eq '{"field": "f0-f6-1c-5f-cc-a2"}' + +# things that need to NOT match + +execute 'f0-f6:1c:5f:cc-a2' +assert_output_json_eq '{ "originalmsg": "f0-f6:1c:5f:cc-a2", "unparsed-data": "f0-f6:1c:5f:cc-a2" }' + +execute 'f0:f6:1c:xf:cc:a2' +assert_output_json_eq '{ "originalmsg": "f0:f6:1c:xf:cc:a2", "unparsed-data": "f0:f6:1c:xf:cc:a2" }' + + +cleanup_tmp_files + diff --git a/tests/field_mac48_v1.sh b/tests/field_mac48_v1.sh new file mode 100755 index 0000000..bd2898e --- /dev/null +++ b/tests/field_mac48_v1.sh @@ -0,0 +1,24 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "dmac48 syntax" +add_rule 'rule=:%field:mac48%' + +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +execute 'f0-f6-1c-5f-cc-a2' +assert_output_json_eq '{"field": "f0-f6-1c-5f-cc-a2"}' + +# things that need to NOT match + +execute 'f0-f6:1c:5f:cc-a2' +assert_output_json_eq '{ "originalmsg": "f0-f6:1c:5f:cc-a2", "unparsed-data": "f0-f6:1c:5f:cc-a2" }' + +execute 'f0:f6:1c:xf:cc:a2' +assert_output_json_eq '{ "originalmsg": "f0:f6:1c:xf:cc:a2", "unparsed-data": "f0:f6:1c:xf:cc:a2" }' + + +cleanup_tmp_files + diff --git a/tests/field_name_value.sh b/tests/field_name_value.sh new file mode 100755 index 0000000..3a6c8fe --- /dev/null +++ b/tests/field_name_value.sh @@ -0,0 +1,33 @@ +# added 2015-04-25 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "name/value parser" +add_rule 'version=2' +add_rule 'rule=:%f:name-value-list%' + +execute 'name=value' +assert_output_json_eq '{ "f": { "name": "value" } }' + +execute 'name1=value1 name2=value2 name3=value3' +assert_output_json_eq '{ "f": { "name1": "value1", "name2": "value2", "name3": "value3" } }' + +execute 'name1=value1 name2=value2 name3=value3 ' +assert_output_json_eq '{ "f": { "name1": "value1", "name2": "value2", "name3": "value3" } }' + +execute 'name1= name2=value2 name3=value3 ' +assert_output_json_eq '{ "f": { "name1": "", "name2": "value2", "name3": "value3" } }' + +execute 'origin=core.action processed=67 failed=0 suspended=0 suspended.duration=0 resumed=0 ' +assert_output_json_eq '{ "f": { "origin": "core.action", "processed": "67", "failed": "0", "suspended": "0", "suspended.duration": "0", "resumed": "0" } }' + +# check for required non-matches +execute 'name' +assert_output_json_eq ' {"originalmsg": "name", "unparsed-data": "name" }' + +execute 'noname1 name2=value2 name3=value3 ' +assert_output_json_eq '{ "originalmsg": "noname1 name2=value2 name3=value3 ", "unparsed-data": "noname1 name2=value2 name3=value3 " }' + + +cleanup_tmp_files + diff --git a/tests/field_name_value_jsoncnf.sh b/tests/field_name_value_jsoncnf.sh new file mode 100755 index 0000000..a307f5c --- /dev/null +++ b/tests/field_name_value_jsoncnf.sh @@ -0,0 +1,33 @@ +# added 2015-04-25 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "name/value parser" +add_rule 'version=2' +add_rule 'rule=:%{"name":"f", "type":"name-value-list"}%' + +execute 'name=value' +assert_output_json_eq '{ "f": { "name": "value" } }' + +execute 'name1=value1 name2=value2 name3=value3' +assert_output_json_eq '{ "f": { "name1": "value1", "name2": "value2", "name3": "value3" } }' + +execute 'name1=value1 name2=value2 name3=value3 ' +assert_output_json_eq '{ "f": { "name1": "value1", "name2": "value2", "name3": "value3" } }' + +execute 'name1= name2=value2 name3=value3 ' +assert_output_json_eq '{ "f": { "name1": "", "name2": "value2", "name3": "value3" } }' + +execute 'origin=core.action processed=67 failed=0 suspended=0 suspended.duration=0 resumed=0 ' +assert_output_json_eq '{ "f": { "origin": "core.action", "processed": "67", "failed": "0", "suspended": "0", "suspended.duration": "0", "resumed": "0" } }' + +# check for required non-matches +execute 'name' +assert_output_json_eq ' {"originalmsg": "name", "unparsed-data": "name" }' + +execute 'noname1 name2=value2 name3=value3 ' +assert_output_json_eq '{ "originalmsg": "noname1 name2=value2 name3=value3 ", "unparsed-data": "noname1 name2=value2 name3=value3 " }' + + +cleanup_tmp_files + diff --git a/tests/field_name_value_v1.sh b/tests/field_name_value_v1.sh new file mode 100755 index 0000000..e853b59 --- /dev/null +++ b/tests/field_name_value_v1.sh @@ -0,0 +1,32 @@ +# added 2015-04-25 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "name/value parser" +add_rule 'rule=:%f:name-value-list%' + +execute 'name=value' +assert_output_json_eq '{ "f": { "name": "value" } }' + +execute 'name1=value1 name2=value2 name3=value3' +assert_output_json_eq '{ "f": { "name1": "value1", "name2": "value2", "name3": "value3" } }' + +execute 'name1=value1 name2=value2 name3=value3 ' +assert_output_json_eq '{ "f": { "name1": "value1", "name2": "value2", "name3": "value3" } }' + +execute 'name1= name2=value2 name3=value3 ' +assert_output_json_eq '{ "f": { "name1": "", "name2": "value2", "name3": "value3" } }' + +execute 'origin=core.action processed=67 failed=0 suspended=0 suspended.duration=0 resumed=0 ' +assert_output_json_eq '{ "f": { "origin": "core.action", "processed": "67", "failed": "0", "suspended": "0", "suspended.duration": "0", "resumed": "0" } }' + +# check for required non-matches +execute 'name' +assert_output_json_eq ' {"originalmsg": "name", "unparsed-data": "name" }' + +execute 'noname1 name2=value2 name3=value3 ' +assert_output_json_eq '{ "originalmsg": "noname1 name2=value2 name3=value3 ", "unparsed-data": "noname1 name2=value2 name3=value3 " }' + + +cleanup_tmp_files + diff --git a/tests/field_number-fmt_number.sh b/tests/field_number-fmt_number.sh new file mode 100755 index 0000000..9c453e6 --- /dev/null +++ b/tests/field_number-fmt_number.sh @@ -0,0 +1,17 @@ +# added 2017-10-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. 
$srcdir/exec.sh + +test_def $0 "number field in native format" +add_rule 'version=2' +add_rule 'rule=:here is a number %{ "type":"number", "name":"num", "format":"number"}% in dec form' +execute 'here is a number 1234 in dec form' +assert_output_json_eq '{"num": 1234}' + +#check cases where parsing failure must occur +execute 'here is a number 1234in dec form' +assert_output_json_eq '{ "originalmsg": "here is a number 1234in dec form", "unparsed-data": "in dec form" }' + + +cleanup_tmp_files diff --git a/tests/field_number.sh b/tests/field_number.sh new file mode 100755 index 0000000..f7de663 --- /dev/null +++ b/tests/field_number.sh @@ -0,0 +1,17 @@ +# added 2017-10-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "number field" +add_rule 'version=2' +add_rule 'rule=:here is a number %num:number% in dec form' +execute 'here is a number 1234 in dec form' +assert_output_json_eq '{"num": "1234"}' + +#check cases where parsing failure must occur +execute 'here is a number 1234in dec form' +assert_output_json_eq '{ "originalmsg": "here is a number 1234in dec form", "unparsed-data": "in dec form" }' + + +cleanup_tmp_files diff --git a/tests/field_number_maxval.sh b/tests/field_number_maxval.sh new file mode 100755 index 0000000..4b9dbc3 --- /dev/null +++ b/tests/field_number_maxval.sh @@ -0,0 +1,17 @@ +# added 2017-10-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "number field with maxval" +add_rule 'version=2' +add_rule 'rule=:here is a number %{ "type":"number", "name":"num", "maxval":1000}% in dec form' +execute 'here is a number 234 in dec form' +assert_output_json_eq '{"num": "234"}' + +#check cases where parsing failure must occur +execute 'here is a number 1234in dec form' +assert_output_json_eq '{ "originalmsg": "here is a number 1234in dec form", "unparsed-data": "1234in dec form" }' + + +cleanup_tmp_files diff --git a/tests/field_recursive.sh b/tests/field_recursive.sh new file mode 100755 index 0000000..9897325 --- /dev/null +++ b/tests/field_recursive.sh @@ -0,0 +1,69 @@ +# added 2014-11-26 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "recursive parsing field" + +#tail recursion with default tail field +add_rule 'rule=:%word:word% %next:recursive%' +add_rule 'rule=:%word:word%' +execute '123 abc 456 def' +assert_output_json_eq '{"word": "123", "next": {"word": "abc", "next": {"word": "456", "next" : {"word": "def"}}}}' + +#tail recursion with explicitly named 'tail' field +reset_rules +add_rule 'rule=:%word:word% %next:recursive:tail%' +add_rule 'rule=:%word:word%' +execute '123 abc 456 def' +assert_output_json_eq '{"word": "123", "next": {"word": "abc", "next": {"word": "456", "next" : {"word": "def"}}}}' + +#tail recursion with tail field having arbirary name +reset_rules +add_rule 'rule=:%word:word% %next:recursive:foo%' +add_rule 'rule=:%word:word%' +execute '123 abc 456 def' +assert_output_json_eq '{"word": "123", "next": {"word": "abc", "next": {"word": "456", "next" : {"word": "def"}}}}' + +#non tail recursion with default tail field +reset_rules +add_rule 'rule=:blocked on %device:word% %net:recursive%at %tm:date-rfc5424%' +add_rule 'rule=:%ip_addr:ipv4% %tail:rest%' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %tail:rest%' +execute 'blocked on gw-1 10.20.30.40 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}, "tm": "2014-12-08T08:53:33.05+05:30"}' +execute 'blocked on gw-1 10.20.30.40/16 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}, "tm": "2014-12-08T08:53:33.05+05:30"}' + +#non tail recursion with tail field being explicitly named 'tail' +reset_rules +add_rule 'rule=:blocked on %device:word% %net:recursive:tail%at %tm:date-rfc5424%' +add_rule 'rule=:%ip_addr:ipv4% %tail:rest%' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %tail:rest%' +execute 'blocked on gw-1 10.20.30.40 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}, "tm": "2014-12-08T08:53:33.05+05:30"}' +execute 'blocked on gw-1 10.20.30.40/16 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}, "tm": "2014-12-08T08:53:33.05+05:30"}' + +#non tail recursion with tail field having arbirary name +reset_rules +add_rule 'rule=:blocked on %device:word% %net:recursive:remaining%at %tm:date-rfc5424%' +add_rule 'rule=:%ip_addr:ipv4% %remaining:rest%' +add_rule 'rule=:%subnet_addr:ipv4%/%mask:number% %remaining:rest%' +execute 'blocked on gw-1 10.20.30.40 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"ip_addr": "10.20.30.40"}, "tm": "2014-12-08T08:53:33.05+05:30"}' +execute 'blocked on gw-1 10.20.30.40/16 at 2014-12-08T08:53:33.05+05:30' +assert_output_json_eq '{"device": "gw-1", "net": {"subnet_addr": "10.20.30.40", "mask": "16"}, "tm": "2014-12-08T08:53:33.05+05:30"}' + + + +cleanup_tmp_files + diff --git a/tests/field_regex_default_group_parse_and_return.sh b/tests/field_regex_default_group_parse_and_return.sh new file mode 100755 index 0000000..bcb5318 --- /dev/null +++ b/tests/field_regex_default_group_parse_and_return.sh @@ -0,0 +1,24 @@ +# added 2014-11-14 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. 
However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +export ln_opts='-oallowRegex' +. $srcdir/exec.sh + +test_def $0 "type ERE for regex field" +add_rule 'rule=:%first:regex:[a-z]+% %second:regex:\d+\x25\x3a[a-f0-9]+\x25%' +execute 'foo 122%:7a%' +assert_output_contains '"first": "foo"' +assert_output_contains '"second": "122%:7a%"' + + + +cleanup_tmp_files + diff --git a/tests/field_regex_invalid_args.sh b/tests/field_regex_invalid_args.sh new file mode 100755 index 0000000..0a85b93 --- /dev/null +++ b/tests/field_regex_invalid_args.sh @@ -0,0 +1,45 @@ +# added 2014-11-14 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +export ln_opts='-oallowRegex' +. $srcdir/exec.sh + +test_def $0 "invalid type for regex field with everything else defaulted" + +add_rule 'rule=:%first:regex:[a-z]+:Q%' +execute 'foo' +assert_output_contains '"originalmsg": "foo"' +assert_output_contains '"unparsed-data": "foo"' + +reset_rules +add_rule 'rule=:%first:regex:[a-z]+:%' +execute 'foo' +assert_output_contains '"originalmsg": "foo"' +assert_output_contains '"unparsed-data": "foo"' + +reset_rules +add_rule 'rule=:%first:regex:[a-z]+:0:%' +execute 'foo' +assert_output_contains '"originalmsg": "foo"' +assert_output_contains '"unparsed-data": "foo"' + +reset_rules +add_rule 'rule=:%first:regex:[a-z]+:0:0q%' +execute 'foo' +assert_output_contains '"originalmsg": "foo"' +assert_output_contains '"unparsed-data": "foo"' + +reset_rules +add_rule 'rule=:%first:regex:[a-z]+:0a:0%' +execute 'foo' +assert_output_contains '"originalmsg": "foo"' +assert_output_contains '"unparsed-data": "foo"' + +reset_rules +add_rule 'rule=:%first:regex:::::%%%' +execute 'foo' +assert_output_contains '"originalmsg": "foo"' +assert_output_contains '"unparsed-data": "foo"' + + +cleanup_tmp_files + diff --git a/tests/field_regex_while_regex_support_is_disabled.sh b/tests/field_regex_while_regex_support_is_disabled.sh new file mode 100755 index 0000000..4700177 --- /dev/null +++ b/tests/field_regex_while_regex_support_is_disabled.sh @@ -0,0 +1,13 @@ +# added 2014-11-14 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "field regex, while regex support is disabled" +add_rule 'rule=:%first:regex:[a-z]+%' +execute 'foo' +assert_output_contains '"originalmsg": "foo"' +assert_output_contains '"unparsed-data": "foo"' + + +cleanup_tmp_files + diff --git a/tests/field_regex_with_consume_group.sh b/tests/field_regex_with_consume_group.sh new file mode 100755 index 0000000..335618d --- /dev/null +++ b/tests/field_regex_with_consume_group.sh @@ -0,0 +1,29 @@ +# added 2014-11-14 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +export ln_opts='-oallowRegex' +. 
$srcdir/exec.sh + +test_def $0 "regex field with consume-group" +add_rule 'rule=:%first:regex:([a-z]{2}([a-f0-9]+,)+):0%%rest:rest%' +execute 'ad1234abcd,4567ef12,8901abef' +assert_output_contains '"first": "ad1234abcd,4567ef12,"' +assert_output_contains '"rest": "8901abef"' +reset_rules +add_rule 'rule=:%first:regex:(([a-z]{2})([a-f0-9]+,)+):2%%rest:rest%' +execute 'ad1234abcd,4567ef12,8901abef' +assert_output_contains '"first": "ad"' +assert_output_contains '"rest": "1234abcd,4567ef12,8901abef"' + + + +cleanup_tmp_files + diff --git a/tests/field_regex_with_consume_group_and_return_group.sh b/tests/field_regex_with_consume_group_and_return_group.sh new file mode 100755 index 0000000..6db3596 --- /dev/null +++ b/tests/field_regex_with_consume_group_and_return_group.sh @@ -0,0 +1,30 @@ +# added 2014-11-14 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +export ln_opts='-oallowRegex' +. $srcdir/exec.sh + +test_def $0 "regex field with consume-group and return-group" +set -x +add_rule 'rule=:%first:regex:[a-z]{2}(([a-f0-9]+),)+:0:2%%rest:rest%' +execute 'ad1234abcd,4567ef12,8901abef' +assert_output_contains '"first": "4567ef12"' +assert_output_contains '"rest": "8901abef"' +reset_rules +add_rule 'rule=:%first:regex:(([a-z]{2})(([a-f0-9]+),)+):2:4%%rest:rest%' +execute 'ad1234abcd,4567ef12,8901abef' +assert_output_contains '"first": "4567ef12"' +assert_output_contains '"rest": "1234abcd,4567ef12,8901abef"' + + + +cleanup_tmp_files + diff --git a/tests/field_regex_with_negation.sh b/tests/field_regex_with_negation.sh new file mode 100755 index 0000000..f498683 --- /dev/null +++ b/tests/field_regex_with_negation.sh @@ -0,0 +1,28 @@ +# added 2014-11-17 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +export ln_opts='-oallowRegex' +. $srcdir/exec.sh + +test_def $0 "regex field with negation" +add_rule 'rule=:%text:regex:[^,]+%,%more:rest%' +execute '123,abc' +assert_output_contains '"text": "123"' +assert_output_contains '"more": "abc"' +reset_rules +add_rule 'rule=:%text:regex:([^ ,|]+( |\||,)?)+%%more:rest%' +execute '123 abc|456 789,def|ghi,jkl| and some more text' +assert_output_contains '"text": "123 abc|456 789,def|ghi,jkl|"' +assert_output_contains '"more": " and some more text"' + + +cleanup_tmp_files + diff --git a/tests/field_rest.sh b/tests/field_rest.sh new file mode 100755 index 0000000..91d860e --- /dev/null +++ b/tests/field_rest.sh @@ -0,0 +1,43 @@ +# some more tests for the "rest" motif, especially to ensure that +# "rest" will not interfere with more specific rules. +# added 2015-04-27 +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "rest matches" + +#tail recursion with default tail field +add_rule 'version=2' +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number% (%label2:char-to:)%)' +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number% (%label2:char-to:)%)%tail:rest%' +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number%' +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number%%tail:rest%' + +# real-world cisco samples +execute 'Outside:10.20.30.40/35 (40.30.20.10/35)' +assert_output_json_eq '{ "label2": "40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 (40.30.20.10/35) with rest' +assert_output_json_eq '{ "tail": " with rest", "label2": "40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 (40.30.20.10/35 brace missing' +assert_output_json_eq '{ "tail": " (40.30.20.10\/35 brace missing", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 40.30.20.10/35' +assert_output_json_eq '{ "tail": " 40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +# +# test expected mismatches +# +execute 'not at all!' +assert_output_json_eq '{ "originalmsg": "not at all!", "unparsed-data": "not at all!" }' + +execute 'Outside 10.20.30.40/35 40.30.20.10/35' +assert_output_json_eq '{ "originalmsg": "Outside 10.20.30.40\/35 40.30.20.10\/35", "unparsed-data": "Outside 10.20.30.40\/35 40.30.20.10\/35" }' + +execute 'Outside:10.20.30.40/aa 40.30.20.10/35' +assert_output_json_eq '{ "originalmsg": "Outside:10.20.30.40\/aa 40.30.20.10\/35", "unparsed-data": "aa 40.30.20.10\/35" }' + + +cleanup_tmp_files + diff --git a/tests/field_rest_jsoncnf.sh b/tests/field_rest_jsoncnf.sh new file mode 100755 index 0000000..2e872b6 --- /dev/null +++ b/tests/field_rest_jsoncnf.sh @@ -0,0 +1,43 @@ +# some more tests for the "rest" motif, especially to ensure that +# "rest" will not interfere with more specific rules. +# added 2015-04-27 +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "rest matches" + +#tail recursion with default tail field +add_rule 'version=2' +add_rule 'rule=:%{"name":"iface", "type":"char-to", "extradata":":"}%:%{"name":"ip", "type":"ipv4"}%/%{"name":"port", "type":"number"}% (%{"name":"label2", "type":"char-to", "extradata":")"}%)' +add_rule 'rule=:%{"name":"iface", "type":"char-to", "extradata":":"}%:%{"name":"ip", "type":"ipv4"}%/%{"name":"port", "type":"number"}% (%{"name":"label2", "type":"char-to", "extradata":")"}%)%{"name":"tail", "type":"rest"}%' +add_rule 'rule=:%{"name":"iface", "type":"char-to", "extradata":":"}%:%{"name":"ip", "type":"ipv4"}%/%{"name":"port", "type":"number"}%' +add_rule 'rule=:%{"name":"iface", "type":"char-to", "extradata":":"}%:%{"name":"ip", "type":"ipv4"}%/%{"name":"port", "type":"number"}%%{"name":"tail", "type":"rest"}%' + +# real-world cisco samples +execute 'Outside:10.20.30.40/35 (40.30.20.10/35)' +assert_output_json_eq '{ "label2": "40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 (40.30.20.10/35) with rest' +assert_output_json_eq '{ "tail": " with rest", "label2": "40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 (40.30.20.10/35 brace missing' +assert_output_json_eq '{ "tail": " (40.30.20.10\/35 brace missing", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 40.30.20.10/35' +assert_output_json_eq '{ "tail": " 40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +# +# test expected mismatches +# +execute 'not at all!' +assert_output_json_eq '{ "originalmsg": "not at all!", "unparsed-data": "not at all!" }' + +execute 'Outside 10.20.30.40/35 40.30.20.10/35' +assert_output_json_eq '{ "originalmsg": "Outside 10.20.30.40\/35 40.30.20.10\/35", "unparsed-data": "Outside 10.20.30.40\/35 40.30.20.10\/35" }' + +execute 'Outside:10.20.30.40/aa 40.30.20.10/35' +assert_output_json_eq '{ "originalmsg": "Outside:10.20.30.40\/aa 40.30.20.10\/35", "unparsed-data": "aa 40.30.20.10\/35" }' + + +cleanup_tmp_files + diff --git a/tests/field_rest_v1.sh b/tests/field_rest_v1.sh new file mode 100755 index 0000000..36b9e9c --- /dev/null +++ b/tests/field_rest_v1.sh @@ -0,0 +1,51 @@ +# some more tests for the "rest" motif, especially to ensure that +# "rest" will not interfere with more specific rules. +# added 2015-04-27 +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "rest matches" + +#tail recursion with default tail field +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number% (%label2:char-to:)%)' +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number% (%label2:char-to:)%)%tail:rest%' +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number%' +add_rule 'rule=:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number%%tail:rest%' + +# real-world cisco samples +execute 'Outside:10.20.30.40/35 (40.30.20.10/35)' +assert_output_json_eq '{ "label2": "40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 (40.30.20.10/35) with rest' +assert_output_json_eq '{ "tail": " with rest", "label2": "40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 (40.30.20.10/35 brace missing' +assert_output_json_eq '{ "tail": " (40.30.20.10\/35 brace missing", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +execute 'Outside:10.20.30.40/35 40.30.20.10/35' +assert_output_json_eq '{ "tail": " 40.30.20.10\/35", "port": "35", "ip": "10.20.30.40", "iface": "Outside" }' + +# +# test expected mismatches +# +execute 'not at all!' +assert_output_json_eq '{ "originalmsg": "not at all!", "unparsed-data": "not at all!" }' + +execute 'Outside 10.20.30.40/35 40.30.20.10/35' +assert_output_json_eq '{ "originalmsg": "Outside 10.20.30.40\/35 40.30.20.10\/35", "unparsed-data": "Outside 10.20.30.40\/35 40.30.20.10\/35" }' + +execute 'Outside:10.20.30.40/aa 40.30.20.10/35' +assert_output_json_eq '{ "originalmsg": "Outside:10.20.30.40\/aa 40.30.20.10\/35", "unparsed-data": "aa 40.30.20.10\/35" }' + + +cleanup_tmp_files + diff --git a/tests/field_rfc5424timestamp-fmt_timestamp-unix-ms.sh b/tests/field_rfc5424timestamp-fmt_timestamp-unix-ms.sh new file mode 100755 index 0000000..f3768b0 --- /dev/null +++ b/tests/field_rfc5424timestamp-fmt_timestamp-unix-ms.sh @@ -0,0 +1,33 @@ +# added 2017-10-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. 
$srcdir/exec.sh + +test_def $0 "RFC5424 timestamp in timestamp-unix format" +add_rule 'version=2' +add_rule 'rule=:here is a timestamp %{ "type":"date-rfc5424", "name":"num", "format":"timestamp-unix-ms"}% in RFC5424 format' +execute 'here is a timestamp 2000-03-11T14:15:16+01:00 in RFC5424 format' +assert_output_json_eq '{ "num": 952780516000}' + +# with milliseconds (too-low precision) +execute 'here is a timestamp 2000-03-11T14:15:16.1+01:00 in RFC5424 format' +assert_output_json_eq '{ "num": 952780516100 }' +execute 'here is a timestamp 2000-03-11T14:15:16.12+01:00 in RFC5424 format' +assert_output_json_eq '{ "num": 952780516120 }' + +# with milliseconds (exactly right precision) +execute 'here is a timestamp 2000-03-11T14:15:16.123+01:00 in RFC5424 format' +assert_output_json_eq '{ "num": 952780516123 }' + +# with overdone precision +execute 'here is a timestamp 2000-03-11T14:15:16.1234+01:00 in RFC5424 format' +assert_output_json_eq '{ "num": 952780516123 }' +execute 'here is a timestamp 2000-03-11T14:15:16.123456789+01:00 in RFC5424 format' +assert_output_json_eq '{ "num": 952780516123 }' + +#check cases where parsing failure must occur +execute 'here is a timestamp 2000-03-11T14:15:16+01:00in RFC5424 format' +assert_output_json_eq '{ "originalmsg": "here is a timestamp 2000-03-11T14:15:16+01:00in RFC5424 format", "unparsed-data": "2000-03-11T14:15:16+01:00in RFC5424 format" }' + + +cleanup_tmp_files diff --git a/tests/field_rfc5424timestamp-fmt_timestamp-unix.sh b/tests/field_rfc5424timestamp-fmt_timestamp-unix.sh new file mode 100755 index 0000000..f6e86d1 --- /dev/null +++ b/tests/field_rfc5424timestamp-fmt_timestamp-unix.sh @@ -0,0 +1,21 @@ +# added 2017-10-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "RFC5424 timestamp in timestamp-unix format" +add_rule 'version=2' +add_rule 'rule=:here is a timestamp %{ "type":"date-rfc5424", "name":"num", "format":"timestamp-unix"}% in RFC5424 format' +execute 'here is a timestamp 2000-03-11T14:15:16+01:00 in RFC5424 format' +assert_output_json_eq '{"num": 952780516}' + +# with milliseconds (must be ignored with this format!) +execute 'here is a timestamp 2000-03-11T14:15:16.321+01:00 in RFC5424 format' +assert_output_json_eq '{"num": 952780516}' + +#check cases where parsing failure must occur +execute 'here is a timestamp 2000-03-11T14:15:16+01:00in RFC5424 format' +assert_output_json_eq '{ "originalmsg": "here is a timestamp 2000-03-11T14:15:16+01:00in RFC5424 format", "unparsed-data": "2000-03-11T14:15:16+01:00in RFC5424 format" }' + + +cleanup_tmp_files diff --git a/tests/field_string.sh b/tests/field_string.sh new file mode 100755 index 0000000..bc05fb6 --- /dev/null +++ b/tests/field_string.sh @@ -0,0 +1,62 @@ +# added 2015-09-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "string syntax" + +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string% b' + +execute 'a test b' +assert_output_json_eq '{"f": "test"}' + +execute 'a "test" b' +assert_output_json_eq '{"f": "test"}' + +execute 'a "test with space" b' +assert_output_json_eq '{"f": "test with space"}' + +execute 'a "test with "" double escape" b' +assert_output_json_eq '{ "f": "test with \" double escape" }' + +execute 'a "test with \" backslash escape" b' +assert_output_json_eq '{ "f": "test with \" backslash escape" }' + +echo test quoting.mode +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string{"quoting.mode":"none"}% b' + +execute 'a test b' +assert_output_json_eq '{"f": "test"}' + +execute 'a "test" b' +assert_output_json_eq '{"f": "\"test\""}' + +echo "test quoting.char.*" +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string{"quoting.char.begin":"[", "quoting.char.end":"]"}% b' + +execute 'a test b' +assert_output_json_eq '{"f": "test"}' + +execute 'a [test] b' +assert_output_json_eq '{"f": "test"}' + +execute 'a [test test2] b' +assert_output_json_eq '{"f": "test test2"}' + +# things that need to NOT match + +cleanup_tmp_files diff --git a/tests/field_string_perm_chars.sh b/tests/field_string_perm_chars.sh new file mode 100755 index 0000000..7480d63 --- /dev/null +++ b/tests/field_string_perm_chars.sh @@ -0,0 +1,84 @@ +# added 2015-09-02 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "string type with permitted chars" + +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string{"matching.permitted":"abc"}% b' + +execute 'a abc b' +assert_output_json_eq '{"f": "abc"}' + +execute 'a abcd b' +assert_output_json_eq '{"originalmsg": "a abcd b", "unparsed-data": "abcd b" }' + +execute 'a abbbbbcccbaaaa b' +assert_output_json_eq '{"f": "abbbbbcccbaaaa"}' + +execute 'a "abc" b' +assert_output_json_eq '{"f": "abc"}' + +echo "param array" +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string{"matching.permitted":[ + {"chars":"ab"}, + {"chars":"c"} + ]}% b' + +execute 'a abc b' +assert_output_json_eq '{"f": "abc"}' + +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string{"matching.permitted":[ + {"class":"digit"}, + {"chars":"x"} + ]}% b' + +execute 'a 12x3 b' +assert_output_json_eq '{"f": "12x3"}' + + +echo alpha +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string{"matching.permitted":[ + {"class":"alpha"} + ]}% b' + +execute 'a abcdefghijklmnopqrstuvwxyZ b' +assert_output_json_eq '{"f": "abcdefghijklmnopqrstuvwxyZ"}' + +execute 'a abcd1 b' +assert_output_json_eq '{"originalmsg": "a abcd1 b", "unparsed-data": "abcd1 b" }' + + +echo alnum +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %f:string{"matching.permitted":[ + {"class":"alnum"} + ]}% b' + +execute 'a abcdefghijklmnopqrstuvwxyZ b' +assert_output_json_eq '{"f": "abcdefghijklmnopqrstuvwxyZ"}' + +execute 'a abcd1 b' +assert_output_json_eq '{"f": "abcd1" }' + +execute 'a abcd1_ b' +assert_output_json_eq '{ "originalmsg": "a abcd1_ b", "unparsed-data": "abcd1_ b" } ' + +cleanup_tmp_files diff --git a/tests/field_suffixed.sh b/tests/field_suffixed.sh new file mode 100755 index 0000000..e031bab --- /dev/null +++ b/tests/field_suffixed.sh @@ -0,0 +1,69 @@ +# added 2015-02-25 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "field with one of many possible suffixes" + +add_rule 'rule=:gc reclaimed %eden_free:suffixed:,:b,kb,mb,gb:number% eden [surviver: %surviver_used:suffixed:;:kb;mb;gb;b:number%/%surviver_size:suffixed:|:b|kb|mb|gb:float%]' +execute 'gc reclaimed 559mb eden [surviver: 95b/30.2mb]' +assert_output_json_eq '{"eden_free": {"value": "559", "suffix":"mb"}, "surviver_used": {"value": "95", "suffix": "b"}, "surviver_size": {"value": "30.2", "suffix": "mb"}}' + +reset_rules + +add_rule 'rule=:gc reclaimed %eden_free:named_suffixed:size:unit:,:b,kb,mb,gb:number% eden [surviver: %surviver_used:named_suffixed:sz:u:;:kb;mb;gb;b:number%/%surviver_size:suffixed:|:b|kb|mb|gb:float%]' +execute 'gc reclaimed 559mb eden [surviver: 95b/30.2mb]' +assert_output_json_eq '{"eden_free": {"size": "559", "unit":"mb"}, "surviver_used": {"sz": "95", "u": "b"}, "surviver_size": {"value": "30.2", "suffix": "mb"}}' + +reset_rules + +add_rule 'rule=:gc reclaimed %eden_free:named_suffixed:size:unit:,:b,kb,mb,gb:interpret:int:number% from eden' +execute 'gc reclaimed 559mb from eden' +assert_output_json_eq '{"eden_free": {"size": 559, "unit":"mb"}}' + +reset_rules + +add_rule 'rule=:disk free: %free:named_suffixed:size:unit:,:\x25,gb:interpret:int:number%' +execute 'disk free: 12%' +assert_output_json_eq '{"free": {"size": 12, "unit":"%"}}' +execute 'disk free: 130gb' +assert_output_json_eq '{"free": {"size": 130, "unit":"gb"}}' + +reset_rules + +add_rule 'rule=:disk free: %free:named_suffixed:size:unit:\x3a:gb\x3a\x25:interpret:int:number%' +execute 'disk free: 12%' +assert_output_json_eq '{"free": {"size": 12, "unit":"%"}}' +execute 'disk free: 130gb' +assert_output_json_eq '{"free": {"size": 130, "unit":"gb"}}' + +reset_rules + +add_rule 'rule=:eden,surviver,old-gen available-capacity: %available_memory:tokenized:,:named_suffixed:size:unit:,:mb,gb:interpret:int:number%' +execute 'eden,surviver,old-gen available-capacity: 400mb,40mb,1gb' +assert_output_json_eq '{"available_memory": [{"size": 400, "unit":"mb"}, {"size": 40, "unit":"mb"}, {"size": 1, "unit":"gb"}]}' + +reset_rules + +add_rule 'rule=:eden,surviver,old-gen available-capacity: %available_memory:named_suffixed:size:unit:,:mb,gb:tokenized:,:interpret:int:number%' +execute 'eden,surviver,old-gen available-capacity: 400,40,1024mb' +assert_output_json_eq '{"available_memory": {"size": [400, 40, 1024], "unit":"mb"}}' + +reset_rules + +add_rule 'rule=:eden:surviver:old-gen available-capacity: %available_memory:named_suffixed:size:unit:,:mb,gb:tokenized:\x3a:interpret:int:number%' +execute 'eden:surviver:old-gen available-capacity: 400:40:1024mb' +assert_output_json_eq '{"available_memory": {"size": [400, 40, 1024], "unit":"mb"}}' + + + +cleanup_tmp_files + diff --git a/tests/field_suffixed_with_invalid_ruledef.sh b/tests/field_suffixed_with_invalid_ruledef.sh new file mode 100755 index 0000000..c95b53d --- /dev/null +++ b/tests/field_suffixed_with_invalid_ruledef.sh @@ -0,0 +1,67 @@ +# added 2015-02-26 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "field with one of many possible suffixes, but invalid ruledef" + +add_rule 'rule=:reclaimed %eden_free:suffixe:,:b,kb,mb,gb:number% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:kb,mb% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:kb,mb% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:,:% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:,:kb,mb% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:,:kb,mb:% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:,:kb,mb:floa% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + +reset_rules + +add_rule 'rule=:reclaimed %eden_free:suffixed:,:kb,m:b:floa% eden' +execute 'reclaimed 559mb eden' +assert_output_json_eq '{ "originalmsg": "reclaimed 559mb eden", "unparsed-data": "559mb eden" }' + + +cleanup_tmp_files + diff --git a/tests/field_tokenized.sh b/tests/field_tokenized.sh new file mode 100755 index 0000000..1b77ce5 --- /dev/null +++ b/tests/field_tokenized.sh @@ -0,0 +1,39 @@ +# added 2014-11-17 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. 
$srcdir/exec.sh + +test_def $0 "tokenized field" + +add_rule 'rule=:%arr:tokenized: , :word% %more:rest%' +execute '123 , abc , 456 , def ijk789' +assert_output_contains '"arr": [ "123", "abc", "456", "def" ]' +assert_output_contains '"more": "ijk789"' + +reset_rules +add_rule 'rule=:%ips:tokenized:, :ipv4% %text:rest%' +execute '10.20.30.40, 50.60.70.80, 90.100.110.120 are blocked' +assert_output_contains '"text": "are blocked"' +assert_output_contains '"ips": [ "10.20.30.40", "50.60.70.80", "90.100.110.120" ]' + +reset_rules +add_rule 'rule=:comma separated list of colon separated list of # separated numbers: %some_nos:tokenized:, :tokenized: \x3a :tokenized:#:number%' +execute 'comma separated list of colon separated list of # separated numbers: 10, 20 : 30#40#50 : 60#70#80, 90 : 100' +assert_output_contains '"some_nos": [ [ [ "10" ] ], [ [ "20" ], [ "30", "40", "50" ], [ "60", "70", "80" ] ], [ [ "90" ], [ "100" ] ] ]' + +reset_rules +add_rule 'rule=:%arr:tokenized:\x3a:number% %more:rest%' +execute '123:456:789 ijk789' +assert_output_json_eq '{"arr": [ "123", "456", "789" ], "more": "ijk789"}' + + +cleanup_tmp_files + diff --git a/tests/field_tokenized_recursive.sh b/tests/field_tokenized_recursive.sh new file mode 100755 index 0000000..ca9afb8 --- /dev/null +++ b/tests/field_tokenized_recursive.sh @@ -0,0 +1,70 @@ +# added 2014-12-08 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. $srcdir/exec.sh + +test_def $0 "tokenized field with recursive field matching tokens" + +#recursive field inside tokenized field with default tail field +add_rule 'rule=:%subnet_addr:ipv4%/%subnet_mask:number%%tail:rest%' +add_rule 'rule=:%ip_addr:ipv4%%tail:rest%' +add_rule 'rule=:blocked inbound via: %via_ip:ipv4% from: %addresses:tokenized:, :recursive% to %server_ip:ipv4%' +execute 'blocked inbound via: 192.168.1.1 from: 1.2.3.4, 5.6.16.0/12, 8.9.10.11, 12.13.14.15, 16.17.18.0/8, 19.20.21.24/3 to 192.168.1.5' +assert_output_json_eq '{ +"addresses": [ + {"ip_addr": "1.2.3.4"}, + {"subnet_addr": "5.6.16.0", "subnet_mask": "12"}, + {"ip_addr": "8.9.10.11"}, + {"ip_addr": "12.13.14.15"}, + {"subnet_addr": "16.17.18.0", "subnet_mask": "8"}, + {"subnet_addr": "19.20.21.24", "subnet_mask": "3"}], +"server_ip": "192.168.1.5", +"via_ip": "192.168.1.1"}' +reset_rules + +#recursive field inside tokenized field with default tail field +reset_rules +add_rule 'rule=:%subnet_addr:ipv4%/%subnet_mask:number%%remains:rest%' +add_rule 'rule=:%ip_addr:ipv4%%remains:rest%' +add_rule 'rule=:blocked inbound via: %via_ip:ipv4% from: %addresses:tokenized:, :recursive:remains% to %server_ip:ipv4%' +execute 'blocked inbound via: 192.168.1.1 from: 1.2.3.4, 5.6.16.0/12, 8.9.10.11, 12.13.14.15, 16.17.18.0/8, 19.20.21.24/3 to 192.168.1.5' +assert_output_json_eq '{ +"addresses": [ + {"ip_addr": "1.2.3.4"}, + {"subnet_addr": "5.6.16.0", "subnet_mask": "12"}, + {"ip_addr": "8.9.10.11"}, + {"ip_addr": "12.13.14.15"}, + {"subnet_addr": "16.17.18.0", "subnet_mask": "8"}, + {"subnet_addr": "19.20.21.24", "subnet_mask": "3"}], +"server_ip": "192.168.1.5", +"via_ip": "192.168.1.1"}' + +#recursive field inside tokenized field with default tail field +reset_rules 'net' +add_rule 
'rule=:%subnet_addr:ipv4%/%subnet_mask:number%%remains:rest%' 'net' +add_rule 'rule=:%ip_addr:ipv4%%remains:rest%' 'net' +reset_rules +add_rule 'rule=:blocked inbound via: %via_ip:ipv4% from: %addresses:tokenized:, :descent:./net.rulebase:remains% to %server_ip:ipv4%' +execute 'blocked inbound via: 192.168.1.1 from: 1.2.3.4, 5.6.16.0/12, 8.9.10.11, 12.13.14.15, 16.17.18.0/8, 19.20.21.24/3 to 192.168.1.5' +assert_output_json_eq '{ +"addresses": [ + {"ip_addr": "1.2.3.4"}, + {"subnet_addr": "5.6.16.0", "subnet_mask": "12"}, + {"ip_addr": "8.9.10.11"}, + {"ip_addr": "12.13.14.15"}, + {"subnet_addr": "16.17.18.0", "subnet_mask": "8"}, + {"subnet_addr": "19.20.21.24", "subnet_mask": "3"}], +"server_ip": "192.168.1.5", +"via_ip": "192.168.1.1"}' + + +cleanup_tmp_files + diff --git a/tests/field_tokenized_with_invalid_ruledef.sh b/tests/field_tokenized_with_invalid_ruledef.sh new file mode 100755 index 0000000..58d10ce --- /dev/null +++ b/tests/field_tokenized_with_invalid_ruledef.sh @@ -0,0 +1,44 @@ +# added 2014-11-18 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "tokenized field with invalid rule definition" + +add_rule 'rule=:%arr:tokenized%' +execute '123 abc 456 def' +assert_output_contains '"unparsed-data": "123 abc 456 def"' +assert_output_contains '"originalmsg": "123 abc 456 def"' + +reset_rules +add_rule 'rule=:%arr:tokenized: %' +execute '123 abc 456 def' +assert_output_contains '"unparsed-data": "123 abc 456 def"' +assert_output_contains '"originalmsg": "123 abc 456 def"' + +reset_rules +add_rule 'rule=:%arr:tokenized:quux:%' +execute '123 abc 456 def' +assert_output_contains '"unparsed-data": "123 abc 456 def"' +assert_output_contains '"originalmsg": "123 abc 456 def"' + +reset_rules +add_rule 'rule=:%arr:tokenized:quux:some_non_existant_type%' +execute '123 abc 456 def' +assert_output_contains '"unparsed-data": "123 abc 456 def"' +assert_output_contains '"originalmsg": "123 abc 456 def"' + +reset_rules +add_rule 'rule=:%arr:tokenized:quux:some_non_existant_type:%' +execute '123 abc 456 def' +assert_output_contains '"unparsed-data": "123 abc 456 def"' +assert_output_contains '"originalmsg": "123 abc 456 def"' + +reset_rules +add_rule 'rule=:%arr:tokenized::::%%%%' +execute '123 abc 456 def' +assert_output_contains '"unparsed-data": "123 abc 456 def"' +assert_output_contains '"originalmsg": "123 abc 456 def"' + + +cleanup_tmp_files + diff --git a/tests/field_tokenized_with_regex.sh b/tests/field_tokenized_with_regex.sh new file mode 100755 index 0000000..4f1b54a --- /dev/null +++ b/tests/field_tokenized_with_regex.sh @@ -0,0 +1,34 @@ +# added 2014-11-17 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi + +#test that tokenized disabled regex if parent context has it disabled +. $srcdir/exec.sh + +test_def $0 "tokenized field with regex based field" +add_rule 'rule=:%parts:tokenized:,:regex:[^, ]+% %text:rest%' +execute '123,abc,456,def foo bar' +assert_output_contains '"unparsed-data": "123,abc,456,def foo bar"' +assert_output_contains '"originalmsg": "123,abc,456,def foo bar"' + +#and then enables it when parent context has it enabled +export ln_opts='-oallowRegex' +. 
$srcdir/exec.sh + +test_def $0 "tokenized field with regex based field" +add_rule 'rule=:%parts:tokenized:,:regex:[^, ]+% %text:rest%' +execute '123,abc,456,def foo bar' +assert_output_contains '"parts": [ "123", "abc", "456", "def" ]' +assert_output_contains '"text": "foo bar"' + + +cleanup_tmp_files + diff --git a/tests/field_v2-iptables.sh b/tests/field_v2-iptables.sh new file mode 100755 index 0000000..1f02f62 --- /dev/null +++ b/tests/field_v2-iptables.sh @@ -0,0 +1,65 @@ +# added 2015-04-30 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "v2-iptables field" +add_rule 'version=2' +add_rule 'rule=:iptables output denied: %field:v2-iptables%' + +# first, a real-world case +execute 'iptables output denied: IN= OUT=eth0 SRC=176.9.56.141 DST=168.192.14.3 LEN=32 TOS=0x00 PREC=0x00 TTL=64 ID=39110 DF PROTO=UDP SPT=49564 DPT=2010 LEN=12' +assert_output_json_eq '{ "field": { "IN": "", "OUT": "eth0", "SRC": "176.9.56.141", "DST": "168.192.14.3", "LEN": "12", "TOS": "0x00", "PREC": "0x00", "TTL": "64", "ID": "39110", "DF": null, "PROTO": "UDP", "SPT": "49564", "DPT": "2010" } }' + +# now some more "fabricated" cases for better readable test +reset_rules +add_rule 'rule=:iptables: %field:v2-iptables%' + +execute 'iptables: IN=value SECOND=test' +assert_output_json_eq '{ "field": { "IN": "value", "SECOND": "test" }} }' + +execute 'iptables: IN= SECOND=test' +assert_output_json_eq '{ "field": { "IN": ""} }' + +execute 'iptables: IN SECOND=test' +assert_output_json_eq '{ "field": { "IN": null} }' + +execute 'iptables: IN=invalue OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": "invalue", "OUT": "outvalue" } }' + +execute 'iptables: IN= OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": "", "OUT": "outvalue" } }' + +execute 'iptables: IN OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": null, "OUT": "outvalue" } }' + +# +#check cases where parsing failure must occur +# +echo verify failure cases + +# lower case is not permitted +execute 'iptables: in=value' +assert_output_json_eq '{ "originalmsg": "iptables: in=value", "unparsed-data": "in=value" }' + +execute 'iptables: in=' +assert_output_json_eq '{ "originalmsg": "iptables: in=", "unparsed-data": "in=" }' + +execute 'iptables: in' +assert_output_json_eq '{ "originalmsg": "iptables: in", "unparsed-data": "in" }' + +execute 'iptables: IN' # single field is NOT permitted! +assert_output_json_eq '{ "originalmsg": "iptables: IN", "unparsed-data": "IN" }' + +# multiple spaces between n=v pairs are not permitted +execute 'iptables: IN=invalue OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN=invalue OUT=outvalue", "unparsed-data": "IN=invalue OUT=outvalue" }' + +execute 'iptables: IN= OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN= OUT=outvalue", "unparsed-data": "IN= OUT=outvalue" }' + +execute 'iptables: IN OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN OUT=outvalue", "unparsed-data": "IN OUT=outvalue" }' + + +cleanup_tmp_files + diff --git a/tests/field_v2-iptables_jsoncnf.sh b/tests/field_v2-iptables_jsoncnf.sh new file mode 100755 index 0000000..1f1a36c --- /dev/null +++ b/tests/field_v2-iptables_jsoncnf.sh @@ -0,0 +1,66 @@ +# added 2015-04-30 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "v2-iptables field" +add_rule 'version=2' +add_rule 'rule=:iptables output denied: %{"name":"field", "type":"v2-iptables"}%' + +# first, a real-world case +execute 'iptables output denied: IN= OUT=eth0 SRC=176.9.56.141 DST=168.192.14.3 LEN=32 TOS=0x00 PREC=0x00 TTL=64 ID=39110 DF PROTO=UDP SPT=49564 DPT=2010 LEN=12' +assert_output_json_eq '{ "field": { "IN": "", "OUT": "eth0", "SRC": "176.9.56.141", "DST": "168.192.14.3", "LEN": "12", "TOS": "0x00", "PREC": "0x00", "TTL": "64", "ID": "39110", "DF": null, "PROTO": "UDP", "SPT": "49564", "DPT": "2010" } }' + +# now some more "fabricated" cases for better readable test +reset_rules +add_rule 'version=2' +add_rule 'rule=:iptables: %field:v2-iptables%' + +execute 'iptables: IN=value SECOND=test' +assert_output_json_eq '{ "field": { "IN": "value", "SECOND": "test" }} }' + +execute 'iptables: IN= SECOND=test' +assert_output_json_eq '{ "field": { "IN": ""} }' + +execute 'iptables: IN SECOND=test' +assert_output_json_eq '{ "field": { "IN": null} }' + +execute 'iptables: IN=invalue OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": "invalue", "OUT": "outvalue" } }' + +execute 'iptables: IN= OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": "", "OUT": "outvalue" } }' + +execute 'iptables: IN OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": null, "OUT": "outvalue" } }' + +# +#check cases where parsing failure must occur +# +echo verify failure cases + +# lower case is not permitted +execute 'iptables: in=value' +assert_output_json_eq '{ "originalmsg": "iptables: in=value", "unparsed-data": "in=value" }' + +execute 'iptables: in=' +assert_output_json_eq '{ "originalmsg": "iptables: in=", "unparsed-data": "in=" }' + +execute 'iptables: in' +assert_output_json_eq '{ "originalmsg": "iptables: in", "unparsed-data": "in" }' + +execute 'iptables: IN' # single field is NOT permitted! +assert_output_json_eq '{ "originalmsg": "iptables: IN", "unparsed-data": "IN" }' + +# multiple spaces between n=v pairs are not permitted +execute 'iptables: IN=invalue OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN=invalue OUT=outvalue", "unparsed-data": "IN=invalue OUT=outvalue" }' + +execute 'iptables: IN= OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN= OUT=outvalue", "unparsed-data": "IN= OUT=outvalue" }' + +execute 'iptables: IN OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN OUT=outvalue", "unparsed-data": "IN OUT=outvalue" }' + + +cleanup_tmp_files + diff --git a/tests/field_v2-iptables_v1.sh b/tests/field_v2-iptables_v1.sh new file mode 100755 index 0000000..b566718 --- /dev/null +++ b/tests/field_v2-iptables_v1.sh @@ -0,0 +1,64 @@ +# added 2015-04-30 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +test_def $0 "v2-iptables field" +add_rule 'rule=:iptables output denied: %field:v2-iptables%' + +# first, a real-world case +execute 'iptables output denied: IN= OUT=eth0 SRC=176.9.56.141 DST=168.192.14.3 LEN=32 TOS=0x00 PREC=0x00 TTL=64 ID=39110 DF PROTO=UDP SPT=49564 DPT=2010 LEN=12' +assert_output_json_eq '{ "field": { "IN": "", "OUT": "eth0", "SRC": "176.9.56.141", "DST": "168.192.14.3", "LEN": "12", "TOS": "0x00", "PREC": "0x00", "TTL": "64", "ID": "39110", "DF": null, "PROTO": "UDP", "SPT": "49564", "DPT": "2010" } }' + +# now some more "fabricated" cases for better readable test +reset_rules +add_rule 'rule=:iptables: %field:v2-iptables%' + +execute 'iptables: IN=value SECOND=test' +assert_output_json_eq '{ "field": { "IN": "value", "SECOND": "test" }} }' + +execute 'iptables: IN= SECOND=test' +assert_output_json_eq '{ "field": { "IN": ""} }' + +execute 'iptables: IN SECOND=test' +assert_output_json_eq '{ "field": { "IN": null} }' + +execute 'iptables: IN=invalue OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": "invalue", "OUT": "outvalue" } }' + +execute 'iptables: IN= OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": "", "OUT": "outvalue" } }' + +execute 'iptables: IN OUT=outvalue' +assert_output_json_eq '{ "field": { "IN": null, "OUT": "outvalue" } }' + +# +#check cases where parsing failure must occur +# +echo verify failure cases + +# lower case is not permitted +execute 'iptables: in=value' +assert_output_json_eq '{ "originalmsg": "iptables: in=value", "unparsed-data": "in=value" }' + +execute 'iptables: in=' +assert_output_json_eq '{ "originalmsg": "iptables: in=", "unparsed-data": "in=" }' + +execute 'iptables: in' +assert_output_json_eq '{ "originalmsg": "iptables: in", "unparsed-data": "in" }' + +execute 'iptables: IN' # single field is NOT permitted! +assert_output_json_eq '{ "originalmsg": "iptables: IN", "unparsed-data": "IN" }' + +# multiple spaces between n=v pairs are not permitted +execute 'iptables: IN=invalue OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN=invalue OUT=outvalue", "unparsed-data": "IN=invalue OUT=outvalue" }' + +execute 'iptables: IN= OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN= OUT=outvalue", "unparsed-data": "IN= OUT=outvalue" }' + +execute 'iptables: IN OUT=outvalue' +assert_output_json_eq '{ "originalmsg": "iptables: IN OUT=outvalue", "unparsed-data": "IN OUT=outvalue" }' + + +cleanup_tmp_files + diff --git a/tests/field_whitespace.sh b/tests/field_whitespace.sh new file mode 100755 index 0000000..003c861 --- /dev/null +++ b/tests/field_whitespace.sh @@ -0,0 +1,37 @@ +# added 2015-03-12 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. $srcdir/exec.sh + +test_def $0 "whitespace parser" +# the "word" parser unfortunatly treats everything except +# a SP as being in the word. So a HT inside a word is +# permitted, which does not work well with what we +# want to test here. to solve this problem, we use op-quoted-string. +# However, we must actually quote the samples with HT, because +# that parser also treats HT as being part of the word. But thanks +# to the quotes, we can force it to not do that. 
+# rgerhards, 2015-04-30
+add_rule 'version=2'
+add_rule 'rule=:%a:op-quoted-string%%-:whitespace%%b:op-quoted-string%'
+
+execute 'word1 word2' # multiple spaces
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+execute 'word1 word2' # single space
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+execute '"word1" "word2"' # tab (US-ASCII HT)
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+execute '"word1" "word2"' # mix of tab and spaces
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+
+
+cleanup_tmp_files
+
diff --git a/tests/field_whitespace_jsoncnf.sh b/tests/field_whitespace_jsoncnf.sh
new file mode 100755
index 0000000..402c07a
--- /dev/null
+++ b/tests/field_whitespace_jsoncnf.sh
@@ -0,0 +1,37 @@
+# added 2015-03-12 by Rainer Gerhards
+# This file is part of the liblognorm project, released under ASL 2.0
+
+uname -a | grep "SunOS.*5.10"
+if [ $? -eq 0 ] ; then
+ echo platform: `uname -a`
+ echo This looks like solaris 10, we disable known-failing tests to
+ echo permit OpenCSW to build packages. However, these are real failures
+ echo and so a fix should be done as soon as time permits.
+ exit 77
+fi
+. $srcdir/exec.sh
+
+test_def $0 "whitespace parser"
+# the "word" parser unfortunately treats everything except
+# a SP as being in the word. So a HT inside a word is
+# permitted, which does not work well with what we
+# want to test here. To solve this problem, we use op-quoted-string.
+# However, we must actually quote the samples with HT, because
+# that parser also treats HT as being part of the word. But thanks
+# to the quotes, we can force it to not do that.
+# rgerhards, 2015-04-30
+add_rule 'version=2'
+add_rule 'rule=:%{"name":"a", "type":"op-quoted-string"}%%{"name":"-", "type":"whitespace"}%%{"name":"b", "type":"op-quoted-string"}%'
+
+execute 'word1 word2' # multiple spaces
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+execute 'word1 word2' # single space
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+execute '"word1" "word2"' # tab (US-ASCII HT)
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+execute '"word1" "word2"' # mix of tab and spaces
+assert_output_json_eq '{ "b": "word2", "a": "word1" }'
+
+
+cleanup_tmp_files
+
diff --git a/tests/field_whitespace_v1.sh b/tests/field_whitespace_v1.sh
new file mode 100755
index 0000000..059e569
--- /dev/null
+++ b/tests/field_whitespace_v1.sh
@@ -0,0 +1,36 @@
+# added 2015-03-12 by Rainer Gerhards
+# This file is part of the liblognorm project, released under ASL 2.0
+
+uname -a | grep "SunOS.*5.10"
+if [ $? -eq 0 ] ; then
+ echo platform: `uname -a`
+ echo This looks like solaris 10, we disable known-failing tests to
+ echo permit OpenCSW to build packages. However, these are real failures
+ echo and so a fix should be done as soon as time permits.
+ exit 77
+fi
+. $srcdir/exec.sh
+
+test_def $0 "whitespace parser"
+# the "word" parser unfortunately treats everything except
+# a SP as being in the word. So a HT inside a word is
+# permitted, which does not work well with what we
+# want to test here. To solve this problem, we use op-quoted-string.
+# However, we must actually quote the samples with HT, because
+# that parser also treats HT as being part of the word. But thanks
+# to the quotes, we can force it to not do that.
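+# Because op-quoted-string strips the quotes again, the assertions below
+# expect plain word1/word2 even for the quoted samples; the whitespace
+# field is named "-", so the matched whitespace itself is discarded from
+# the result rather than captured.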
+# rgerhards, 2015-04-30 +add_rule 'rule=:%a:op-quoted-string%%-:whitespace%%b:op-quoted-string%' + +execute 'word1 word2' # multiple spaces +assert_output_json_eq '{ "b": "word2", "a": "word1" }' +execute 'word1 word2' # single space +assert_output_json_eq '{ "b": "word2", "a": "word1" }' +execute '"word1" "word2"' # tab (US-ASCII HT) +assert_output_json_eq '{ "b": "word2", "a": "word1" }' +execute '"word1" "word2"' # mix of tab and spaces +assert_output_json_eq '{ "b": "word2", "a": "word1" }' + + +cleanup_tmp_files + diff --git a/tests/include.sh b/tests/include.sh new file mode 100755 index 0000000..5c5bd7b --- /dev/null +++ b/tests/include.sh @@ -0,0 +1,20 @@ +# added 2015-08-28 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "include (success case)" +reset_rules +add_rule 'version=2' +add_rule 'include=inc.rulebase' + +reset_rules inc +add_rule 'version=2' inc +add_rule 'rule=:%field:mac48%' inc + +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +# single test is sufficient, because that only works if the include +# worked ;) + +cleanup_tmp_files diff --git a/tests/include_RULEBASES.sh b/tests/include_RULEBASES.sh new file mode 100755 index 0000000..136d77d --- /dev/null +++ b/tests/include_RULEBASES.sh @@ -0,0 +1,25 @@ +# added 2015-08-28 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "include (via LIBLOGNORM_RULEBASES directory)" +reset_rules +add_rule 'version=2' +add_rule 'include=inc.rulebase' + +reset_rules inc +add_rule 'version=2' inc +add_rule 'rule=:%field:mac48%' inc + +mv inc.rulebase /tmp + +export LIBLOGNORM_RULEBASES=/tmp +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +export LIBLOGNORM_RULEBASES=/tmp/ +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +rm /tmp/inc.rulebase +cleanup_tmp_files diff --git a/tests/json_eq.c b/tests/json_eq.c new file mode 100644 index 0000000..5875ac4 --- /dev/null +++ b/tests/json_eq.c @@ -0,0 +1,93 @@ +#include "config.h" +#include +#include +#include +#include +#include +#include "internal.h" + + +typedef struct json_object obj; + +static int eq(obj* expected, obj* actual); + +static int obj_eq(obj* expected, obj* actual) { + int eql = 1; + struct json_object_iterator it = json_object_iter_begin(expected); + struct json_object_iterator itEnd = json_object_iter_end(expected); + while (!json_object_iter_equal(&it, &itEnd)) { + obj *actual_val; + json_object_object_get_ex(actual, json_object_iter_peek_name(&it), &actual_val); + eql &= eq(json_object_iter_peek_value(&it), actual_val); + json_object_iter_next(&it); + } + return eql; +} + +static int arr_eq(obj* expected, obj* actual) { + int eql = 1; + int expected_len = json_object_array_length(expected); + int actual_len = json_object_array_length(actual); + if (expected_len != actual_len) return 0; + for (int i = 0; i < expected_len; i++) { + obj* exp = json_object_array_get_idx(expected, i); + obj* act = json_object_array_get_idx(actual, i); + eql &= eq(exp, act); + } + return eql; +} + +static int str_eq(obj* expected, obj* actual) { + const char* exp_str = json_object_to_json_string(expected); + const char* act_str = json_object_to_json_string(actual); + return strcmp(exp_str, act_str) == 0; +} + +static int eq(obj* expected, obj* actual) { + if (expected == NULL && actual == NULL) { + return 1; + } else if (expected == 
NULL) {
+ return 0;
+ } else if (actual == NULL) {
+ return 0;
+ }
+ enum json_type expected_type = json_object_get_type(expected);
+ enum json_type actual_type = json_object_get_type(actual);
+ if (expected_type != actual_type) return 0;
+ switch(expected_type) {
+ case json_type_null:
+ return 1;
+ case json_type_boolean:
+ return json_object_get_boolean(expected) == json_object_get_boolean(actual);
+ case json_type_double:
+ return (fabs(json_object_get_double(expected) - json_object_get_double(actual)) < 0.001);
+ case json_type_int:
+ return json_object_get_int64(expected) == json_object_get_int64(actual);
+ case json_type_object:
+ return obj_eq(expected, actual);
+ case json_type_array:
+ return arr_eq(expected, actual);
+ case json_type_string:
+ return str_eq(expected, actual);
+ default:
+ fprintf(stderr, "unexpected type in %s:%d\n", __FILE__, __LINE__);
+ abort();
+ }
+ return 0;
+}
+
+int main(int argc, char** argv) {
+ if (argc != 3) {
+ fprintf(stderr, "expected and actual json not given, number of args was: %d\n", argc);
+ exit(100);
+ }
+ obj* expected = json_tokener_parse(argv[1]);
+ obj* actual = json_tokener_parse(argv[2]);
+ int result = eq(expected, actual) ? 0 : 1;
+ json_object_put(expected);
+ json_object_put(actual);
+ if (result != 0) {
+ printf("JSONs weren't equal. \n\tExpected: \n\t\t%s\n\tActual: \n\t\t%s\n", argv[1], argv[2]);
+ }
+ return result;
+}
diff --git a/tests/literal.sh b/tests/literal.sh
new file mode 100755
index 0000000..e96abcb
--- /dev/null
+++ b/tests/literal.sh
@@ -0,0 +1,30 @@
+# added 2016-12-21 by Rainer Gerhards
+# This file is part of the liblognorm project, released under ASL 2.0
+
+. $srcdir/exec.sh
+test_def $0 "using names with literal"
+
+add_rule 'version=2'
+add_rule 'rule=:%{"type":"literal", "text":"a", "name":"var"}%'
+execute 'a'
+assert_output_json_eq '{ "var": "a" }'
+
+reset_rules
+add_rule 'version=2'
+add_rule 'rule=:Test %{"type":"literal", "text":"a", "name":"var"}%'
+execute 'Test a'
+assert_output_json_eq '{ "var": "a" }'
+
+reset_rules
+add_rule 'version=2'
+add_rule 'rule=:Test %{"type":"literal", "text":"a", "name":"var"}% End'
+execute 'Test a End'
+assert_output_json_eq '{ "var": "a" }'
+
+reset_rules
+add_rule 'version=2'
+add_rule 'rule=:a %[{"name":"num", "type":"number"}, {"name":"colon", "type":"literal", "text":":"}, {"name":"hex", "type":"hexnumber"}]% b'
+execute 'a 4711:0x4712 b'
+assert_output_json_eq '{ "hex": "0x4712", "colon": ":", "num": "4711" }'
+
+cleanup_tmp_files
diff --git a/tests/lognormalizer-invld-call.sh b/tests/lognormalizer-invld-call.sh
new file mode 100755
index 0000000..0ca29d1
--- /dev/null
+++ b/tests/lognormalizer-invld-call.sh
@@ -0,0 +1,13 @@
+# This file is part of the liblognorm project, released under ASL 2.0
+
+echo running test $0
+../src/lognormalizer
+if [ $? -eq 0 ]; then
+ echo "FAIL: lognormalizer did not detect missing rulebase"
+ exit 1
+fi
+../src/lognormalizer -r test -R test
+if [ $? -eq 0 ]; then
+ echo "FAIL: lognormalizer did not detect both -r and -R given"
+ exit 1
+fi
diff --git a/tests/missing_line_ending.sh b/tests/missing_line_ending.sh
new file mode 100755
index 0000000..0cd6218
--- /dev/null
+++ b/tests/missing_line_ending.sh
@@ -0,0 +1,26 @@
+# added 2015-05-05 by Rainer Gerhards
+# This file is part of the liblognorm project, released under ASL 2.0
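+# The last rule below is added via add_rule_no_LF, i.e. presumably written
+# without a trailing line feed, to verify that a rulebase whose final line
+# lacks a newline still loads and matches.
+. 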
$srcdir/exec.sh + +test_def $0 "missing line ending" + +reset_rules +add_rule 'version=2' +add_rule_no_LF 'rule=:%field:mac48%' + +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +execute 'f0-f6-1c-5f-cc-a2' +assert_output_json_eq '{"field": "f0-f6-1c-5f-cc-a2"}' + +# things that need to NOT match + +execute 'f0-f6:1c:5f:cc-a2' +assert_output_json_eq '{ "originalmsg": "f0-f6:1c:5f:cc-a2", "unparsed-data": "f0-f6:1c:5f:cc-a2" }' + +execute 'f0:f6:1c:xf:cc:a2' +assert_output_json_eq '{ "originalmsg": "f0:f6:1c:xf:cc:a2", "unparsed-data": "f0:f6:1c:xf:cc:a2" }' + + +cleanup_tmp_files diff --git a/tests/missing_line_ending_v1.sh b/tests/missing_line_ending_v1.sh new file mode 100755 index 0000000..0d9c2aa --- /dev/null +++ b/tests/missing_line_ending_v1.sh @@ -0,0 +1,24 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +test_def $0 "missing line ending (v1)" +reset_rules +add_rule_no_LF 'rule=:%field:mac48%' + +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}' + +execute 'f0-f6-1c-5f-cc-a2' +assert_output_json_eq '{"field": "f0-f6-1c-5f-cc-a2"}' + +# things that need to NOT match + +execute 'f0-f6:1c:5f:cc-a2' +assert_output_json_eq '{ "originalmsg": "f0-f6:1c:5f:cc-a2", "unparsed-data": "f0-f6:1c:5f:cc-a2" }' + +execute 'f0:f6:1c:xf:cc:a2' +assert_output_json_eq '{ "originalmsg": "f0:f6:1c:xf:cc:a2", "unparsed-data": "f0:f6:1c:xf:cc:a2" }' + + +cleanup_tmp_files diff --git a/tests/names.sh b/tests/names.sh new file mode 100755 index 0000000..47b4bae --- /dev/null +++ b/tests/names.sh @@ -0,0 +1,33 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "using names with literal" +add_rule 'version=2' +add_rule 'rule=:a %[{"name":"num", "type":"number"}, {"type":"literal", "text":":"}, {"name":"hex", "type":"hexnumber"}]% b' +execute 'a 4711:0x4712 b' +assert_output_json_eq '{ "hex": "0x4712", "num": "4711" }' + +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %[{"name":"num", "type":"number"}, {"name":"literal", "type":"literal", "text":":"}, {"name":"hex", "type":"hexnumber"}]% b' +execute 'a 4711:0x4712 b' +assert_output_json_eq '{ "hex": "0x4712", "num": "4711" }' + +# check that "-" is still discarded +reset_rules +add_rule 'version=2' +add_rule 'rule=:a %[{"name":"num", "type":"number"}, {"name":"-", "type":"literal", "text":":"}, {"name":"hex", "type":"hexnumber"}]% b' +execute 'a 4711:0x4712 b' +assert_output_json_eq '{ "hex": "0x4712", "num": "4711" }' + + +# now let's check old style. Here we need "-". 
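+# With the old-style %name:type% syntax there is no separate "name" attribute,
+# so "-" is the only way to parse-and-discard a value: %-:number% below still
+# has to match 4711, but only "hex" shows up in the result. A hypothetical
+# variant that discards both values, e.g.
+#   add_rule 'rule=:a %-:number%:%-:hexnumber% b'
+# would be expected to leave no fields in the output at all.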
+
+reset_rules
+add_rule 'version=2'
+add_rule 'rule=:a %-:number%:%hex:hexnumber% b'
+execute 'a 4711:0x4712 b'
+assert_output_json_eq '{ "hex": "0x4712" }'
+
+cleanup_tmp_files
diff --git a/tests/options.sh.in b/tests/options.sh.in
new file mode 100644
index 0000000..79c7da0
--- /dev/null
+++ b/tests/options.sh.in
@@ -0,0 +1,6 @@
+use_valgrind=@VALGRIND@
+
+echo "Using valgrind: $use_valgrind"
+if [ $use_valgrind == "yes" ]; then
+ cmd="valgrind --error-exitcode=191 --malloc-fill=ff --free-fill=fe --leak-check=full --trace-children=yes $cmd"
+fi
diff --git a/tests/parser_LF.sh b/tests/parser_LF.sh
new file mode 100755
index 0000000..b8bc889
--- /dev/null
+++ b/tests/parser_LF.sh
@@ -0,0 +1,27 @@
+# added 2015-07-15 by Rainer Gerhards
+# This checks if whitespace inside parser definitions is properly treated
+# This file is part of the liblognorm project, released under ASL 2.0
+
+uname -a | grep "SunOS.*5.10"
+if [ $? -eq 0 ] ; then
+ echo platform: `uname -a`
+ echo This looks like solaris 10, we disable known-failing tests to
+ echo permit OpenCSW to build packages. However, these are real failures
+ echo and so a fix should be done as soon as time permits.
+ exit 77
+fi
+
+. $srcdir/exec.sh
+test_def $0 "LF in parser definition"
+
+add_rule 'rule=:here is a number %
+ num:hexnumber
+ % in hex form'
+execute 'here is a number 0x1234 in hex form'
+assert_output_json_eq '{"num": "0x1234"}'
+
+#check cases where parsing failure must occur
+execute 'here is a number 0x1234in hex form'
+assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }'
+
+cleanup_tmp_files
diff --git a/tests/parser_LF_jsoncnf.sh b/tests/parser_LF_jsoncnf.sh
new file mode 100755
index 0000000..56e8548
--- /dev/null
+++ b/tests/parser_LF_jsoncnf.sh
@@ -0,0 +1,19 @@
+# added 2015-07-15 by Rainer Gerhards
+# This checks if whitespace inside parser definitions is properly treated
+# This file is part of the liblognorm project, released under ASL 2.0
+
+. $srcdir/exec.sh
+test_def $0 "LF in parser definition"
+
+add_rule 'version=2'
+add_rule 'rule=:here is a number %{
+ "name":"num", "type":"hexnumber"
+ }% in hex form'
+execute 'here is a number 0x1234 in hex form'
+assert_output_json_eq '{"num": "0x1234"}'
+
+#check cases where parsing failure must occur
+execute 'here is a number 0x1234in hex form'
+assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }'
+
+cleanup_tmp_files
diff --git a/tests/parser_prios.sh b/tests/parser_prios.sh
new file mode 100755
index 0000000..c9d706a
--- /dev/null
+++ b/tests/parser_prios.sh
@@ -0,0 +1,38 @@
+# added 2015-05-05 by Rainer Gerhards
+# This file is part of the liblognorm project, released under ASL 2.0
+. $srcdir/exec.sh
+
+test_def $0 "parser priorities, simple case"
+add_rule 'version=2'
+add_rule 'rule=:%{"name":"field", "type":"mac48"}%'
+add_rule 'rule=:%{"name":"rest", "type":"rest"}%'
+
+execute 'f0:f6:1c:5f:cc:a2'
+assert_output_json_eq '{"field": "f0:f6:1c:5f:cc:a2"}'
+
+execute 'f0-f6-1c-5f-cc-a2'
+assert_output_json_eq '{"field": "f0-f6-1c-5f-cc-a2"}'
+
+# things that need to match rest
+
+execute 'f0-f6:1c:5f:cc-a2'
+assert_output_json_eq '{ "rest": "f0-f6:1c:5f:cc-a2" }'
+
+
+# now the same with inverted priorities. We should now always have
+# rest matches.
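+# As the two cases in this file suggest, the numerically smaller "priority"
+# value wins: rest is assigned 10 and mac48 100 below, so rest is chosen even
+# for inputs that are valid MAC addresses, whereas in the default case above
+# the more specific mac48 parser was preferred.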
+reset_rules +add_rule 'version=2' +add_rule 'rule=:%{"name":"field", "type":"mac48", "priority":100}%' +add_rule 'rule=:%{"name":"rest", "type":"rest", "priority":10}%' + +execute 'f0:f6:1c:5f:cc:a2' +assert_output_json_eq '{"rest": "f0:f6:1c:5f:cc:a2"}' + +execute 'f0-f6-1c-5f-cc-a2' +assert_output_json_eq '{"rest": "f0-f6-1c-5f-cc-a2"}' + +execute 'f0-f6:1c:5f:cc-a2' +assert_output_json_eq '{ "rest": "f0-f6:1c:5f:cc-a2" }' + +cleanup_tmp_files diff --git a/tests/parser_whitespace.sh b/tests/parser_whitespace.sh new file mode 100755 index 0000000..91ccdc5 --- /dev/null +++ b/tests/parser_whitespace.sh @@ -0,0 +1,25 @@ +# added 2015-07-15 by Rainer Gerhards +# This checks if whitespace inside parser definitions is properly treated +# This file is part of the liblognorm project, released under ASL 2.0 + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi + +. $srcdir/exec.sh +test_def $0 "whitespace in parser definition" + +add_rule 'rule=:here is a number % num:hexnumber % in hex form' +execute 'here is a number 0x1234 in hex form' +assert_output_json_eq '{"num": "0x1234"}' + +#check cases where parsing failure must occur +execute 'here is a number 0x1234in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }' + +cleanup_tmp_files diff --git a/tests/parser_whitespace_jsoncnf.sh b/tests/parser_whitespace_jsoncnf.sh new file mode 100755 index 0000000..89fb744 --- /dev/null +++ b/tests/parser_whitespace_jsoncnf.sh @@ -0,0 +1,17 @@ +# added 2015-07-15 by Rainer Gerhards +# This checks if whitespace inside parser definitions is properly treated +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh +test_def $0 "whitespace in parser definition" + +add_rule 'version=2' +add_rule 'rule=:here is a number % {"name":"num", "type":"hexnumber"} % in hex form' +execute 'here is a number 0x1234 in hex form' +assert_output_json_eq '{"num": "0x1234"}' + +#check cases where parsing failure must occur +execute 'here is a number 0x1234in hex form' +assert_output_json_eq '{ "originalmsg": "here is a number 0x1234in hex form", "unparsed-data": "0x1234in hex form" }' + +cleanup_tmp_files diff --git a/tests/repeat_alternative_nested.sh b/tests/repeat_alternative_nested.sh new file mode 100755 index 0000000..5332658 --- /dev/null +++ b/tests/repeat_alternative_nested.sh @@ -0,0 +1,35 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple alternative syntax" +add_rule 'version=2' +add_rule 'rule=:a %{"name":"numbers", "type":"repeat", + "parser": + { "type":"alternative", "parser": [ + [ {"type":"number", "name":"n1"}, + {"type":"literal", "text":":"}, + {"type":"number", "name":"n2"}, + ], + {"type":"hexnumber", "name":"hex"} + ] + }, + "while":[ + {"type":"literal", "text":", "} + ] + }% b' + +execute 'a 1:2, 3:4, 5:6, 7:8 b' +assert_output_json_eq '{ "numbers": [ { "n2": "2", "n1": "1" }, { "n2": "4", "n1": "3" }, { "n2": "6", "n1": "5" }, { "n2": "8", "n1": "7" } ] }' + +execute 'a 0x4711 b' +assert_output_json_eq '{ "numbers": [ { "hex": "0x4711" } ] }' + +# note: 0x4711, 1:2 does not work because hexnumber expects a SP after +# the number! 
Thus we use the reverse. We could add this case once +# we have added an option for more relaxed matching to hexnumber. +execute 'a 1:2, 0x4711 b' +assert_output_json_eq '{ "numbers": [ { "n2": "2", "n1": "1" }, { "hex": "0x4711" } ] }' + +cleanup_tmp_files diff --git a/tests/repeat_mismatch_in_while.sh b/tests/repeat_mismatch_in_while.sh new file mode 100755 index 0000000..843b975 --- /dev/null +++ b/tests/repeat_mismatch_in_while.sh @@ -0,0 +1,38 @@ +# added 2015-08-26 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +# This is based on a practical support case, see +# https://github.com/rsyslog/liblognorm/issues/130 + +. $srcdir/exec.sh + +test_def $0 "repeat with mismatch in parser part" + +reset_rules +add_rule 'version=2' +add_rule 'prefix=%timestamp:date-rfc3164% %hostname:word%' +add_rule 'rule=cisco,fwblock: \x25ASA-6-106015\x3a Deny %proto:word% (no connection) from %source:cisco-interface-spec% to %dest:cisco-interface-spec% flags %flags:repeat{ "parser": {"type":"word", "name":"."}, "while":{"type":"literal", "text":" "} }% on interface %srciface:word%' + +echo step 1 +execute 'Aug 18 13:18:45 192.168.99.2 %ASA-6-106015: Deny TCP (no connection) from 173.252.88.66/443 to 76.79.249.222/52746 flags RST on interface outside' +assert_output_json_eq '{ "originalmsg": "Aug 18 13:18:45 192.168.99.2 %ASA-6-106015: Deny TCP (no connection) from 173.252.88.66\/443 to 76.79.249.222\/52746 flags RST on interface outside", "unparsed-data": "RST on interface outside" }' + +# now check case where we permit a mismatch inside the parser part and still +# accept this as valid. This is needed for some use cases. See github +# issue mentioned above for more details. +# Note: there is something odd with the testbench driver: I cannot use two +# consequtiuve spaces +reset_rules +add_rule 'version=2' +add_rule 'prefix=%timestamp:date-rfc3164% %hostname:word%' +add_rule 'rule=cisco,fwblock: \x25ASA-6-106015\x3a Deny %proto:word% (no connection) from %source:cisco-interface-spec% to %dest:cisco-interface-spec% flags %flags:repeat{ "option.permitMismatchInParser":true, "parser": {"type":"word", "name":"."}, "while":{"type":"literal", "text":" "} }%\x20 on interface %srciface:word%' + +echo step 2 +execute 'Aug 18 13:18:45 192.168.99.2 %ASA-6-106015: Deny TCP (no connection) from 173.252.88.66/443 to 76.79.249.222/52746 flags RST on interface outside' +assert_output_json_eq '{ "srciface": "outside", "flags": [ "RST" ], "dest": { "ip": "76.79.249.222", "port": "52746" }, "source": { "ip": "173.252.88.66", "port": "443" }, "proto": "TCP", "hostname": "192.168.99.2", "timestamp": "Aug 18 13:18:45" }' + +echo step 3 +execute 'Aug 18 13:18:45 192.168.99.2 %ASA-6-106015: Deny TCP (no connection) from 173.252.88.66/443 to 76.79.249.222/52746 flags RST XST on interface outside' +assert_output_json_eq '{ "srciface": "outside", "flags": [ "RST", "XST" ], "dest": { "ip": "76.79.249.222", "port": "52746" }, "source": { "ip": "173.252.88.66", "port": "443" }, "proto": "TCP", "hostname": "192.168.99.2", "timestamp": "Aug 18 13:18:45" }' + + +cleanup_tmp_files diff --git a/tests/repeat_simple.sh b/tests/repeat_simple.sh new file mode 100755 index 0000000..48a6348 --- /dev/null +++ b/tests/repeat_simple.sh @@ -0,0 +1,22 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. 
$srcdir/exec.sh + +test_def $0 "simple repeat syntax" +add_rule 'version=2' +add_rule 'rule=:a %{"name":"numbers", "type":"repeat", + "parser":[ + {"name":"n1", "type":"number"}, + {"type":"literal", "text":":"}, + {"name":"n2", "type":"number"} + ], + "while":[ + {"type":"literal", "text":", "} + ] + }% b %w:word% +' +execute 'a 1:2, 3:4, 5:6, 7:8 b test' +assert_output_json_eq '{ "w": "test", "numbers": [ { "n2": "2", "n1": "1" }, { "n2": "4", "n1": "3" }, { "n2": "6", "n1": "5" }, { "n2": "8", "n1": "7" } ] }' + +cleanup_tmp_files diff --git a/tests/repeat_very_simple.sh b/tests/repeat_very_simple.sh new file mode 100755 index 0000000..f03d27d --- /dev/null +++ b/tests/repeat_very_simple.sh @@ -0,0 +1,18 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "very simple repeat syntax" +add_rule 'version=2' +add_rule 'rule=:a %{"name":"numbers", "type":"repeat", + "parser": + {"name":"n", "type":"number"}, + "while": + {"type":"literal", "text":", "} + }% b %w:word% +' +execute 'a 1, 2, 3, 4 b test' +assert_output_json_eq '{ "w": "test", "numbers": [ { "n": "1" }, { "n": "2" }, { "n": "3" }, { "n": "4" } ] }' + +cleanup_tmp_files diff --git a/tests/repeat_while_alternative.sh b/tests/repeat_while_alternative.sh new file mode 100755 index 0000000..55bab72 --- /dev/null +++ b/tests/repeat_while_alternative.sh @@ -0,0 +1,25 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "repeat syntax with alternative terminators" +add_rule 'version=2' +add_rule 'rule=:a %{"name":"numbers", "type":"repeat", + "parser":[ + {"name":"n1", "type":"number"}, + {"type":"literal", "text":":"}, + {"name":"n2", "type":"number"} + ], + "while": { + "type":"alternative", "parser": [ + {"type":"literal", "text":", "}, + {"type":"literal", "text":","} + ] + } + }% b %w:word% +' +execute 'a 1:2, 3:4,5:6, 7:8 b test' +assert_output_json_eq '{ "w": "test", "numbers": [ { "n2": "2", "n1": "1" }, { "n2": "4", "n1": "3" }, { "n2": "6", "n1": "5" }, { "n2": "8", "n1": "7" } ] }' + +cleanup_tmp_files diff --git a/tests/rule_last_str_long.sh b/tests/rule_last_str_long.sh new file mode 100755 index 0000000..ad1e6ce --- /dev/null +++ b/tests/rule_last_str_long.sh @@ -0,0 +1,28 @@ + +uname -a | grep "SunOS.*5.10" +if [ $? -eq 0 ] ; then + echo platform: `uname -a` + echo This looks like solaris 10, we disable known-failing tests to + echo permit OpenCSW to build packages. However, this are real failurs + echo and so a fix should be done as soon as time permits. + exit 77 +fi +. $srcdir/exec.sh + +test_def $0 "multiple formats including string (see also: rule_last_str_short.sh)" +add_rule 'version=2' +add_rule 'rule=:%string:string%' +add_rule 'rule=:before %string:string%' +add_rule 'rule=:%string:string% after' +add_rule 'rule=:before %string:string% after' +add_rule 'rule=:before %string:string% middle %string:string%' + +execute 'string' +execute 'before string' +execute 'string after' +execute 'before string after' +execute 'before string middle string' +assert_output_json_eq '{"string": "string" }' '{"string": "string" }''{"string": "string" }''{"string": "string" }''{"string": "string", "string": "string" }' + + +cleanup_tmp_files diff --git a/tests/rule_last_str_short.sh b/tests/rule_last_str_short.sh new file mode 100755 index 0000000..bc9d690 --- /dev/null +++ b/tests/rule_last_str_short.sh @@ -0,0 +1,13 @@ +. 
$srcdir/exec.sh + +test_def $0 "string being last in a rule (see also: rule_last_str_long.sh)" + +add_rule 'version=2' +add_rule 'rule=:%string:string%' + +execute 'string' + +assert_output_json_eq '{"string": "string" }' + + +cleanup_tmp_files diff --git a/tests/runaway_rule.sh b/tests/runaway_rule.sh new file mode 100755 index 0000000..abb9643 --- /dev/null +++ b/tests/runaway_rule.sh @@ -0,0 +1,19 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +# Note that this test produces an error message, as it encouters the +# runaway rule. This is OK and actually must happen. The prime point +# of the test is that it correctly loads the second rule, which +# would otherwise be consumed by the runaway rule. +. $srcdir/exec.sh + +test_def $0 "runaway rule (unmatched percent signs)" + +reset_rules +add_rule 'version=2' +add_rule 'rule=:test %f1:word unmatched percent' +add_rule 'rule=:%field:word%' + +execute 'data' +assert_output_json_eq '{"field": "data"}' + +cleanup_tmp_files diff --git a/tests/runaway_rule_comment.sh b/tests/runaway_rule_comment.sh new file mode 100755 index 0000000..a377a0c --- /dev/null +++ b/tests/runaway_rule_comment.sh @@ -0,0 +1,21 @@ +# added 2015-09-16 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +# Note that this test produces an error message, as it encouters the +# runaway rule. This is OK and actually must happen. The prime point +# of the test is that it correctly loads the second rule, which +# would otherwise be consumed by the runaway rule. +. $srcdir/exec.sh + +test_def $0 "runaway rule with comment lines (v2)" + +reset_rules +add_rule 'version=2' +add_rule 'rule=:test %f1:word unmatched percent' +add_rule '' +add_rule '#comment' +add_rule 'rule=:%field:word%' + +execute 'data' +assert_output_json_eq '{"field": "data"}' + +cleanup_tmp_files diff --git a/tests/runaway_rule_comment_v1.sh b/tests/runaway_rule_comment_v1.sh new file mode 100755 index 0000000..8c2ae1e --- /dev/null +++ b/tests/runaway_rule_comment_v1.sh @@ -0,0 +1,20 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +# Note that this test produces an error message, as it encouters the +# runaway rule. This is OK and actually must happen. The prime point +# of the test is that it correctly loads the second rule, which +# would otherwise be consumed by the runaway rule. +. $srcdir/exec.sh + +test_def $0 "runaway rule with comment lines (v1)" + +reset_rules +add_rule 'rule=:test %f1:word unmatched percent' +add_rule '' +add_rule '#comment' +add_rule 'rule=:%field:word%' + +execute 'data' +assert_output_json_eq '{"field": "data"}' + +cleanup_tmp_files diff --git a/tests/runaway_rule_v1.sh b/tests/runaway_rule_v1.sh new file mode 100755 index 0000000..e626305 --- /dev/null +++ b/tests/runaway_rule_v1.sh @@ -0,0 +1,18 @@ +# added 2015-05-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 +# Note that this test produces an error message, as it encouters the +# runaway rule. This is OK and actually must happen. The prime point +# of the test is that it correctly loads the second rule, which +# would otherwise be consumed by the runaway rule. +. 
$srcdir/exec.sh + +test_def $0 "runaway rule (unmatched percent signs) v1 version" + +reset_rules +add_rule 'rule=:test %f1:word unmatched percent' +add_rule 'rule=:%field:word%' + +execute 'data' +assert_output_json_eq '{"field": "data"}' + +cleanup_tmp_files diff --git a/tests/seq_simple.sh b/tests/seq_simple.sh new file mode 100755 index 0000000..3c22067 --- /dev/null +++ b/tests/seq_simple.sh @@ -0,0 +1,12 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple sequence (array) syntax" +add_rule 'version=2' +add_rule 'rule=:a %[{"name":"num", "type":"number"}, {"name":"-", "type":"literal", "text":":"}, {"name":"hex", "type":"hexnumber"}]% b' +execute 'a 4711:0x4712 b' +assert_output_json_eq '{ "hex": "0x4712", "num": "4711" }' + +cleanup_tmp_files diff --git a/tests/strict_prefix_actual_sample1.sh b/tests/strict_prefix_actual_sample1.sh new file mode 100755 index 0000000..6820184 --- /dev/null +++ b/tests/strict_prefix_actual_sample1.sh @@ -0,0 +1,15 @@ +# added 2015-11-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "one rule is strict prefix of a longer one" +add_rule 'version=2' +add_rule 'prefix=%timestamp:date-rfc3164% %hostname:word% BL-WLC01: *%programname:char-to:\x3a%: %timestamp:date-rfc3164%.%fracsec:number%:' +add_rule 'rule=wifi: #LOG-3-Q_IND: webauth_redirect.c:1238 read error on server socket, errno=131[...It occurred %count:number% times.!]' +add_rule 'rule=wifi: #LOG-3-Q_IND: webauth_redirect.c:1238 read error on server socket, errno=131' + +execute 'Sep 28 23:53:19 192.168.123.99 BL-WLC01: *dtlArpTask: Sep 28 23:53:19.614: #LOG-3-Q_IND: webauth_redirect.c:1238 read error on server socket, errno=131' +assert_output_json_eq '{ "fracsec": "614", "timestamp": "Sep 28 23:53:19", "programname": "dtlArpTask", "hostname": "192.168.123.99" }' + +cleanup_tmp_files diff --git a/tests/strict_prefix_matching_1.sh b/tests/strict_prefix_matching_1.sh new file mode 100755 index 0000000..f90f8ee --- /dev/null +++ b/tests/strict_prefix_matching_1.sh @@ -0,0 +1,17 @@ +# added 2015-11-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "one rule is strict prefix of a longer one" +add_rule 'version=2' +add_rule 'rule=:a word %w1:word%' +add_rule 'rule=:a word %w1:word% another word %w2:word%' + +execute 'a word w1 another word w2' +assert_output_json_eq '{ "w2": "w2", "w1": "w1" }' + +execute 'a word w1' +assert_output_json_eq '{ "w1": "w1" }' + +cleanup_tmp_files diff --git a/tests/strict_prefix_matching_2.sh b/tests/strict_prefix_matching_2.sh new file mode 100755 index 0000000..b11b144 --- /dev/null +++ b/tests/strict_prefix_matching_2.sh @@ -0,0 +1,21 @@ +# added 2015-11-05 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. 
$srcdir/exec.sh + +test_def $0 "named literal compaction" +add_rule 'version=2' +add_rule 'rule=:a word %w1:word% %l1:literal{"text":"l"}% b' +add_rule 'rule=:a word %w1:word% %l2:literal{"text":"l2"}% b' +add_rule 'rule=:a word %w1:word% l3 b' + +execute 'a word w1 l b' +assert_output_json_eq '{ "l1": "l", "w1": "w1" }' + +execute 'a word w1 l2 b' +assert_output_json_eq '{ "l2": "l2", "w1": "w1" }' + +execute 'a word w1 l3 b' +assert_output_json_eq '{ "w1": "w1" }' + +cleanup_tmp_files diff --git a/tests/string_rb_simple.sh b/tests/string_rb_simple.sh new file mode 100755 index 0000000..3e76373 --- /dev/null +++ b/tests/string_rb_simple.sh @@ -0,0 +1,9 @@ +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple rulebase via string" +execute_with_string 'rule=:%w:word%' 'test' +assert_output_json_eq '{ "w": "test" }' + +cleanup_tmp_files diff --git a/tests/string_rb_simple_2_lines.sh b/tests/string_rb_simple_2_lines.sh new file mode 100755 index 0000000..ef62e57 --- /dev/null +++ b/tests/string_rb_simple_2_lines.sh @@ -0,0 +1,24 @@ +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple rulebase via string" +execute_with_string 'rule=:%w:word% +rule=:%n:number%' 'test' +assert_output_json_eq '{ "w": "test" }' + +execute_with_string 'rule=:%w:word% +rule=:%n:number%' '2' +assert_output_json_eq '{ "n": "2" }' + +#This is a correct word... +execute_with_string 'rule=:%w:word% +rule=:%n:number%' '2.3' +assert_output_json_eq '{ "w": "2.3" }' + +#check error case +execute_with_string 'rule=:%w:word% +rule=:%n:number%' '2 3' +assert_output_json_eq '{ "originalmsg": "2 3", "unparsed-data": " 3" }' + +cleanup_tmp_files diff --git a/tests/usrdef_actual1.sh b/tests/usrdef_actual1.sh new file mode 100755 index 0000000..09892f2 --- /dev/null +++ b/tests/usrdef_actual1.sh @@ -0,0 +1,30 @@ +# added 2015-10-30 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "an actual test case for user-defined types" +add_rule 'version=2' +add_rule 'type=@endpid:%{"type":"alternative","parser":[ {"type": "literal", "text":"]"},{"type": "literal", "text":"]:"} ] }%' +add_rule 'type=@AUTOTYPE1:%iface:char-to:/%/%ip:ipv4%(%port:number%)' +add_rule 'type=@AUTOTYPE1:%iface:char-to:\x3a%\x3a%ip:ipv4%/%port:number%' +add_rule 'type=@AUTOTYPE1:%iface:char-to:\x3a%\x3a%ip:ipv4%' +add_rule 'rule=:a pid[%pid:number%%-:@endpid% b' +add_rule 'rule=:a iface %.:@AUTOTYPE1% b' + +execute 'a pid[4711] b' +assert_output_json_eq '{ "pid": "4711" }' +# the next text needs priority assignment +#execute 'a pid[4712]: b' +#assert_output_json_eq '{ "pid": "4712" }' + +execute 'a iface inside:10.0.0.1 b' +assert_output_json_eq '{ "ip": "10.0.0.1", "iface": "inside" }' + +execute 'a iface inside:10.0.0.1/514 b' +assert_output_json_eq '{ "port": "514", "ip": "10.0.0.1", "iface": "inside" }' + +execute 'a iface inside/10.0.0.1(514) b' +assert_output_json_eq '{ "port": "514", "ip": "10.0.0.1", "iface": "inside" }' + +cleanup_tmp_files diff --git a/tests/usrdef_ipaddr.sh b/tests/usrdef_ipaddr.sh new file mode 100755 index 0000000..263e251 --- /dev/null +++ b/tests/usrdef_ipaddr.sh @@ -0,0 +1,18 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. 
$srcdir/exec.sh + +test_def $0 "simple user-defined type" +add_rule 'version=2' +add_rule 'type=@IPaddr:%ip:ipv4%' +add_rule 'type=@IPaddr:%ip:ipv6%' +add_rule 'rule=:an ip address %.:@IPaddr%' +execute 'an ip address 10.0.0.1' +assert_output_json_eq '{ "ip": "10.0.0.1" }' +execute 'an ip address 127::1' +assert_output_json_eq '{ "ip": "127::1" }' +execute 'an ip address 2001:DB8:0:1::10:1FF' +assert_output_json_eq '{ "ip": "2001:DB8:0:1::10:1FF" }' + +cleanup_tmp_files diff --git a/tests/usrdef_ipaddr_dotdot.sh b/tests/usrdef_ipaddr_dotdot.sh new file mode 100755 index 0000000..6026a38 --- /dev/null +++ b/tests/usrdef_ipaddr_dotdot.sh @@ -0,0 +1,18 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "user-defined type with '..' name" +add_rule 'version=2' +add_rule 'type=@IPaddr:%..:ipv4%' +add_rule 'type=@IPaddr:%..:ipv6%' +add_rule 'rule=:an ip address %ip:@IPaddr%' +execute 'an ip address 10.0.0.1' +assert_output_json_eq '{ "ip": "10.0.0.1" }' +execute 'an ip address 127::1' +assert_output_json_eq '{ "ip": "127::1" }' +execute 'an ip address 2001:DB8:0:1::10:1FF' +assert_output_json_eq '{ "ip": "2001:DB8:0:1::10:1FF" }' + +cleanup_tmp_files diff --git a/tests/usrdef_ipaddr_dotdot2.sh b/tests/usrdef_ipaddr_dotdot2.sh new file mode 100755 index 0000000..d716563 --- /dev/null +++ b/tests/usrdef_ipaddr_dotdot2.sh @@ -0,0 +1,16 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "user-defined type with '..' name embedded in other fields" +add_rule 'version=2' +add_rule 'type=@IPaddr:%..:ipv4%' +add_rule 'type=@IPaddr:%..:ipv6%' +add_rule 'rule=:a word %w1:word% an ip address %ip:@IPaddr% another word %w2:word%' +execute 'a word word1 an ip address 10.0.0.1 another word word2' +assert_output_json_eq '{ "w2": "word2", "ip": "10.0.0.1", "w1": "word1" }' +execute 'a word word1 an ip address 2001:DB8:0:1::10:1FF another word word2' +assert_output_json_eq '{ "w2": "word2", "ip": "2001:DB8:0:1::10:1FF", "w1": "word1" }' + +cleanup_tmp_files diff --git a/tests/usrdef_ipaddr_dotdot3.sh b/tests/usrdef_ipaddr_dotdot3.sh new file mode 100755 index 0000000..5641d51 --- /dev/null +++ b/tests/usrdef_ipaddr_dotdot3.sh @@ -0,0 +1,21 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "user-defined type with '..' 
name embedded in other fields" +add_rule 'version=2' +add_rule 'type=@IPaddr:%..:ipv4%' +add_rule 'type=@IPaddr:%..:ipv6%' +add_rule 'type=@ipOrNumber:%..:@IPaddr{"priority":"1000"}%' +add_rule 'type=@ipOrNumber:%..:number%' +#add_rule 'type=@ipOrNumber:%..:@IPaddr%' # if we enable this instead of the above, the test would break +add_rule 'rule=:a word %w1:word% an ip address %ip:@ipOrNumber% another word %w2:word%' +execute 'a word word1 an ip address 10.0.0.1 another word word2' +assert_output_json_eq '{ "w2": "word2", "ip": "10.0.0.1", "w1": "word1" }' +execute 'a word word1 an ip address 2001:DB8:0:1::10:1FF another word word2' +assert_output_json_eq '{ "w2": "word2", "ip": "2001:DB8:0:1::10:1FF", "w1": "word1" }' +execute 'a word word1 an ip address 111 another word word2' +assert_output_json_eq '{ "w2": "word2", "ip": "111", "w1": "word1" }' + +cleanup_tmp_files diff --git a/tests/usrdef_simple.sh b/tests/usrdef_simple.sh new file mode 100755 index 0000000..755a80f --- /dev/null +++ b/tests/usrdef_simple.sh @@ -0,0 +1,13 @@ +# added 2015-07-22 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "simple user-defined type" +add_rule 'version=2' +add_rule 'type=@hex-byte:%f1:hexnumber{"maxval": "255"}%' +add_rule 'rule=:a word %w1:word% a byte % .:@hex-byte % another word %w2:word%' +execute 'a word w1 a byte 0xff another word w2' +assert_output_json_eq '{ "w2": "w2", "f1": "0xff", "w1": "w1" }' + +cleanup_tmp_files diff --git a/tests/usrdef_two.sh b/tests/usrdef_two.sh new file mode 100755 index 0000000..f2dacda --- /dev/null +++ b/tests/usrdef_two.sh @@ -0,0 +1,18 @@ +# added 2015-10-30 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "user-defined type with two alternatives" +add_rule 'version=2' +add_rule 'type=@hex-byte:%f1:hexnumber{"maxval": "255"}%' +add_rule 'type=@hex-byte:%f1:word%' +add_rule 'rule=:a word %w1:word% a byte % .:@hex-byte % another word %w2:word%' + +execute 'a word w1 a byte 0xff another word w2' +assert_output_json_eq '{ "w2": "w2", "f1": "0xff", "w1": "w1" }' + +execute 'a word w1 a byte TEST another word w2' +assert_output_json_eq '{ "w2": "w2", "f1": "TEST", "w1": "w1" }' + +cleanup_tmp_files diff --git a/tests/usrdef_twotypes.sh b/tests/usrdef_twotypes.sh new file mode 100755 index 0000000..3868002 --- /dev/null +++ b/tests/usrdef_twotypes.sh @@ -0,0 +1,15 @@ +# added 2015-10-30 by Rainer Gerhards +# This file is part of the liblognorm project, released under ASL 2.0 + +. $srcdir/exec.sh + +test_def $0 "two user-defined types" +add_rule 'version=2' +add_rule 'type=@hex-byte:%f1:hexnumber{"maxval": "255"}%' +add_rule 'type=@word-type:%w1:word%' +add_rule 'rule=:a word %.:@word-type% a byte % .:@hex-byte % another word %w2:word%' + +execute 'a word w1 a byte 0xff another word w2' +assert_output_json_eq '{ "w2": "w2", "f1": "0xff", "w1": "w1" }' + +cleanup_tmp_files diff --git a/tests/very_long_logline.sh b/tests/very_long_logline.sh new file mode 100755 index 0000000..e363997 --- /dev/null +++ b/tests/very_long_logline.sh @@ -0,0 +1,16 @@ +# added 2015-09-21 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. 
$srcdir/exec.sh + +msg="foo" +for i in $(seq 1 10); do + msg="${msg},${msg},abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ${i}" +done + +test_def $0 "float field" +add_rule 'rule=:%line:rest%' +execute $msg +assert_output_json_eq "{\"line\": \"$msg\"}" + +cleanup_tmp_files + diff --git a/tests/very_long_logline_jsoncnf.sh b/tests/very_long_logline_jsoncnf.sh new file mode 100755 index 0000000..272f697 --- /dev/null +++ b/tests/very_long_logline_jsoncnf.sh @@ -0,0 +1,17 @@ +# added 2015-09-21 by singh.janmejay +# This file is part of the liblognorm project, released under ASL 2.0 +. $srcdir/exec.sh + +msg="foo" +for i in $(seq 1 10); do + msg="${msg},${msg},abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ${i}" +done + +test_def $0 "float field" +add_rule 'version=2' +add_rule 'rule=:%{"name":"line", "type":"rest"}%' +execute $msg +assert_output_json_eq "{\"line\": \"$msg\"}" + +cleanup_tmp_files + diff --git a/tools/Makefile.am b/tools/Makefile.am new file mode 100644 index 0000000..90b7453 --- /dev/null +++ b/tools/Makefile.am @@ -0,0 +1,9 @@ +# Note: slsa is not yet functional with the v2 engine! +#bin_PROGRAMS = slsa +#slsa_SOURCES = slsa.c syntaxes.c ../src/parser.c +#slsa_CPPFLAGS = -I$(top_srcdir)/src $(JSON_C_CFLAGS) +#slsa_LDADD = $(JSON_C_LIBS) $(LIBLOGNORM_LIBS) $(LIBESTR_LIBS) +#slsa_DEPENDENCIES = ../src/liblognorm.la + +EXTRA_DIST=logrecord.h +#include_HEADERS= diff --git a/tools/Makefile.in b/tools/Makefile.in new file mode 100644 index 0000000..bf5a56d --- /dev/null +++ b/tools/Makefile.in @@ -0,0 +1,462 @@ +# Makefile.in generated by automake 1.15 from Makefile.am. +# @configure_input@ + +# Copyright (C) 1994-2014 Free Software Foundation, Inc. + +# This Makefile.in is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY, to the extent permitted by law; without +# even the implied warranty of MERCHANTABILITY or FITNESS FOR A +# PARTICULAR PURPOSE. + +@SET_MAKE@ + +# Note: slsa is not yet functional with the v2 engine! +#bin_PROGRAMS = slsa +#slsa_SOURCES = slsa.c syntaxes.c ../src/parser.c +#slsa_CPPFLAGS = -I$(top_srcdir)/src $(JSON_C_CFLAGS) +#slsa_LDADD = $(JSON_C_LIBS) $(LIBLOGNORM_LIBS) $(LIBESTR_LIBS) +#slsa_DEPENDENCIES = ../src/liblognorm.la +VPATH = @srcdir@ +am__is_gnu_make = { \ + if test -z '$(MAKELEVEL)'; then \ + false; \ + elif test -n '$(MAKE_HOST)'; then \ + true; \ + elif test -n '$(MAKE_VERSION)' && test -n '$(CURDIR)'; then \ + true; \ + else \ + false; \ + fi; \ +} +am__make_running_with_option = \ + case $${target_option-} in \ + ?) 
;; \ + *) echo "am__make_running_with_option: internal error: invalid" \ + "target option '$${target_option-}' specified" >&2; \ + exit 1;; \ + esac; \ + has_opt=no; \ + sane_makeflags=$$MAKEFLAGS; \ + if $(am__is_gnu_make); then \ + sane_makeflags=$$MFLAGS; \ + else \ + case $$MAKEFLAGS in \ + *\\[\ \ ]*) \ + bs=\\; \ + sane_makeflags=`printf '%s\n' "$$MAKEFLAGS" \ + | sed "s/$$bs$$bs[$$bs $$bs ]*//g"`;; \ + esac; \ + fi; \ + skip_next=no; \ + strip_trailopt () \ + { \ + flg=`printf '%s\n' "$$flg" | sed "s/$$1.*$$//"`; \ + }; \ + for flg in $$sane_makeflags; do \ + test $$skip_next = yes && { skip_next=no; continue; }; \ + case $$flg in \ + *=*|--*) continue;; \ + -*I) strip_trailopt 'I'; skip_next=yes;; \ + -*I?*) strip_trailopt 'I';; \ + -*O) strip_trailopt 'O'; skip_next=yes;; \ + -*O?*) strip_trailopt 'O';; \ + -*l) strip_trailopt 'l'; skip_next=yes;; \ + -*l?*) strip_trailopt 'l';; \ + -[dEDm]) skip_next=yes;; \ + -[JT]) skip_next=yes;; \ + esac; \ + case $$flg in \ + *$$target_option*) has_opt=yes; break;; \ + esac; \ + done; \ + test $$has_opt = yes +am__make_dryrun = (target_option=n; $(am__make_running_with_option)) +am__make_keepgoing = (target_option=k; $(am__make_running_with_option)) +pkgdatadir = $(datadir)/@PACKAGE@ +pkgincludedir = $(includedir)/@PACKAGE@ +pkglibdir = $(libdir)/@PACKAGE@ +pkglibexecdir = $(libexecdir)/@PACKAGE@ +am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd +install_sh_DATA = $(install_sh) -c -m 644 +install_sh_PROGRAM = $(install_sh) -c +install_sh_SCRIPT = $(install_sh) -c +INSTALL_HEADER = $(INSTALL_DATA) +transform = $(program_transform_name) +NORMAL_INSTALL = : +PRE_INSTALL = : +POST_INSTALL = : +NORMAL_UNINSTALL = : +PRE_UNINSTALL = : +POST_UNINSTALL = : +build_triplet = @build@ +host_triplet = @host@ +subdir = tools +ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 +am__aclocal_m4_deps = $(top_srcdir)/m4/libtool.m4 \ + $(top_srcdir)/m4/ltoptions.m4 $(top_srcdir)/m4/ltsugar.m4 \ + $(top_srcdir)/m4/ltversion.m4 $(top_srcdir)/m4/lt~obsolete.m4 \ + $(top_srcdir)/m4/pkg.m4 $(top_srcdir)/configure.ac +am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \ + $(ACLOCAL_M4) +DIST_COMMON = $(srcdir)/Makefile.am $(am__DIST_COMMON) +mkinstalldirs = $(install_sh) -d +CONFIG_HEADER = $(top_builddir)/config.h +CONFIG_CLEAN_FILES = +CONFIG_CLEAN_VPATH_FILES = +AM_V_P = $(am__v_P_@AM_V@) +am__v_P_ = $(am__v_P_@AM_DEFAULT_V@) +am__v_P_0 = false +am__v_P_1 = : +AM_V_GEN = $(am__v_GEN_@AM_V@) +am__v_GEN_ = $(am__v_GEN_@AM_DEFAULT_V@) +am__v_GEN_0 = @echo " GEN " $@; +am__v_GEN_1 = +AM_V_at = $(am__v_at_@AM_V@) +am__v_at_ = $(am__v_at_@AM_DEFAULT_V@) +am__v_at_0 = @ +am__v_at_1 = +SOURCES = +DIST_SOURCES = +am__can_run_installinfo = \ + case $$AM_UPDATE_INFO_DIR in \ + n|no|NO) false;; \ + *) (install-info --version) >/dev/null 2>&1;; \ + esac +am__tagged_files = $(HEADERS) $(SOURCES) $(TAGS_FILES) $(LISP) +am__DIST_COMMON = $(srcdir)/Makefile.in +DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST) +ACLOCAL = @ACLOCAL@ +AMTAR = @AMTAR@ +AM_DEFAULT_VERBOSITY = @AM_DEFAULT_VERBOSITY@ +AR = @AR@ +AUTOCONF = @AUTOCONF@ +AUTOHEADER = @AUTOHEADER@ +AUTOMAKE = @AUTOMAKE@ +AWK = @AWK@ +CC = @CC@ +CCDEPMODE = @CCDEPMODE@ +CFLAGS = @CFLAGS@ +CPP = @CPP@ +CPPFLAGS = @CPPFLAGS@ +CYGPATH_W = @CYGPATH_W@ +DEFS = @DEFS@ +DEPDIR = @DEPDIR@ +DLLTOOL = @DLLTOOL@ +DSYMUTIL = @DSYMUTIL@ +DUMPBIN = @DUMPBIN@ +ECHO_C = @ECHO_C@ +ECHO_N = @ECHO_N@ +ECHO_T = @ECHO_T@ +EGREP = @EGREP@ +EXEEXT = @EXEEXT@ +FEATURE_REGEXP = @FEATURE_REGEXP@ +FGREP = @FGREP@ 
+GREP = @GREP@ +INSTALL = @INSTALL@ +INSTALL_DATA = @INSTALL_DATA@ +INSTALL_PROGRAM = @INSTALL_PROGRAM@ +INSTALL_SCRIPT = @INSTALL_SCRIPT@ +INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@ +JSON_C_CFLAGS = @JSON_C_CFLAGS@ +JSON_C_LIBS = @JSON_C_LIBS@ +LD = @LD@ +LDFLAGS = @LDFLAGS@ +LIBESTR_CFLAGS = @LIBESTR_CFLAGS@ +LIBESTR_LIBS = @LIBESTR_LIBS@ +LIBLOGNORM_CFLAGS = @LIBLOGNORM_CFLAGS@ +LIBLOGNORM_LIBS = @LIBLOGNORM_LIBS@ +LIBOBJS = @LIBOBJS@ +LIBS = @LIBS@ +LIBTOOL = @LIBTOOL@ +LIPO = @LIPO@ +LN_S = @LN_S@ +LTLIBOBJS = @LTLIBOBJS@ +LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAKEINFO = @MAKEINFO@ +MANIFEST_TOOL = @MANIFEST_TOOL@ +MKDIR_P = @MKDIR_P@ +NM = @NM@ +NMEDIT = @NMEDIT@ +OBJDUMP = @OBJDUMP@ +OBJEXT = @OBJEXT@ +OTOOL = @OTOOL@ +OTOOL64 = @OTOOL64@ +PACKAGE = @PACKAGE@ +PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@ +PACKAGE_NAME = @PACKAGE_NAME@ +PACKAGE_STRING = @PACKAGE_STRING@ +PACKAGE_TARNAME = @PACKAGE_TARNAME@ +PACKAGE_URL = @PACKAGE_URL@ +PACKAGE_VERSION = @PACKAGE_VERSION@ +PATH_SEPARATOR = @PATH_SEPARATOR@ +PCRE_CFLAGS = @PCRE_CFLAGS@ +PCRE_LIBS = @PCRE_LIBS@ +PKG_CONFIG = @PKG_CONFIG@ +PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@ +PKG_CONFIG_PATH = @PKG_CONFIG_PATH@ +RANLIB = @RANLIB@ +SED = @SED@ +SET_MAKE = @SET_MAKE@ +SHELL = @SHELL@ +SPHINXBUILD = @SPHINXBUILD@ +STRIP = @STRIP@ +VALGRIND = @VALGRIND@ +VERSION = @VERSION@ +WARN_CFLAGS = @WARN_CFLAGS@ +WARN_LDFLAGS = @WARN_LDFLAGS@ +WARN_SCANNERFLAGS = @WARN_SCANNERFLAGS@ +abs_builddir = @abs_builddir@ +abs_srcdir = @abs_srcdir@ +abs_top_builddir = @abs_top_builddir@ +abs_top_srcdir = @abs_top_srcdir@ +ac_ct_AR = @ac_ct_AR@ +ac_ct_CC = @ac_ct_CC@ +ac_ct_DUMPBIN = @ac_ct_DUMPBIN@ +am__include = @am__include@ +am__leading_dot = @am__leading_dot@ +am__quote = @am__quote@ +am__tar = @am__tar@ +am__untar = @am__untar@ +bindir = @bindir@ +build = @build@ +build_alias = @build_alias@ +build_cpu = @build_cpu@ +build_os = @build_os@ +build_vendor = @build_vendor@ +builddir = @builddir@ +datadir = @datadir@ +datarootdir = @datarootdir@ +docdir = @docdir@ +dvidir = @dvidir@ +exec_prefix = @exec_prefix@ +host = @host@ +host_alias = @host_alias@ +host_cpu = @host_cpu@ +host_os = @host_os@ +host_vendor = @host_vendor@ +htmldir = @htmldir@ +includedir = @includedir@ +infodir = @infodir@ +install_sh = @install_sh@ +libdir = @libdir@ +libexecdir = @libexecdir@ +localedir = @localedir@ +localstatedir = @localstatedir@ +mandir = @mandir@ +mkdir_p = @mkdir_p@ +oldincludedir = @oldincludedir@ +pdfdir = @pdfdir@ +pkg_config_libs_private = @pkg_config_libs_private@ +prefix = @prefix@ +program_transform_name = @program_transform_name@ +psdir = @psdir@ +runstatedir = @runstatedir@ +sbindir = @sbindir@ +sharedstatedir = @sharedstatedir@ +srcdir = @srcdir@ +sysconfdir = @sysconfdir@ +target_alias = @target_alias@ +top_build_prefix = @top_build_prefix@ +top_builddir = @top_builddir@ +top_srcdir = @top_srcdir@ +EXTRA_DIST = logrecord.h +all: all-am + +.SUFFIXES: +$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) + @for dep in $?; do \ + case '$(am__configure_deps)' in \ + *$$dep*) \ + ( cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh ) \ + && { if test -f $@; then exit 0; else break; fi; }; \ + exit 1;; \ + esac; \ + done; \ + echo ' cd $(top_srcdir) && $(AUTOMAKE) --gnu tools/Makefile'; \ + $(am__cd) $(top_srcdir) && \ + $(AUTOMAKE) --gnu tools/Makefile +Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status + @case '$?' 
in \ + *config.status*) \ + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \ + *) \ + echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \ + cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \ + esac; + +$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh + +$(top_srcdir)/configure: $(am__configure_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(ACLOCAL_M4): $(am__aclocal_m4_deps) + cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh +$(am__aclocal_m4_deps): + +mostlyclean-libtool: + -rm -f *.lo + +clean-libtool: + -rm -rf .libs _libs +tags TAGS: + +ctags CTAGS: + +cscope cscopelist: + + +distdir: $(DISTFILES) + @srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \ + list='$(DISTFILES)'; \ + dist_files=`for file in $$list; do echo $$file; done | \ + sed -e "s|^$$srcdirstrip/||;t" \ + -e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \ + case $$dist_files in \ + */*) $(MKDIR_P) `echo "$$dist_files" | \ + sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \ + sort -u` ;; \ + esac; \ + for file in $$dist_files; do \ + if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \ + if test -d $$d/$$file; then \ + dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \ + if test -d "$(distdir)/$$file"; then \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \ + cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \ + find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \ + fi; \ + cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \ + else \ + test -f "$(distdir)/$$file" \ + || cp -p $$d/$$file "$(distdir)/$$file" \ + || exit 1; \ + fi; \ + done +check-am: all-am +check: check-am +all-am: Makefile +installdirs: +install: install-am +install-exec: install-exec-am +install-data: install-data-am +uninstall: uninstall-am + +install-am: all-am + @$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am + +installcheck: installcheck-am +install-strip: + if test -z '$(STRIP)'; then \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + install; \ + else \ + $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \ + install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \ + "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \ + fi +mostlyclean-generic: + +clean-generic: + +distclean-generic: + -test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES) + -test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES) + +maintainer-clean-generic: + @echo "This command is intended for maintainers to use" + @echo "it deletes files that may require special tools to rebuild." 
+clean: clean-am + +clean-am: clean-generic clean-libtool mostlyclean-am + +distclean: distclean-am + -rm -f Makefile +distclean-am: clean-am distclean-generic + +dvi: dvi-am + +dvi-am: + +html: html-am + +html-am: + +info: info-am + +info-am: + +install-data-am: + +install-dvi: install-dvi-am + +install-dvi-am: + +install-exec-am: + +install-html: install-html-am + +install-html-am: + +install-info: install-info-am + +install-info-am: + +install-man: + +install-pdf: install-pdf-am + +install-pdf-am: + +install-ps: install-ps-am + +install-ps-am: + +installcheck-am: + +maintainer-clean: maintainer-clean-am + -rm -f Makefile +maintainer-clean-am: distclean-am maintainer-clean-generic + +mostlyclean: mostlyclean-am + +mostlyclean-am: mostlyclean-generic mostlyclean-libtool + +pdf: pdf-am + +pdf-am: + +ps: ps-am + +ps-am: + +uninstall-am: + +.MAKE: install-am install-strip + +.PHONY: all all-am check check-am clean clean-generic clean-libtool \ + cscopelist-am ctags-am distclean distclean-generic \ + distclean-libtool distdir dvi dvi-am html html-am info info-am \ + install install-am install-data install-data-am install-dvi \ + install-dvi-am install-exec install-exec-am install-html \ + install-html-am install-info install-info-am install-man \ + install-pdf install-pdf-am install-ps install-ps-am \ + install-strip installcheck installcheck-am installdirs \ + maintainer-clean maintainer-clean-generic mostlyclean \ + mostlyclean-generic mostlyclean-libtool pdf pdf-am ps ps-am \ + tags-am uninstall uninstall-am + +.PRECIOUS: Makefile + +#include_HEADERS= + +# Tell versions [3.59,3.63) of GNU make to not export all variables. +# Otherwise a system limit (for SysV at least) may be exceeded. +.NOEXPORT: diff --git a/tools/logrecord.h b/tools/logrecord.h new file mode 100644 index 0000000..5b608a5 --- /dev/null +++ b/tools/logrecord.h @@ -0,0 +1,47 @@ +/* The (in-memory) format of a log record. + * + * A log record is sequence of nodes of different syntaxes. A log + * record is described by a pointer to its root node. + * The most important node type is literal text, which is always + * assumed if no other syntax is detected. A full list of syntaxes + * can be found below. + * + * Copyright 2015 Rainer Gerhards + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +#ifndef LOGRECORD_H_INCLUDED +#define LOGRECORD_H_INCLUDED + +#include +/* log record node syntaxes + * This "enumeration" starts at 0 and increments for each new + * syntax. Note that we do not use an enum type so that we can + * streamline the in-memory representation. For large sets of + * log records to be held in main memory, this is important. 
+ */ +#define LRN_SYNTAX_LITERAL_TEXT 0 +#define LRN_SYNTAX_IPV4 1 +#define LRN_SYNTAX_INT_POSITIVE 2 +#define LRN_SYNTAX_DATE_RFC3164 3 + +struct logrec_node { + struct logrec_node *next; /* NULL: end of record */ + int8_t ntype; + union { + char *ltext; /* the literal text */ + int64_t number; /* all integer types */ + } val; +}; +typedef struct logrec_node logrecord_t; +#endif /* ifndef LOGRECORD_H_INCLUDED */
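
A minimal usage sketch for the logrecord_t node list defined in logrecord.h above. This is hypothetical illustration code, not part of the patch: the helper logrec_node_new and the demo values are assumptions made only for this example. It builds a two-node record (a literal-text node followed by a positive-integer node) and walks the list, dispatching on ntype. <stdint.h> is included explicitly because the struct relies on int8_t and int64_t.

/* hypothetical sketch, assuming only the definitions from logrecord.h */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>
#include "logrecord.h"

/* allocate a node of the given syntax type; the caller fills in val */
static logrecord_t *logrec_node_new(const int8_t ntype)
{
	logrecord_t *const node = calloc(1, sizeof(*node));
	if(node != NULL)
		node->ntype = ntype;
	return node;
}

int main(void)
{
	logrecord_t *lit = logrec_node_new(LRN_SYNTAX_LITERAL_TEXT);
	logrecord_t *num = logrec_node_new(LRN_SYNTAX_INT_POSITIVE);
	if(lit == NULL || num == NULL)
		return 1;
	lit->val.ltext = strdup("connection count ");
	num->val.number = 4711;
	lit->next = num;	/* num->next stays NULL: end of record */

	/* walk the record and render each node according to its syntax */
	for(logrecord_t *n = lit ; n != NULL ; n = n->next) {
		switch(n->ntype) {
		case LRN_SYNTAX_LITERAL_TEXT:
			printf("%s", n->val.ltext);
			break;
		case LRN_SYNTAX_INT_POSITIVE:
			printf("%lld", (long long) n->val.number);
			break;
		default:
			printf("<syntax %d>", (int) n->ntype);
			break;
		}
	}
	printf("\n");

	free(lit->val.ltext);
	free(num);
	free(lit);
	return 0;
}

Because ntype is a plain int8_t constant rather than an enum, large in-memory sets of log records stay compact, as the header's comment intends; the trade-off is that switches like the one above get no compiler-checked exhaustiveness.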