CMake build system for git #614

Closed · wants to merge 33 commits

Commits (33)
abb59f8
Merge branch 'hn/refs-cleanup' into jch
gitster Jun 9, 2020
4cea3bf
Merge branch 'jk/diff-memuse-optim-with-stat-unmatch' into next
gitster Jun 10, 2020
5a72620
Merge branch 'tb/t5318-cleanup' into next
gitster Jun 10, 2020
2901ff9
Merge branch 'js/reflog-anonymize-for-clone-and-fetch' into next
gitster Jun 10, 2020
697a10b
Merge branch 'en/sparse-checkout' into next
gitster Jun 10, 2020
167a629
Merge branch 'js/msvc-build-fix' into next
gitster Jun 10, 2020
2e63c8c
Merge branch 'en/do-match-pathspec-fix' into next
gitster Jun 10, 2020
92944ed
Merge branch 'js/fuzz-commit-graph-leakfix' into next
gitster Jun 10, 2020
60144a5
Merge branch 'dl/t-readme-spell-git-correctly' into next
gitster Jun 10, 2020
ad3269f
Merge branch 'dl/python-2.7-is-the-floor-version' into next
gitster Jun 10, 2020
c72c7da
Merge branch 'es/advertise-contribution-doc' into next
gitster Jun 10, 2020
ee63b70
Merge branch 'cc/upload-pack-data-2' into next
gitster Jun 12, 2020
912c337
Merge branch 'jt/redact-all-cookies' into next
gitster Jun 12, 2020
5f7c822
Merge branch 'es/worktree-duplicate-paths' into next
gitster Jun 12, 2020
bcb7351
Sync with master
gitster Jun 12, 2020
3521be3
Merge branch 'en/sparse-with-submodule-doc' into next
gitster Jun 18, 2020
5b31140
Merge branch 'jk/complete-git-switch' into next
gitster Jun 18, 2020
2c4ec99
Merge branch 'en/clean-cleanups' into next
gitster Jun 18, 2020
e0b54a0
Merge branch 'ct/diff-with-merge-base-clarification' into next
gitster Jun 18, 2020
393eff5
Merge branch 'cc/upload-pack-data-3' into next
gitster Jun 18, 2020
f2d1464
Sync with master
gitster Jun 18, 2020
76e1513
Merge branch 'dl/branch-cleanup' into next
gitster Jun 18, 2020
86b34a3
Merge branch 'ds/merge-base-is-ancestor-optim' into next
gitster Jun 18, 2020
8880b35
Merge branch 'ss/submodule-set-branch-in-c' into next
gitster Jun 18, 2020
e8ba1cc
Merge branch 'jt/cdn-offload' into next
gitster Jun 18, 2020
07ae9f0
Introduce CMake support for configuring Git
SibiSiddharthan Apr 18, 2020
74358b3
cmake: generate the shell/perl/python scripts and templates, translat…
SibiSiddharthan Apr 18, 2020
a60e747
cmake: installation support for git
SibiSiddharthan Apr 18, 2020
fedda55
cmake: support for testing git with ctest
SibiSiddharthan Apr 18, 2020
398a578
cmake: support for testing git when building out of the source tree
SibiSiddharthan Apr 18, 2020
10acdbf
cmake: support for building git on windows with mingw
SibiSiddharthan Apr 18, 2020
0b92154
cmake: support for building git on windows with msvc and clang.
SibiSiddharthan Apr 18, 2020
3cdefab
ci: modification of main.yml to use cmake for vs-build job
SibiSiddharthan May 25, 2020
39 changes: 24 additions & 15 deletions .github/workflows/main.yml
@@ -145,13 +145,6 @@ jobs:
## Unzip and remove the artifact
On the Git mailing list, Đoàn Trần Công Danh wrote (reply to this):

On 2020-05-29 13:40:24+0000, Sibi Siddharthan via GitGitGadget <[email protected]> wrote:
> To check for ICONV_OMITS_BOM libiconv.dll needs to be in the working
> directory of script or path. So we copy the dlls before we configure.

If ICONV_OMITS_BOM is such a troublemaker for CMake,
I'm fine with not supporting it at all.

It seems like no one except me has interest in ICONV_OMITS_BOM.

> @@ -302,4 +308,4 @@ jobs:
>      steps:
>      - uses: actions/checkout@v1
>      - run: ci/install-dependencies.sh
> -    - run: ci/test-documentation.sh
> +    - run: ci/test-documentation.sh
> \ No newline at end of file

Please fix your editor ;)

-- 
Danh

On the Git mailing list, Sibi Siddharthan wrote (reply to this):

On Sat, May 30, 2020 at 7:44 PM Đoàn Trần Công Danh
<[email protected]> wrote:
>
> On 2020-05-29 13:40:24+0000, Sibi Siddharthan via GitGitGadget <[email protected]> wrote:
> > To check for ICONV_OMITS_BOM libiconv.dll needs to be in the working
> > directory of script or path. So we copy the dlls before we configure.
>
> If ICONV_OMITS_BOM is such a troublemaker for CMake,
> I'm fine with not supporting it at all.
>
> It seems like no one except me has interest in ICONV_OMITS_BOM.
>

It is not a problem to support this check. The check has to be
implemented sometime down the road (as it is specified in the
Makefile).
The issue currently is that this check is a bit big (~50 LOC)
including setup and cleanup, which might be a burden for the
reviewers, since the main reason for considering CMake support is
to help Windows developers.

> > @@ -302,4 +308,4 @@ jobs:
> >      steps:
> >      - uses: actions/checkout@v1
> >      - run: ci/install-dependencies.sh
> > -    - run: ci/test-documentation.sh
> > +    - run: ci/test-documentation.sh
> > \ No newline at end of file
>
> Please fix your editor ;)
>
> --
> Danh

unzip artifacts.zip
rm artifacts.zip
- name: generate Visual Studio solution
shell: powershell
run: |
& .\git-sdk-64-minimal\usr\bin\bash.exe -lc @"
make NDEBUG=1 DEVELOPER=1 vcxproj
"@
if (!$?) { exit(1) }
- name: download vcpkg artifacts
shell: powershell
run: |
@@ -163,6 +156,17 @@ jobs:
Remove-Item compat.zip
- name: add msbuild to PATH
uses: microsoft/[email protected]
- name: copy dlls to root
shell: powershell
run: |
& compat\vcbuild\vcpkg_copy_dlls.bat release
if (!$?) { exit(1) }
- name: generate Visual Studio solution
shell: bash
run: |
cmake `pwd`/contrib/buildsystems/ -DCMAKE_PREFIX_PATH=`pwd`/compat/vcbuild/vcpkg/installed/x64-windows \
-DIconv_LIBRARY=`pwd`/compat/vcbuild/vcpkg/installed/x64-windows/lib/libiconv.lib -DIconv_INCLUDE_DIR=`pwd`/compat/vcbuild/vcpkg/installed/x64-windows/include \
-DMSGFMT_EXE=`pwd`/git-sdk-64-minimal/mingw64/bin/msgfmt.exe -DPERL_TESTS=OFF -DPYTHON_TESTS=OFF -DCURL_NO_CURL_CMAKE=ON
- name: MSBuild
run: msbuild git.sln -property:Configuration=Release -property:Platform=x64 -maxCpuCount:4 -property:PlatformToolset=v142
- name: bundle artifact tar
@@ -171,8 +175,6 @@ jobs:
MSVC: 1
VCPKG_ROOT: ${{github.workspace}}\compat\vcbuild\vcpkg
run: |
& compat\vcbuild\vcpkg_copy_dlls.bat release
if (!$?) { exit(1) }
& git-sdk-64-minimal\usr\bin\bash.exe -lc @"
mkdir -p artifacts &&
eval \"`$(make -n artifacts-tar INCLUDE_DLLS_IN_ARTIFACTS=YesPlease ARTIFACTS_DIRECTORY=artifacts 2>&1 | grep ^tar)\"
@@ -203,7 +205,7 @@ jobs:
- name: extract build artifacts
shell: bash
run: tar xf artifacts.tar.gz
- name: test (parallel)
- name: test
shell: powershell
env:
MSYSTEM: MINGW64
@@ -214,12 +216,19 @@ jobs:
# Let Git ignore the SDK and the test-cache
printf '%s\n' /git-sdk-64-minimal/ /test-cache/ >>.git/info/exclude

cd t &&
PATH=\"`$PWD/helper:`$PATH\" &&
test-tool.exe run-command testsuite --jobs=10 -V -x --write-junit-xml \
`$(test-tool.exe path-utils slice-tests \
${{matrix.nr}} 10 t[0-9]*.sh)
ci/run-test-slice.sh ${{matrix.nr}} 10
"@
- name: ci/print-test-failures.sh
if: failure()
shell: powershell
run: |
& .\git-sdk-64-minimal\usr\bin\bash.exe -lc ci/print-test-failures.sh
- name: Upload failed tests' directories
if: failure() && env.FAILED_TEST_ARTIFACTS != ''
uses: actions/upload-artifact@v1
with:
name: failed-tests-windows
path: ${{env.FAILED_TEST_ARTIFACTS}}
regular:
needs: ci-config
if: needs.ci-config.outputs.enabled == 'yes'
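For reference, a minimal local invocation of the CMake build this series adds — a sketch only; the Windows-specific `-D` flags from the CI step above (vcpkg paths, `Iconv_*`, `MSGFMT_EXE`) are omitted here and would be needed on a vcpkg-based setup:

```sh
# From the top of the git worktree, point CMake at the new
# contrib/buildsystems/CMakeLists.txt and build in place, as the CI job does.
cmake "$(pwd)/contrib/buildsystems/"
cmake --build .

# The series also registers the test suite with CTest.
ctest -j4
```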
20 changes: 16 additions & 4 deletions Documentation/git-diff.txt
@@ -11,15 +11,17 @@ SYNOPSIS
[verse]
'git diff' [<options>] [<commit>] [--] [<path>...]
'git diff' [<options>] --cached [<commit>] [--] [<path>...]
'git diff' [<options>] <commit> <commit> [--] [<path>...]
'git diff' [<options>] <commit> [<commit>...] <commit> [--] [<path>...]
'git diff' [<options>] <commit>...<commit> [--] [<path>...]
'git diff' [<options>] <blob> <blob>
'git diff' [<options>] --no-index [--] <path> <path>

DESCRIPTION
-----------
Show changes between the working tree and the index or a tree, changes
between the index and a tree, changes between two trees, changes between
two blob objects, or changes between two files on disk.
between the index and a tree, changes between two trees, changes resulting
from a merge, changes between two blob objects, or changes between two
files on disk.

'git diff' [<options>] [--] [<path>...]::

@@ -67,6 +69,15 @@ two blob objects, or changes between two files on disk.
one side is omitted, it will have the same effect as
using HEAD instead.

'git diff' [<options>] <commit> [<commit>...] <commit> [--] [<path>...]::

This form is to view the results of a merge commit. The first
listed <commit> must be the merge itself; the remaining two or
more commits should be its parents. A convenient way to produce
the desired set of revisions is to use the {caret}@ suffix.
For instance, if `master` names a merge commit, `git diff master
master^@` gives the same combined diff as `git show master`.

'git diff' [<options>] <commit>\...<commit> [--] [<path>...]::

This form is to view the changes on the branch containing
@@ -196,7 +207,8 @@ linkgit:git-difftool[1],
linkgit:git-log[1],
linkgit:gitdiffcore[7],
linkgit:git-format-patch[1],
linkgit:git-apply[1]
linkgit:git-apply[1],
linkgit:git-show[1]

GIT
---
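A short usage sketch of the new merge form documented above (assuming `master` currently names a merge commit):

```sh
# Combined diff of a merge: list the merge commit first, then its parents.
git diff master master^1 master^2

# The ^@ suffix names all parents of master, giving the same combined
# diff as `git show master`.
git diff master master^@
```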
9 changes: 8 additions & 1 deletion Documentation/git-http-fetch.txt
@@ -9,7 +9,7 @@ git-http-fetch - Download from a remote Git repository via HTTP
SYNOPSIS
--------
[verse]
'git http-fetch' [-c] [-t] [-a] [-d] [-v] [-w filename] [--recover] [--stdin] <commit> <url>
'git http-fetch' [-c] [-t] [-a] [-d] [-v] [-w filename] [--recover] [--stdin | --packfile=<hash> | <commit>] <url>

DESCRIPTION
-----------
@@ -40,6 +40,13 @@ commit-id::

<commit-id>['\t'<filename-as-in--w>]

--packfile=<hash>::
Instead of a commit id on the command line (which is not expected in
this case), 'git http-fetch' fetches the packfile directly at the given
URL and uses index-pack to generate corresponding .idx and .keep files.
The hash is used to determine the name of the temporary file and is
arbitrary. The output of index-pack is printed to stdout.

--recover::
Verify that everything reachable from target is fetched. Used after
an earlier fetch is interrupted.
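A hedged sketch of the new `--packfile` mode documented above; the URL is a placeholder, and the command is plumbing normally driven by `git fetch` rather than run by hand:

```sh
# Run inside an existing repository.  <hash> (40 hex digits) only names
# the temporary file and is otherwise arbitrary, per the text above;
# index-pack output goes to stdout.
git http-fetch --packfile=<hash> https://example.com/some.pack
```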
30 changes: 26 additions & 4 deletions Documentation/git-sparse-checkout.txt
@@ -200,10 +200,32 @@ directory.
SUBMODULES
----------

If your repository contains one or more submodules, then those submodules will
appear based on which you initialized with the `git submodule` command. If
your sparse-checkout patterns exclude an initialized submodule, then that
submodule will still appear in your working directory.
If your repository contains one or more submodules, then submodules
are populated based on interactions with the `git submodule` command.
Specifically, `git submodule init -- <path>` will ensure the submodule
at `<path>` is present, while `git submodule deinit [-f] -- <path>`
will remove the files for the submodule at `<path>` (including any
untracked files, uncommitted changes, and unpushed history). Similar
to how sparse-checkout removes files from the working tree but still
leaves entries in the index, deinitialized submodules are removed from
the working directory but still have an entry in the index.

Since submodules may have unpushed changes or untracked files,
removing them could result in data loss. Thus, changing sparse
inclusion/exclusion rules will not cause an already checked out
submodule to be removed from the working copy. Said another way, just
as `checkout` will not cause submodules to be automatically removed or
initialized even when switching between branches that remove or add
submodules, using `sparse-checkout` to reduce or expand the scope of
"interesting" files will not cause submodules to be automatically
deinitialized or initialized either.

Further, the above facts mean that there are multiple reasons that
"tracked" files might not be present in the working copy: sparsity
pattern application from sparse-checkout, and submodule initialization
state. Thus, commands like `git grep` that work on tracked files in
the working copy may return results that are limited by either or both
of these restrictions.


SEE ALSO
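A brief shell sketch of the interaction described above; the submodule path is hypothetical:

```sh
# Changing sparse patterns never (de)initializes a submodule; that is
# always an explicit `git submodule` operation.
git sparse-checkout set src/
git submodule init -- third_party/lib        # make the submodule present (see above)
git submodule deinit -f -- third_party/lib   # drop its files, keep its index entry
```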
4 changes: 3 additions & 1 deletion Documentation/git-worktree.txt
@@ -126,7 +126,9 @@ OPTIONS
locked working tree path, specify `--force` twice.
+
`move` refuses to move a locked working tree unless `--force` is specified
twice.
twice. If the destination is already assigned to some other working tree but is
missing (for instance, if `<new-path>` was deleted manually), then `--force`
allows the move to proceed; use `--force` twice if the destination is locked.
+
`remove` refuses to remove an unclean working tree unless `--force` is used.
To remove a locked working tree, specify `--force` twice.
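For illustration, a hedged sketch of the `move` case described above; both paths are hypothetical:

```sh
# ../old-spot is registered to some other working tree but its directory
# was deleted manually, so a single --force lets the move proceed; a
# locked destination would require --force twice.
git worktree move ../feature ../old-spot --force
```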
9 changes: 4 additions & 5 deletions Documentation/git.txt
@@ -775,11 +775,10 @@ for full details.
See `GIT_TRACE2` for available trace output options and
link:technical/api-trace2.html[Trace2 documentation] for full details.

`GIT_REDACT_COOKIES`::
This can be set to a comma-separated list of strings. When a curl trace
is enabled (see `GIT_TRACE_CURL` above), whenever a "Cookies:" header
sent by the client is dumped, values of cookies whose key is in that
list (case-sensitive) are redacted.
`GIT_TRACE_REDACT`::
By default, when tracing is activated, Git redacts the values of
cookies, the "Authorization:" header, and the "Proxy-Authorization:"
header. Set this variable to `0` to prevent this redaction.

`GIT_LITERAL_PATHSPECS`::
Setting this variable to `1` will cause Git to treat all
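A quick usage sketch of the renamed knob documented above:

```sh
# With redaction disabled, cookie values and the Authorization /
# Proxy-Authorization headers appear verbatim in the curl trace.
GIT_TRACE_REDACT=0 GIT_TRACE_CURL=/tmp/git-curl.log git fetch origin
```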
78 changes: 78 additions & 0 deletions Documentation/technical/packfile-uri.txt
@@ -0,0 +1,78 @@
Packfile URIs
=============

This feature allows servers to serve part of their packfile response as URIs.
This allows server designs that improve scalability in bandwidth and CPU usage
(for example, by serving some data through a CDN), and (in the future) provides
some measure of resumability to clients.

This feature is available only in protocol version 2.

Protocol
--------

The server advertises the `packfile-uris` capability.

If the client then communicates which protocols (HTTPS, etc.) it supports with
a `packfile-uris` argument, the server MAY send a `packfile-uris` section
directly before the `packfile` section (right after `wanted-refs` if it is
sent) containing URIs of any of the given protocols. The URIs point to
packfiles that use only features that the client has declared that it supports
(e.g. ofs-delta and thin-pack). See protocol-v2.txt for the documentation of
this section.

Clients should then download and index all the given URIs (in addition to
downloading and indexing the packfile given in the `packfile` section of the
response) before performing the connectivity check.

Server design
-------------

The server can be trivially made compatible with the proposed protocol by
having it advertise `packfile-uris`, tolerating the client sending
`packfile-uris`, and never sending any `packfile-uris` section. But we should
include some sort of non-trivial implementation in the Minimum Viable Product,
at least so that we can test the client.

This is the implementation: a feature, marked experimental, that allows the
server to be configured by one or more `uploadpack.blobPackfileUri=<sha1>
<uri>` entries. Whenever the list of objects to be sent is assembled, all such
blobs are excluded, replaced with URIs. The client will download those URIs,
expecting them to each point to packfiles containing single blobs.

Client design
-------------

The client has a config variable `fetch.uriprotocols` that determines which
protocols the end user is willing to use. By default, this is empty.

When the client downloads the given URIs, it should store them with "keep"
files, just like it does with the packfile in the `packfile` section. These
additional "keep" files can only be removed after the refs have been updated -
just like the "keep" file for the packfile in the `packfile` section.

The division of work (initial fetch + additional URIs) introduces convenient
points for resumption of an interrupted clone - such resumption can be done
after the Minimum Viable Product (see "Future work").

Future work
-----------

The protocol design allows some evolution of the server and client without any
need for protocol changes, so only a small-scoped design is included here to
form the MVP. For example, the following can be done:

* On the server, more sophisticated means of excluding objects (e.g. by
specifying a commit to represent that commit and all objects that it
references).
* On the client, resumption of clone. If a clone is interrupted, information
could be recorded in the repository's config and a "clone-resume" command
can resume the clone in progress. (Resumption of subsequent fetches is more
difficult because that must deal with the user wanting to use the repository
even after the fetch was interrupted.)

There are some possible features that will require a change in protocol:

* Additional HTTP headers (e.g. authentication)
* Byte range support
* Different file formats referenced by URIs (e.g. raw object)
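A hedged configuration sketch based on the knobs named above; the blob id and CDN URL are placeholders:

```sh
# Server side: exclude one blob from generated packs and point clients at
# a packfile containing it, using the "<sha1> <uri>" form described above.
git config --add uploadpack.blobPackfileUri \
    "<blob-sha1> https://cdn.example.com/one-blob.pack"

# Client side: opt in to https packfile URIs (off by default).
git config fetch.uriprotocols https
```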
48 changes: 38 additions & 10 deletions Documentation/technical/protocol-v2.txt
@@ -325,13 +325,26 @@ included in the client's request:
indicating its sideband (1, 2, or 3), and the server may send "0005\2"
(a PKT-LINE of sideband 2 with no payload) as a keepalive packet.

If the 'packfile-uris' feature is advertised, the following argument
can be included in the client's request as well as the potential
addition of the 'packfile-uris' section in the server's response as
explained below.

packfile-uris <comma-separated list of protocols>
Indicates to the server that the client is willing to receive
URIs of any of the given protocols in place of objects in the
sent packfile. Before performing the connectivity check, the
client should download from all given URIs. Currently, the
protocols supported are "http" and "https".

The response of `fetch` is broken into a number of sections separated by
delimiter packets (0001), with each section beginning with its section
header.
header. Most sections are sent only when the packfile is sent.

output = *section
section = (acknowledgments | shallow-info | wanted-refs | packfile)
(flush-pkt | delim-pkt)
output = acknowledgments flush-pkt |
[acknowledgments delim-pkt] [shallow-info delim-pkt]
[wanted-refs delim-pkt] [packfile-uris delim-pkt]
packfile flush-pkt

acknowledgments = PKT-LINE("acknowledgments" LF)
(nak | *ack)
@@ -349,13 +362,17 @@ header.
*PKT-LINE(wanted-ref LF)
wanted-ref = obj-id SP refname

packfile-uris = PKT-LINE("packfile-uris" LF) *packfile-uri
packfile-uri = PKT-LINE(40*(HEXDIGIT) SP *%x20-ff LF)

packfile = PKT-LINE("packfile" LF)
*PKT-LINE(%x01-03 *%x00-ff)

acknowledgments section
* If the client determines that it is finished with negotiations
by sending a "done" line, the acknowledgments sections MUST be
omitted from the server's response.
* If the client determines that it is finished with negotiations by
sending a "done" line (thus requiring the server to send a packfile),
the acknowledgments sections MUST be omitted from the server's
response.

* Always begins with the section header "acknowledgments"

@@ -406,9 +423,6 @@ header.
which the client has not indicated was shallow as a part of
its request.

* This section is only included if a packfile section is also
included in the response.

wanted-refs section
* This section is only included if the client has requested a
ref using a 'want-ref' line and if a packfile section is also
@@ -422,6 +436,20 @@ header.
* The server MUST NOT send any refs which were not requested
using 'want-ref' lines.

packfile-uris section
* This section is only included if the client sent
'packfile-uris' and the server has at least one such URI to
send.

* Always begins with the section header "packfile-uris".

* For each URI the server sends, it sends a hash of the pack's
contents (as output by git index-pack) followed by the URI.

* The hashes are 40 hex characters long. When Git upgrades to a new
hash algorithm, this might need to be updated. (It should match
whatever index-pack outputs after "pack\t" or "keep\t".)

packfile section
* This section is only included if the client has sent 'want'
lines in its request and either requested that no more
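For those curious how the new section looks on the wire, a debugging sketch; the remote is a placeholder and must actually advertise and send `packfile-uris` for the section to appear:

```sh
# Dump pkt-line traffic so the packfile-uris section (one
# "<40-hex-hash> SP <uri>" line per URI, as specified above) can be inspected.
GIT_TRACE_PACKET=/tmp/packets.log \
git -c protocol.version=2 -c fetch.uriprotocols=https \
    clone https://example.com/repo.git
```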
2 changes: 1 addition & 1 deletion builtin/branch.c
@@ -693,7 +693,7 @@ int cmd_branch(int argc, const char **argv, const char *prefix)
list = 1;

if (!!delete + !!rename + !!copy + !!new_upstream + !!show_current +
list + unset_upstream > 1)
list + edit_description + unset_upstream > 1)
usage_with_options(builtin_branch_usage, options);

if (filter.abbrev == -1)
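The one-line change above adds `--edit-description` to the mutual-exclusion count for branch modes; a hedged illustration:

```sh
# Combining --edit-description with another exclusive mode now trips the
# usage check above instead of slipping past it.
git branch --edit-description --unset-upstream   # errors with usage text
git branch --edit-description                    # still fine on its own
```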