## Migration Guide from v2 -> v3
Version 3 adds several new, frequently requested features. To do so, it introduces a few breaking changes. We've worked to keep these as minimal as possible. This guide explains the breaking changes and how you can quickly update your code.
### `Token.Claims` is now an interface type
The most requested feature from the 2.0 version of this library was the ability to provide a custom type to the JSON parser for claims. This was implemented by introducing a new interface, `Claims`, to replace `map[string]interface{}`. We also included two concrete implementations of `Claims`: `MapClaims` and `StandardClaims`.
`MapClaims` is an alias for `map[string]interface{}` with built-in validation behavior. It is the default claims type when using `Parse`. The usage is unchanged except you must type-assert the claims property.
The old example for parsing a token looked like this...
```go
if token, err := jwt.Parse(tokenString, keyLookupFunc); err == nil {
fmt.Printf("Token for user %v expires %v", token.Claims["user"], token.Claims["exp"])
}
```
is now directly mapped to...
```go
if token, err := jwt.Parse(tokenString, keyLookupFunc); err == nil {
    claims := token.Claims.(jwt.MapClaims)
    fmt.Printf("Token for user %v expires %v", claims["user"], claims["exp"])
}
```
`StandardClaims` is designed to be embedded in your custom type. You can supply a custom claims type with the new `ParseWithClaims` function. Here's an example of using a custom claims type.
```go
type MyCustomClaims struct {
    User string
    *jwt.StandardClaims
}

if token, err := jwt.ParseWithClaims(tokenString, &MyCustomClaims{}, keyLookupFunc); err == nil {
    claims := token.Claims.(*MyCustomClaims)
    fmt.Printf("Token for user %v expires %v", claims.User, claims.StandardClaims.ExpiresAt)
}
```
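The same custom claims type can also be used when issuing tokens. The following is a minimal sketch (not from the original guide) using `NewWithClaims`; the user name, expiry, and HMAC secret are placeholder values you would replace with your own.
```go
claims := MyCustomClaims{
    User: "example-user", // placeholder user name
    StandardClaims: &jwt.StandardClaims{
        // placeholder expiry: one hour from now
        ExpiresAt: time.Now().Add(time.Hour).Unix(),
    },
}
token := jwt.NewWithClaims(jwt.SigningMethodHS256, claims)
// "my-secret" is a placeholder HMAC key; keep real keys out of source code.
signedString, err := token.SignedString([]byte("my-secret"))
```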
### `ParseFromRequest` has been moved
To keep this library focused on the tokens without becoming overburdened with complex request processing logic, `ParseFromRequest` and its new companion `ParseFromRequestWithClaims` have been moved to a subpackage, `request`. The method signatures have also been augmented to receive a new argument: `Extractor`.
`Extractors` do the work of picking the token string out of a request. The interface is simple and composable.
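For reference, the `Extractor` interface in the `request` subpackage is roughly the following single-method interface (a sketch; see the package source for the authoritative definition):
```go
// An Extractor retrieves a raw token string from an *http.Request.
type Extractor interface {
    ExtractToken(req *http.Request) (string, error)
}
```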
This simple parsing example:
```go
if token, err := jwt.ParseFromRequest(req, keyLookupFunc); err == nil {
    fmt.Printf("Token for user %v expires %v", token.Claims["user"], token.Claims["exp"])
}
```
is directly mapped to:
```go
if token, err := request.ParseFromRequest(req, request.OAuth2Extractor, keyLookupFunc); err == nil {
    claims := token.Claims.(jwt.MapClaims)
    fmt.Printf("Token for user %v expires %v", claims["user"], claims["exp"])
}
```
There are several concrete `Extractor` types provided for your convenience:
* `HeaderExtractor` will search a list of headers until one contains content.
* `ArgumentExtractor` will search a list of keys in request query and form arguments until one contains content.
* `MultiExtractor` will try a list of `Extractors` in order until one returns content.
* `AuthorizationHeaderExtractor` will look in the `Authorization` header for a `Bearer` token.
* `OAuth2Extractor` searches the places an OAuth2 token would be specified (per the spec): the `Authorization` header and the `access_token` argument.
* `PostExtractionFilter` wraps an `Extractor`, allowing you to process the content before it's parsed. A simple example is stripping the `Bearer ` prefix from a header (see the sketch after this list).
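To illustrate how these compose, here is a hedged sketch combining a header and a query/form argument; the header name `X-Api-Token` and argument name `token` are made-up placeholders, not names defined by the library:
```go
// Try a custom header first, then fall back to a "token" argument.
myExtractor := request.MultiExtractor{
    request.HeaderExtractor{"X-Api-Token"}, // placeholder header name
    request.ArgumentExtractor{"token"},     // placeholder argument name
}

if token, err := request.ParseFromRequest(req, myExtractor, keyLookupFunc); err == nil {
    claims := token.Claims.(jwt.MapClaims)
    fmt.Printf("Token for user %v expires %v", claims["user"], claims["exp"])
}
```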
### RSA signing methods no longer accept `[]byte` keys
Due to a [critical vulnerability](https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/), we've decided the convenience of accepting `[]byte` instead of `rsa.PublicKey` or `rsa.PrivateKey` isn't worth the risk of misuse.
To replace this behavior, we've added two helper methods: `ParseRSAPrivateKeyFromPEM(key []byte) (*rsa.PrivateKey, error)` and `ParseRSAPublicKeyFromPEM(key []byte) (*rsa.PublicKey, error)`. These are simple helpers for unpacking PEM-encoded PKCS1 and PKCS8 keys. If your keys are encoded any other way, all you need to do is convert them to the `crypto/rsa` package's types.
```go
func keyLookupFunc(token *jwt.Token) (interface{}, error) {
    // Don't forget to validate the alg is what you expect:
    if _, ok := token.Method.(*jwt.SigningMethodRSA); !ok {
        return nil, fmt.Errorf("unexpected signing method: %v", token.Header["alg"])
    }

    // Look up key
    key, err := lookupPublicKey(token.Header["kid"])
    if err != nil {
        return nil, err
    }

    // Unpack key from PEM encoded PKCS8
    return jwt.ParseRSAPublicKeyFromPEM(key)
}
```
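On the signing side, `ParseRSAPrivateKeyFromPEM` plays the same role. Below is a minimal sketch (not from the original guide) assuming `privatePEM` holds your PEM-encoded RSA private key bytes:
```go
privateKey, err := jwt.ParseRSAPrivateKeyFromPEM(privatePEM) // privatePEM is assumed to be []byte
if err != nil {
    return "", err
}

token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.StandardClaims{
    // placeholder expiry: one hour from now
    ExpiresAt: time.Now().Add(time.Hour).Unix(),
})
return token.SignedString(privateKey)
```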