(2640) Proposal to conserve the name Polypodium parasiticum (Thelypteris parasitica, Christella parasitica) (Thelypteridaceae) with a conserved type
TAXON, Volume 67, Issue 5 (October 2018), pp. 1031–1032
Proposals to Conserve or Reject Names · Free Access
First published: 1 October 2018 · DOI: https://doi.org/10.12705/675.16

Authors
Christopher R. Fraser-Jenkins (corresponding author; chrisophilus@yahoo.co.uk), Student Guest House, Tridevi Marg, Thamel, P.O. Box no. 5555, Kathmandu, Nepal
Alan R. Smith, University Herbarium, University of California, 1001 Valley Life Sciences Bldg. #2465, Berkeley, California 94720-2465, U.S.A.
Charles E. Jarvis, Natural History Museum, Cromwell Road, London SW7 5BD, U.K.
Mary Gibby, Royal Botanic Garden, 20A Inverleith Row, Edinburgh EH3 5LR, Scotland, U.K.
Michael G. Price, P.O. Box 468, Michigan Centre, Michigan 49254, U.S.A.
Barbara S. Parris, Fern Research Foundation, 21 James Kemp Place, Kerikeri, Bay of Islands, New Zealand
Bhupendra S. Kholia, Botanical Survey of India, Northern Regional Centre, Dehra Dun 248195, India

Abstract
No abstract is available for this article.